
Data Engineer

Data · Senior Software Engineer · 5+ years of experience · AI · Enterprise SaaS

Description For Data Engineer

66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. We help the world's leading brands transform their business challenges into opportunities and shape the future of work.

As a Cloud Data Engineer at 66degrees, you'll have the opportunity to grow and apply your expertise with leading-edge cloud customers, solving a variety of challenging technology problems spanning different industries, use cases, and tech stacks. Your work will make a huge impact on some of the largest and most advanced cloud customers out there.

Responsibilities include:

  • Lead data management and engineering projects, primarily within Google Cloud but also in AWS and Azure environments.
  • Migrate, analyze, and manage data structures, as well as design, develop, test, and implement data solutions.
  • Work with a global team and customer base to ensure continuous high-performance operations for mission-critical data systems.
  • Deliver, optimize, and support complex data solutions for customers.
  • Collaborate with technical and business stakeholders to advise customers and translate business requirements into effective solutions.
  • Ensure data quality by implementing data quality rules and test cases.

Qualifications:

  • 5+ years of experience in data engineering roles
  • 2+ years of experience working within GCP
  • Deep experience with relational and non-relational databases
  • Strong experience with Python and Git
  • Strong DBT experience
  • Experience with GCP BigQuery and Cloud SQL required
  • Strong Fivetran experience preferred
  • Apache stack experience preferred
  • Experience with Snowflake and Databricks preferred
  • Infrastructure as Code experience
  • Bachelor's degree in Computer Science, Computer Engineering, or a related field

66degrees offers a culture that sparks innovation and supports professional and personal growth. Join us to make a significant impact in the world of cloud technology and data engineering.


Responsibilities For Data Engineer

  • Lead data management and engineering projects, primarily within Google Cloud but also in AWS and Azure environments
  • Migrate, analyze, and manage data structures, as well as design, develop, test, and implement data solutions
  • Work with a global team and customer base to ensure continuous high-performance operations for mission-critical data systems, including DAGs, ETL, Airflow pipelines, EDWs, and data lakes
  • Deliver, optimize, and support complex data solutions for customers, including ingestion, storage, pipelines, analysis, and visualization
  • Collaborate with technical and business stakeholders to advise customers and translate business requirements into effective solutions and implementations
  • Partner with teammates and customer data scientists, software engineers, and other stakeholders to support data acquisition, solution design, implementation, and ongoing maintenance
  • Ensure data quality by implementing data quality rules and test cases (a minimal sketch follows this list)
  • Demonstrate expertise in data modeling, database design, cloud architecture, and data infrastructure management
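
To give a concrete sense of the data quality work named above, here is a minimal Python sketch of a single rule expressed as a test case. The BigQuery table `raw.orders`, its `order_id` column, and the client setup are hypothetical placeholders for illustration only, not details from this role.

    # A minimal sketch of one data quality rule expressed as a test case.
    # The BigQuery table `raw.orders` and its `order_id` column are
    # hypothetical placeholders used only for illustration.
    from google.cloud import bigquery


    def test_order_id_never_null() -> None:
        client = bigquery.Client()  # uses default credentials and project
        query = """
            SELECT COUNT(*) AS bad_rows
            FROM `raw.orders`
            WHERE order_id IS NULL
        """
        row = list(client.query(query).result())[0]
        assert row.bad_rows == 0, f"{row.bad_rows} rows violate the NOT NULL rule"

A test like this could run under pytest as part of a pipeline, failing the run whenever the rule is violated.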

Requirements For Data Engineer

Python · MongoDB · PostgreSQL
  • 5+ years of experience in data engineering roles
  • 2+ years of experience working within GCP
  • Deep experience with relational and non-relational databases, including SQL databases, MongoDB, and legacy databases
  • Strong experience with Python and Git
  • Strong DBT experience
  • Experience with GCP BigQuery and Cloud SQL
  • Proven ability to develop logical data models, ETL/ELT processes, and related documentation (see the pipeline sketch after this list)
  • Mastery of data modeling concepts, large-scale database implementations, and design patterns
  • Experience working with structured, semi-structured, and unstructured data sources
  • Strong analytical, problem-solving, and troubleshooting skills
  • Excellent written and oral communication skills
  • Infrastructure as Code experience (Terraform, Ansible, etc.)
  • Bachelor's degree in Computer Science, Computer Engineering, or a related field, or equivalent work experience
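
As a rough illustration of the ETL/ELT requirement above, here is a minimal Python sketch of an in-warehouse transform step against BigQuery. The project ID, source table `raw.orders`, and target table `reporting.daily_orders` are hypothetical placeholders.

    # A minimal sketch of an ELT-style transform step run inside BigQuery.
    # The project id, source table `raw.orders`, and target table
    # `reporting.daily_orders` are hypothetical placeholders.
    from google.cloud import bigquery


    def build_daily_orders(project_id: str) -> None:
        client = bigquery.Client(project=project_id)
        query = """
            CREATE OR REPLACE TABLE `reporting.daily_orders` AS
            SELECT DATE(created_at) AS order_date, COUNT(*) AS order_count
            FROM `raw.orders`
            GROUP BY order_date
        """
        client.query(query).result()  # block until the job finishes


    if __name__ == "__main__":
        build_daily_orders("my-gcp-project")  # placeholder project id

In practice a transform like this would more likely live in a DBT model or an Airflow task, tools this posting also names, but the shape of the work is the same.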
