Egen is seeking a skilled Data Engineer with expertise in Python and Google Cloud Platform (GCP) to join their data engineering team. This role focuses on building and maintaining ETL data pipelines using various GCP services including Dataflow, BigQuery, Cloud Functions, and Cloud Composer. The ideal candidate will have 4-6 years of experience and will be responsible for data ingestion, transformation, and ensuring data quality across systems.
The position requires strong technical skills in Python programming, GCP cloud services, and SQL databases. You'll work with modern data engineering tools and practices, including version control with GitHub and CI/CD pipelines, and collaborate with data scientists and analysts to deliver efficient data solutions.
Key responsibilities include designing scalable ETL pipelines, implementing data quality checks, and orchestrating workflows across GCP services. The role offers exposure to cutting-edge cloud technologies and the chance to tackle complex data engineering challenges. Experience with Snowflake, Databricks, or Azure Data Factory is a valuable plus.
This hybrid position is based in Hyderabad and offers the opportunity to build critical data infrastructure alongside a dynamic team. Combining technical depth with collaborative teamwork, the role is well suited to data engineers who want to work with modern cloud technologies at scale.