
Senior Data Engineer (GCP/Databricks)

Leadtech is a technology company focused on delivering exceptional user experiences through advanced data solutions.
Data
Senior Software Engineer
Remote
3+ years of experience
Enterprise SaaS

Job Description

Leadtech is seeking a Senior Data Engineer to spearhead its data infrastructure development on Google Cloud Platform (GCP). The role combines cloud technologies with data engineering expertise, focusing on building scalable data pipelines with Databricks, BigQuery, and other GCP services, and spans both technical and architectural responsibilities, from implementing ETL/ELT processes to maintaining data quality standards.

The ideal candidate will bring 3+ years of data engineering experience and strong cloud-computing skills, joining a team dedicated to delivering exceptional user experiences.

The role offers competitive compensation, professional development opportunities, a flexible schedule, and a comprehensive benefits package, with the option to work fully remote or from the Barcelona office, at a company that values work-life balance and professional growth.

Last updated 18 days ago

Responsibilities For Senior Data Engineer (GCP/Databricks)

  • Define and implement overall data architecture on GCP, including data warehousing in BigQuery
  • Design, build, and optimize ETL/ELT pipelines using Apache Airflow
  • Implement dbt transformations to maintain version-controlled data models
  • Implement event-driven or asynchronous data workflows between microservices
  • Enforce data quality standards using Great Expectations
  • Integrate with Looker for data visualization and insights
  • Maintain Data Mart environments for specific business domains
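To make the data-quality responsibility above concrete, here is a minimal, framework-free sketch of the kind of batch validation a tool like Great Expectations automates before records are loaded into BigQuery. The record shape, column names (`user_id`, `event_ts`), and expectations are hypothetical, and this is not the Great Expectations API, only an illustration of the idea behind it.

```python
def validate_batch(rows, required=("user_id", "event_ts"), unique_key="user_id"):
    """Check a batch of dict records against two example expectations.

    Returns a dict of failures: rows with null required columns, and
    rows whose key column duplicates an earlier row in the batch.
    """
    failures = {"missing_values": [], "duplicate_keys": []}
    seen = set()
    for i, row in enumerate(rows):
        # Expectation 1: required columns must be present and non-null.
        for col in required:
            if row.get(col) is None:
                failures["missing_values"].append((i, col))
        # Expectation 2: the key column must be unique within the batch.
        key = row.get(unique_key)
        if key in seen:
            failures["duplicate_keys"].append((i, key))
        seen.add(key)
    return failures


batch = [
    {"user_id": 1, "event_ts": "2024-01-01T00:00:00Z"},
    {"user_id": 1, "event_ts": "2024-01-01T00:05:00Z"},  # duplicate key
    {"user_id": 2, "event_ts": None},                    # null timestamp
]
result = validate_batch(batch)
# result records one duplicate-key failure and one missing-value failure
```

In a real pipeline, a check like this would typically run as a task in an Airflow DAG, failing the run (and blocking the downstream load) when any expectation is violated.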

Requirements For Senior Data Engineer (GCP/Databricks)

Python
Java
Kafka
MongoDB
  • 3+ years of professional experience in data engineering, including at least 1 year working with mobile data
  • Experience with BigQuery and Google Cloud Storage-based data lakes
  • Deep knowledge of Apache Airflow and ETL/ELT design
  • Strong coding capabilities in Python, Java, or Scala
  • Experience with Docker and Kubernetes
  • Hands-on with CI/CD pipelines and DevOps tools
  • Proficiency in Great Expectations
  • Expertise in data lineage, metadata management, and compliance
  • Strong understanding of OLTP and OLAP systems
  • Excellent communication skills

Benefits For Senior Data Engineer (GCP/Databricks)

Medical Insurance
Dental Insurance
Mental Health Assistance
Education Budget
  • Flexible career path with personalized internal training
  • Annual budget for external learning
  • Flexible schedule with flextime
  • Full remote or office option
  • Free Friday afternoons
  • 35-hour workweek in July and August
  • Competitive salary
  • Private health insurance including dental and psychological services
  • 25 days vacation plus birthday off
  • Office perks: free coffee, fresh fruit, snacks, game room, rooftop terrace
  • Ticket restaurant and nursery vouchers