
Staff Data Engineer

The Walt Disney Company is a leading diversified international family entertainment and media enterprise that includes three core business segments: Disney Entertainment, ESPN, and Disney Experiences.
Glendale, CA, USA
$149,300 - $200,200
Data
Staff Software Engineer
In-Person
5,000+ Employees
7+ years of experience
Entertainment · Media
This job posting is no longer active.

Job Description

As a Staff Data Engineer at The Walt Disney Studios, you will play a pivotal role in transforming data into actionable insights. You will collaborate with our dynamic team of technologists to develop cutting-edge data solutions that drive innovation and fuel business growth. Your responsibilities will include managing complex data structures and delivering scalable, efficient data solutions. Your expertise in data engineering will be crucial in optimizing our data-driven decision-making processes.

Key Responsibilities:

  • Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
  • Build tools and services to support data discovery, lineage, governance, and privacy
  • Collaborate with other software/data engineers and cross-functional teams
  • Build and maintain continuous integration and deployment pipelines
  • Provision and support cloud resources
  • Work with a tech stack that includes Airflow, Spark, Snowflake, Databricks, and dbt
  • Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
  • Contribute to developing and documenting both internal and external standards and best practices
  • Ensure high operational efficiency and quality of the Core Data platform datasets
  • Be an active participant in and advocate of agile/scrum ceremonies
  • Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements
  • Maintain detailed documentation of your work and changes to support data quality and data governance requirements

Qualifications:

  • 7+ years of data engineering experience developing large data pipelines
  • Proficient in SQL engines, with advanced performance tuning capabilities
  • Strong understanding of data modeling principles, including dimensional modeling and data normalization
  • Proficiency in at least one major programming language (e.g., Python, Java, Scala)
  • Hands-on production experience with data pipeline orchestration systems such as Airflow
  • Experience with the Snowflake platform; familiarity with Databricks is a plus
  • Experience designing, developing, and optimizing scalable data pipelines and ETL processes
  • Experience implementing data quality checks, monitoring, and logging
  • Experience designing and developing world-class CI/CD and DevOps practices
  • Proficiency in Terraform or CDKTF
  • Proficiency with containers (Docker, Kubernetes, etc.)
  • Deep understanding of AWS or other cloud providers, as well as infrastructure as code
  • Excellent conceptual and analytical reasoning competencies
  • Advanced understanding of OLTP vs. OLAP environments
  • Willingness and ability to learn and pick up new skill sets
  • Self-starting problem solver with an eye for detail and excellent analytical and communication skills
  • Familiarity with Scrum and Agile methodologies

Required Education:

  • Bachelor's degree in Computer Science, Information Systems, or a related field, or equivalent work experience
  • Master's degree is a plus

The hiring range for this position in California is $149,300 - $200,200 per year based on a 40-hour work week. The number of hours scheduled per week may vary based on business needs. The base pay actually offered will take into account internal equity and may also vary depending on the candidate's geographic region, job-related knowledge, skills, and experience, among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, depending on the level and position offered.

Last updated a year ago

Responsibilities For Staff Data Engineer

  • Maintain and expand Core Data platform data pipelines
  • Build tools for data discovery, lineage, governance, and privacy
  • Collaborate with cross-functional teams
  • Build and maintain CI/CD pipelines
  • Provision and support cloud resources
  • Develop and document data standards and best practices
  • Ensure high operational efficiency and quality of datasets
  • Participate in agile/scrum ceremonies
  • Engage with customers to understand and prioritize platform improvements
  • Maintain detailed documentation for data quality and governance

Requirements For Staff Data Engineer

Python · Java · Scala · Kubernetes
  • 7+ years of data engineering experience
  • Proficiency in SQL and data modeling
  • Experience with major programming languages (Python, Java, Scala)
  • Hands-on experience with data pipeline orchestration (e.g., Airflow)
  • Experience with Snowflake and Databricks
  • Expertise in designing and optimizing ETL processes
  • Proficiency in CI/CD and DevOps practices
  • Knowledge of cloud providers (e.g., AWS) and infrastructure as code
  • Understanding of OLTP vs OLAP environments
  • Familiarity with Scrum and Agile methodologies
  • Bachelor's degree in Computer Science or related field

Benefits For Staff Data Engineer

Medical Insurance · 401k · Equity
  • Health insurance & wellbeing
  • Childcare options
  • Paid time off
  • Retirement programs
  • Tuition assistance
  • Weekly pay