Data Engineer

MediaLab is a media & technology company focused on acquiring and growing properties and global brands.
Data
Mid-Level Software Engineer
Remote
1+ year of experience
AI · Enterprise SaaS

Description For Data Engineer

MediaLab is a media & technology company focused on acquiring and growing properties and global brands. The organization is a unique combination of private equity firm, holding company, and operating entity. It continues to expand and is proud of the prominent market position of its brands.

As a Data Engineer at MediaLab, you will be a key member of the team that builds and maintains data platform solutions. The Data team works with all of MediaLab's brands (Genius, Imgur, Kik, Worldstar Hip-Hop, Amino, Whisper) as well as business teams across the organization. They own a streaming pipeline which handles thousands of events per second and billions per day; a BigQuery instance comprising dozens of schemas, thousands of tables, and over a petabyte of storage; and hundreds of DAGs in Airflow and Dataform.

Your responsibilities will include:

  • Maintaining and optimizing Go-based pub/sub consumers handling complex transformation logic
  • Improving the efficiency of multiple repositories dedicated to ingesting and parsing data from a wide variety of sources
  • Helping improve monitoring efforts and troubleshooting problems with ETL jobs and event pipelines
  • Supporting requests from business teams to find and ingest new data sources
  • Working with engineering teams across all brands to integrate their application data into the data warehouse/data lake
  • Contributing to setting up and adjusting deployment workflows
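To give a flavor of the first responsibility, here is a minimal, hypothetical sketch of a Go consumer applying transformation logic to incoming events. It is illustrative only, not MediaLab's actual code: a buffered channel stands in for a pub/sub subscription, and the event schema and field names are invented for the example.

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// event mirrors a simplified pipeline message.
// Field names here are illustrative, not a real schema.
type event struct {
	Brand string `json:"brand"`
	Name  string `json:"name"`
}

// transform normalizes a raw message the way a consumer might
// before loading it to the warehouse: parse, validate, lowercase.
func transform(raw []byte) (string, error) {
	var e event
	if err := json.Unmarshal(raw, &e); err != nil {
		return "", err
	}
	if e.Brand == "" || e.Name == "" {
		return "", fmt.Errorf("missing required field")
	}
	return strings.ToLower(e.Brand) + "." + strings.ToLower(e.Name), nil
}

func main() {
	// A buffered channel stands in for a pub/sub subscription here;
	// a real consumer would use a client library's Receive loop.
	msgs := make(chan []byte, 2)
	msgs <- []byte(`{"brand":"Genius","name":"PageView"}`)
	msgs <- []byte(`{"brand":"Imgur","name":"Upload"}`)
	close(msgs)

	for raw := range msgs {
		out, err := transform(raw)
		if err != nil {
			// In a real consumer this would nack or dead-letter the message.
			continue
		}
		fmt.Println(out) // e.g. "genius.pageview"
	}
}
```

In a production consumer, the transform step would also need to handle acking, retries, and malformed payloads at the thousands-of-events-per-second scale the posting describes.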

The ideal candidate should have:

  • 1-2 years of professional experience as a data engineer or software engineer with significant exposure to working with data
  • Comfort with Python; Golang experience is a big plus
  • Experience with at least one major cloud provider (GCP and/or AWS preferred)
  • Experience using containers; Kubernetes experience is preferred
  • Very solid SQL skills and an understanding of how to write efficient complex queries
  • Ability to write clear documentation
  • Experience with CI/CD platforms and with Terraform or other infrastructure-as-code tools
  • Experience with major data warehouses and pub/sub tools is a plus

Join an exceptionally talented team of engineers, designers, product and business builders in a company that values diversity and inclusivity.

Last updated 6 months ago

Responsibilities For Data Engineer

  • Maintain and optimize Go-based pub/sub consumers
  • Improve efficiency of data ingestion and parsing repositories
  • Improve monitoring efforts and troubleshoot ETL jobs and event pipelines
  • Support requests for new data source ingestion
  • Work with engineering teams to integrate application data into the data warehouse/lake
  • Contribute to deployment workflow setup and adjustments

Requirements For Data Engineer

Python · Go · Kubernetes
  • 1-2 years of professional experience as a data engineer or software engineer with data exposure
  • Comfortable with Python
  • Experience with major cloud provider (GCP and/or AWS preferred)
  • Experience using containers
  • Very solid SQL skills
  • Ability to write clear documentation
  • Experience with CI/CD platform
  • Experience with Terraform or infrastructure as code
