AWS Data Lake Engineer

A global leader in payment processing technology, handling the world's largest volume of payments and driving the global economy daily.
Bengaluru, Karnataka, India · Pune, Maharashtra, India · Indore, Madhya Pradesh, India
Data
Senior Software Engineer
In-Person
5000+ Employees
5+ years of experience
Finance

Job Description

Worldpay, a global leader in payment processing, is seeking a Senior AWS Data Lake Engineer to join its Big Data Team. The role combines technical expertise in AWS services and data engineering with a focus on handling sensitive payment data, and offers the opportunity to work with technologies such as Python, AWS Data Lake solutions, Databricks, and Snowflake while ensuring compliance with industry standards.

The role involves designing and implementing scalable data pipelines, managing AWS Data Lake solutions, and applying advanced data security measures. You'll be working in a dynamic environment across global teams, contributing to the company's mission of processing the world's largest volume of payments securely and efficiently.

The ideal candidate will bring 5+ years of Python development experience, strong AWS expertise, and a deep understanding of data lake architectures. You'll be based in one of Worldpay's modern hubs in India (Bengaluru, Pune, or Indore), working in a collaborative environment that values curiosity, creativity, and determination.

Worldpay offers comprehensive benefits including competitive compensation, parental leave, charitable giving opportunities, and global recognition programs. The company culture emphasizes thinking like a customer, acting like an owner, and winning as a team, making it an ideal environment for professionals who want to make a significant impact in the global payments industry.

Last updated 6 days ago

Responsibilities For AWS Data Lake Engineer

  • Design, develop, and maintain scalable data pipelines using Python and AWS services
  • Implement and manage AWS Data Lake solutions, including ingestion, storage, and cataloging of structured and unstructured data
  • Apply data tokenization and masking techniques to protect sensitive information (see the sketch after this list)
  • Collaborate with data engineers, architects, and security teams
  • Optimize data workflows for performance, scalability, and cost-efficiency
  • Monitor and troubleshoot data pipeline issues
  • Document technical designs, processes, and best practices
  • Provide support for Databricks and Snowflake
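
For illustration only, here is a minimal Python sketch of the kind of masking and tokenization work this role involves. The column name, sample card numbers, and salt are placeholders rather than Worldpay's actual scheme; production systems typically rely on a dedicated vaulted or format-preserving tokenization service rather than plain hashing.

    import hashlib
    import pandas as pd

    def mask_pan(pan: str) -> str:
        # Keep only the last four digits; replace the rest with '*'
        return '*' * (len(pan) - 4) + pan[-4:]

    def tokenize(value: str, salt: str) -> str:
        # Deterministic, non-reversible token via salted SHA-256 (placeholder approach)
        return hashlib.sha256((salt + value).encode('utf-8')).hexdigest()

    # Hypothetical sample data -- not real card numbers in any production sense
    df = pd.DataFrame({'card_number': ['4111111111111111', '5500005555555559']})
    df['card_number_masked'] = df['card_number'].map(mask_pan)
    df['card_number_token'] = df['card_number'].map(lambda v: tokenize(v, 'example-salt'))
    print(df[['card_number_masked', 'card_number_token']])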

Requirements For AWS Data Lake Engineer

Python
  • 5+ years of experience working as a Python developer/architect
  • Strong proficiency in Python, with experience in data processing libraries (e.g., Pandas, PySpark)
  • Proven experience with AWS services such as S3, Glue, Lake Formation, Lambda, Athena, and IAM (a minimal Athena query sketch follows this list)
  • Solid understanding of data lake architecture and best practices
  • Experience with data tokenization, encryption, and anonymization techniques
  • Familiarity with data governance, compliance, and security standards
  • Experience with Snowflake and/or Databricks (Nice to have)
  • Experience with CI/CD tools and version control
  • Strong problem-solving skills and attention to detail
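
For illustration only, a minimal boto3 sketch of querying a data-lake table through Athena against a Glue Data Catalog database. The database name, table, query, and results bucket are hypothetical placeholders.

    import time
    import boto3

    # Hypothetical resources -- substitute a real Glue database, table, and S3 results bucket
    DATABASE = 'payments_lake'
    QUERY = 'SELECT merchant_id, SUM(amount) AS total FROM transactions GROUP BY merchant_id LIMIT 10'
    OUTPUT = 's3://example-athena-results/'

    athena = boto3.client('athena')

    # Start the query and remember its execution ID
    query_id = athena.start_query_execution(
        QueryString=QUERY,
        QueryExecutionContext={'Database': DATABASE},
        ResultConfiguration={'OutputLocation': OUTPUT},
    )['QueryExecutionId']

    # Poll until the query reaches a terminal state
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)['QueryExecution']['Status']['State']
        if state in ('SUCCEEDED', 'FAILED', 'CANCELLED'):
            break
        time.sleep(1)

    # Print the first page of results (the first row is the column header)
    if state == 'SUCCEEDED':
        for row in athena.get_query_results(QueryExecutionId=query_id)['ResultSet']['Rows']:
            print([col.get('VarCharValue') for col in row['Data']])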

Benefits For AWS Data Lake Engineer

Parental Leave
  • Competitive salary and benefits
  • Time to support charities and give back to the community
  • Parental leave policy
  • Global recognition platform
  • Virgin Pulse access
  • Global employee assistance program