Dun & Bradstreet, a global leader in data analytics with over 6,000 team members worldwide, is seeking a Principal Engineer to join its technology team in Hyderabad, India. This role combines big data engineering with cloud infrastructure expertise, focusing on building and maintaining scalable data pipelines using technologies like Apache Spark and Airflow.
The position requires a seasoned professional with 10+ years of experience in big data technologies, who will be responsible for architecting and implementing distributed data processing solutions. You'll work with cloud platforms (particularly GCP and AWS), manage complex data pipelines, and optimize resource allocation across multiple tenants.
This is an excellent opportunity for an experienced engineer who is passionate about data architecture and wants to work with a company that values innovation and continuous learning. The role offers a hybrid work environment and the chance to work with a global team on solutions that help businesses worldwide make better decisions.
The ideal candidate will bring deep expertise in Python programming, cloud infrastructure management, and big data technologies, along with the ability to create detailed designs and proof-of-concepts for new technical capabilities. You'll be part of a diverse, global community that values being data-inspired, relentlessly curious, and inherently generous.
Working at Dun & Bradstreet means joining a company with a rich history in data analytics that continues to evolve and innovate. The company culture promotes creativity and growth, offering opportunities to work on challenging problems while contributing to solutions that help organizations turn uncertainty into confidence.