Google's Content Safety Platform (CSP) team is seeking a Data Scientist to join its User Protection organization. The role is central to developing and improving AI-powered tools that protect users from online abuse and harmful content across Google's products. The position sits within a large data science team in Google's Core organization, providing extensive knowledge sharing and professional development opportunities.
The role involves working directly with product and engineering teams to evaluate, understand, and enhance the quality of user protections. You'll be responsible for developing evaluation methodologies for content safety classifiers and contributing to generalizable strategies for understanding these systems. This work is particularly important because product safety is fundamental to Google's success, especially in its newer AI tools.
As a Data Scientist, you'll collaborate with cross-functional teams to translate business objectives into actionable analyses and metrics. The role requires expertise in Python programming, machine learning, and statistical analysis. You'll work with custom data infrastructure, design mathematical models, and ensure data quality across various sources.
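As a hypothetical illustration of the classifier-evaluation work this role describes (the labels, data, and metric choices below are invented for illustration, not taken from the posting), a minimal Python sketch might compare a content safety classifier's predictions against human ratings:

```python
# Minimal, hypothetical sketch: evaluating a content safety classifier
# against human ratings. All data here is invented for illustration.
from sklearn.metrics import precision_recall_fscore_support

human_labels = [1, 0, 1, 1, 0, 0, 1, 0]  # 1 = harmful, 0 = benign (human raters)
model_labels = [1, 0, 0, 1, 0, 1, 1, 0]  # classifier predictions on the same items

# Precision: of the items flagged harmful, how many truly were?
# Recall: of the truly harmful items, how many were flagged?
precision, recall, f1, _ = precision_recall_fscore_support(
    human_labels, model_labels, average="binary"
)
print(f"precision={precision:.2f} recall={recall:.2f} f1={f1:.2f}")
```

In content safety settings, recall is often weighted heavily, since a missed piece of harmful content is usually costlier than a false positive.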
The position offers a competitive compensation package, including a base salary range of $118,000-$170,000, plus bonus, equity, and comprehensive benefits. Google provides an inclusive work environment, emphasizing equal opportunity and a culture of belonging. The role is based in San Francisco, CA, and requires English proficiency for effective global collaboration.
This is an excellent opportunity for candidates with a strong quantitative background who are passionate about using data science to improve user safety at scale. The role combines technical depth with business impact, offering the chance to work on critical safety features that affect millions of users worldwide.