Data Engineer, Analytics, Core Infrastructure, Trust and Safety

Dublin, Ireland

Google

Google’s mission is to organize the world's information and make it universally accessible and useful.

Minimum qualifications:

  • Bachelor's degree or equivalent practical experience.
  • 3 years of experience coding in one or more programming languages.
  • 3 years of experience working with data infrastructure and data models by writing exploratory queries and scripts.
  • 3 years of experience designing data pipelines and dimensional data models for synchronous and asynchronous system integration and implementation, using internal (e.g., Flume) and external (e.g., Dataflow, Spark) stacks.

Preferred qualifications:

  • 3 years of experience with statistical methodology and data consumption tools such as Colab and Jupyter notebooks, Tableau, Power BI, Data Studio, and other business intelligence platforms.
  • 3 years of experience partnering with and managing stakeholders (e.g., users, partners, customers).
  • 3 years of experience developing project plans and delivering projects on time, within budget, and in scope.
  • Experience with Machine Learning for production workflows.

About the job

The Core team builds the technical foundation behind Google’s flagship products. We are owners and advocates for the underlying design elements, developer platforms, product components, and infrastructure at Google. These are the essential building blocks for excellent, safe, and coherent experiences for our users and drive the pace of innovation for every developer. We look across Google’s products to build central solutions, break down technical barriers and strengthen existing systems. As the Core team, we have a mandate and a unique opportunity to impact important technical decisions across the company.

Responsibilities

  • Design, build, optimize, and maintain batch and real-time pipelines to ingest, process, and transform data sources.
  • Develop and manage logical and physical data models within data warehouses or data lakes, ensuring data is structured efficiently for analytics, reporting, machine learning model training, and operational use.
  • Implement robust monitoring, alerting, and data quality checks to ensure the accuracy, completeness, and availability of critical datasets. Troubleshoot and resolve data-related issues promptly.
  • Collaborate with platform engineers to optimize data storage, retrieval, and processing performance, ensuring systems can scale effectively with growing data volumes and query complexity.
  • Work closely with data scientists, analysts, product managers, and policy specialists to deliver effective data solutions. Provide support for data access and tooling. Implement and adhere to data security best practices and privacy regulations when managing sensitive user data.

Tags: Business Intelligence, Dataflow, Data pipelines, Data quality, Jupyter, Machine Learning, Model training, Pipelines, Power BI, Privacy, Security, Spark, Statistics, Tableau

Perks/benefits: Career development

Region: Europe
Country: Ireland
