Senior GCP Data Engineer (Technical Lead)
MANILA NET PARK OFFICE, Philippines
Procter & Gamble
Job Location
MANILA NET PARK OFFICE
Job Description
Overview of the job
Join us as a Senior Data & Backend Engineer and be the driving force behind the growth and success of our Consumer Insights Program. You will work with architects, engineering leaders, and product leaders to develop world-class data and backend platforms.
Your team
Reporting to the Engineering Director, Consumer Insights, you'll be part of a dynamic IT team at P&G. Our team leads multi-billion-dollar projects globally, leveraging cutting-edge technologies in mobile, social, cloud, big-data analytics, IoT, and more.
What success looks like
- Transform product requirements into high-quality software features.
- Lead the design and development of data platforms and backend applications on Google Cloud Platform.
- Mentor junior developers and keep yourself up to date with the latest technologies.
Responsibilities of the role
- API Development: Design, develop, and maintain robust and scalable backend systems and APIs.
- Data Ingestion: Develop and maintain data pipelines to extract data from various sources and load it into Google Cloud environments.
- Data Transformation: Implement data transformation processes, including data cleansing, normalization, and aggregation, to ensure data quality and consistency.
- Data Modeling: Develop and maintain data models and schemas to support efficient data storage and retrieval in Google Cloud platforms.
- Data Warehousing: Build data warehouses or data lakes using Google Cloud services such as BigQuery.
- Data Integration: Integrate data from multiple sources, both on-premises and cloud-based, using Cloud Composer or other relevant tools.
- Data Governance: Implement data governance practices, including data security, privacy, and compliance, to ensure data integrity and regulatory compliance.
- Performance Optimization: Optimize data pipelines and queries for improved performance and scalability in Google Cloud environments.
- Monitoring and Troubleshooting: Monitor data pipelines, identify and resolve performance issues, and troubleshoot data-related problems in collaboration with other teams.
- Data Visualization: Build BI reports to enable faster decision making.
- Collaboration: Work with product managers to ensure superior product delivery that drives business value and transformation.
- Documentation: Document data engineering processes, data flows, and system configurations for future reference and knowledge sharing.
Job Qualifications
Role Requirements
- Experience: Bachelor's or master's degree in computer science, data engineering, or a related field, along with 5+ years of work experience in data engineering and cloud platforms, including previous experience in a senior or lead engineering role.
- Google Cloud Development: Strong proficiency in Google Cloud services such as Spanner, Cloud Composer, and Looker Studio. (Microsoft Azure knowledge is a plus.)
- ETL Tools: Experience with ETL (Extract, Transform, Load) tools and frameworks, such as Spark and Cloud Composer/Airflow for data integration and transformation.
- Programming: Proficiency in Python (including PySpark) and SQL for data manipulation, scripting, and automation.
- Data Modeling: Knowledge of data modeling techniques and experience with data modeling tools.
- Database Technologies: Familiarity with relational databases (e.g., Cloud SQL) for data storage and retrieval.
- Data Warehousing: Understanding of data warehousing concepts and dimensional modeling, and experience with data warehousing technologies such as BigQuery.
- Data Governance: Knowledge of data governance principles, data security, privacy regulations (e.g., GDPR, CCPA), and experience implementing data governance practices.
- Data Visualization: Experience working with Looker Studio to build semantic data models and BI reports/dashboards.
- Cloud Computing: Familiarity with cloud computing concepts and experience working with cloud platforms, particularly Google Cloud Platform.
- Problem-Solving: Strong analytical and problem-solving skills to identify and resolve data-related issues.
- DevOps: Proficiency with DevOps and CI/CD tools (e.g., Terraform, GitHub).
- Nice to Have: Familiarity with Azure, Databricks, and their related tech stacks would be an advantage in the role.
About us
We produce globally recognized brands and we grow the best business leaders in the industry. With a portfolio of trusted brands as diverse as ours, it is paramount that our leaders are able to lead, with courage, across our vast array of brands, categories, and functions. We serve consumers around the world with one of the strongest portfolios of trusted, quality, leadership brands, including Always®, Ariel®, Gillette®, Head & Shoulders®, Herbal Essences®, Oral-B®, Pampers®, Pantene®, Tampax® and more. Our community includes operations in approximately 70 countries worldwide.
Visit http://www.pg.com to learn more.
We are an equal opportunity employer and value diversity at our company. We do not discriminate against individuals on the basis of race, color, gender, age, national origin, religion, sexual orientation, gender identity or expression, marital status, citizenship, disability, HIV/AIDS status, or any other legally protected factor.
Job Schedule
Full time
Job Number
R000120836
Job Segmentation
Experienced Professionals