DE&A - Core - GCP Senior Engineer
India
Zensar
Zensar is a global organization that conceptualizes, builds, and manages digital products through experience design, data engineering, and advanced analytics for over 200 leading companies. Our solutions leverage industry-leading platforms to...
Job Summary:
We are seeking a skilled and proactive Google Cloud Platform (GCP) Senior Engineer to design, build, and maintain scalable data pipelines and infrastructure in a cloud-native environment.
The ideal candidate will have hands-on experience with GCP services and with data ingestion, transformation, and storage, and will apply data science knowledge to support advanced analytics and data-driven decision-making across the organization.
Key Responsibilities:
Design, develop, and maintain scalable and reliable data pipelines on Google Cloud Platform (GCP).
Build ETL/ELT processes using Cloud Dataflow (Apache Beam), Cloud Composer (Airflow), Cloud Functions, and other GCP-native tools (an illustrative pipeline sketch follows this list).
Ingest and process structured and unstructured data from various sources like Cloud Storage, Pub/Sub, BigQuery, Cloud SQL, and external APIs.
Leverage Python (NumPy, Pandas, etc.) for data wrangling, feature engineering, and integrating models into data pipelines.
Model data in BigQuery and optimize for performance and cost.
Work closely with data scientists to operationalize ML models and automate data science workflows.
Ensure data quality, governance, and lineage using tools like Data Catalog and Data Loss Prevention (DLP).
Monitor pipeline health and implement logging and alerting with Cloud Monitoring and Cloud Logging (formerly Stackdriver).
Automate workflows and deployments using Terraform, CI/CD pipelines (e.g., Cloud Build).
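To give a flavor of the work described above, here is a minimal sketch of a streaming Dataflow (Apache Beam) pipeline in Python that reads JSON events from Pub/Sub and appends them to BigQuery. The project, topic, table, and schema names are hypothetical placeholders for illustration, not details of this role.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode is required for the Pub/Sub source.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Hypothetical topic; messages arrive as raw bytes.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")
            # Decode and parse each message into a dict matching the table schema.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Hypothetical table and schema; rows are appended as they stream in.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,event_type:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()

In practice, a pipeline like this would be deployed on the Dataflow runner and orchestrated alongside batch jobs in Cloud Composer.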
Required Skills & Qualifications:
6-8 years of experience as a Data Engineer, including at least 4 years on GCP.
Strong experience with BigQuery, Cloud Storage, Cloud Pub/Sub, Dataflow, Cloud SQL, and Cloud Composer.
Proficiency in the Python data science stack: NumPy and Pandas.
Strong understanding of SQL, data warehousing, and data modeling concepts.
Experience with Apache Beam, Airflow, or similar orchestration frameworks.
Familiarity with Terraform, Git, and CI/CD practices.
Experience with data security, IAM roles, and cost optimization in GCP.
Familiarity with data visualization tools and techniques (e.g., Power BI and Looker).
Preferred Qualifications:
A GCP certification, such as Professional Data Engineer.
Experience with Kafka, Dataproc, Spark, or Hadoop (optional depending on use case).
Background in analytics, machine learning, or real-time data processing is a plus.
Understanding of data governance and compliance (GDPR, HIPAA, etc.).
Soft Skills:
Strong problem-solving and analytical skills.
Excellent communication and stakeholder management.
Team player with the ability to work in a fast-paced environment.