Consultant GCP Data Engineer

Bangalore, Karnataka, India

KPMG India

Job Requirements:-

• Bachelor's or higher degree in Computer Science or a related discipline, or equivalent experience (minimum 5 years of work experience).

• At least 2 years of consulting or client service delivery experience on Google Cloud Platform (GCP).

• At least 2 years of experience developing data ingestion, data processing, and analytical pipelines for big data, relational databases such as Cloud SQL, and data warehouse solutions such as BigQuery.

• Extensive experience providing practical direction on the use of GCP-native services.

• Extensive hands-on experience implementing data ingestion, ETL, and data processing using GCP services: Google Cloud Storage (GCS), Dataflow, Cloud Functions, Cloud Composer, BigQuery, Cloud SQL, Pub/Sub, IoT Core, Dataproc, Dataprep, Bigtable, Firestore, etc. 

• Minimum of 2 years of hands-on experience with GCP and big data technologies such as Java, Python, SQL, GCS, Apache Beam, PySpark, SparkSQL, and Dataproc, and with live streaming technologies such as Pub/Sub and Dataflow.

• Well-versed in DevSecOps and CI/CD deployments.

• Experience with cloud migration methodologies and processes, including tools such as Cloud Dataflow and Database Migration Service.

• Minimum of 2 years of RDBMS experience.

• Experience using big data file formats and compression techniques.

• Experience working with Developer tools such as Cloud Build, IntelliJ IDEA, Git, Jenkins, etc. 

• Experience with private and public cloud architectures, pros/cons, and migration considerations.

Primary Roles and Responsibilities:-

A Google Cloud Data Engineer is responsible for designing, building, and maintaining the data infrastructure for an organization using Google Cloud Platform (GCP) services. This includes creating data pipelines, integrating data from various sources, and implementing data security and privacy measures. The Google Cloud Data Engineer will also be responsible for monitoring and troubleshooting data flows and optimizing data storage and processing for performance and cost efficiency.

Preferred Skills:-

• DevOps on Google Cloud Platform (GCP).

• Experience developing and deploying ETL solutions on Google Cloud.

• Familiarity with Google Cloud Data Catalog for metadata management, data governance, data lineage, etc.

• Knowledge of Google Cloud IAM (Identity and Access Management), including access controls and security on Google Cloud.

• Aligned with Google's vision and roadmap for the latest tools and technologies in the market.

• Knowledge of Google Cloud's Vertex AI, its machine learning capabilities, and supporting industry use cases.

• Role-based Google Cloud certifications (e.g., Professional Data Engineer, Professional Cloud Architect, Associate Cloud Engineer, Professional Machine Learning Engineer).

• Hands-on working knowledge of Google Looker Studio (formerly Google Data Studio) to create reports/dashboards and generate insights for business users.

Desired Skills and Experience (Must Have):

• Proficient in designing, building, and operationalizing large-scale enterprise data solutions using at least four GCP services among Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Functions, Cloud Composer, GCS, Bigtable, Data Fusion, and Firestore.

• Proficient hands-on programming experience in SQL, Python, and Spark/PySpark.

• Proficient in building production-level ETL/ELT data pipelines from data ingestion to consumption.

• Data engineering knowledge (e.g., data lake, data warehouse, integration, migration).

• Excellent communicator (written and verbal, formal and informal).

• Experience using software version control tools (Git/Bitbucket/CodeCommit).

• Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.

• Ability to multi-task under pressure and work independently with minimal supervision.

• Must be a team player who enjoys working in a cooperative and collaborative team environment.
Desired Skills and Experience (Good to Have):

• GCP certification preferred.

• Additional cloud experience in AWS or Azure preferred.

Qualification:-

Completed undergraduate degree with outstanding academic credentials (preferably a technical undergraduate degree, e.g., Computer Science or Engineering).

Perks/benefits: Flex hours

Region: Asia/Pacific
Country: India
