Data Engineer - Assistant Manager

Bangalore, Karnataka, India

KPMG India




  • Bachelor’s or higher degree in Computer Science or a related discipline, or equivalent experience (minimum 7 years of work experience).
  • At least 4 years of consulting or client service delivery experience on Google Cloud Platform (GCP).
  • At least 4 years of experience developing data ingestion, data processing, and analytical pipelines for big data, relational databases such as Cloud SQL, and data warehouse solutions such as BigQuery.
  • Extensive experience providing practical direction on the use of GCP native services.
  • Extensive hands-on experience implementing data ingestion, ETL, and data processing using GCP services: Google Cloud Storage (GCS), Dataflow, Cloud Functions, Cloud Composer, BigQuery, Cloud SQL, Pub/Sub, IoT Core, Dataproc, Dataprep, Bigtable, Firestore, etc.
  • At least 4 years of hands-on experience with GCP and big data technologies such as Java, Python, SQL, GCS, Apache Beam, PySpark, SparkSQL, and Dataproc, and with live streaming technologies such as Pub/Sub and Dataflow.
  • Well versed in DevSecOps and CI/CD deployments.
  • Familiarity with cloud migration methodologies and processes, including tools such as Cloud Dataflow and Database Migration Service.
  • At least 4 years of RDBMS experience.
  • Experience using big data file formats and compression techniques.
  • Experience working with developer tools such as Cloud Build, IntelliJ IDEA, Git, Jenkins, etc.
  • Experience with private and public cloud architectures, their pros and cons, and migration considerations.
  • Experience with dbt and Databricks is a plus.

Completed undergraduate degree with outstanding academic credentials (preferably a technical degree, e.g. Computer Science or Engineering). Relevant work experience: 7+ years.





