Assistant Manager GCP Data Engineer

Bangalore, Karnataka, India

KPMG India


KPMG Global Services 

KPMG Global Services (KGS) was set up in India in 2008. It is a strategic global delivery organization that works with more than 50 KPMG member firms to provide a progressive, scalable, and customized approach to business requirements.

The KGS journey has been one of consistent growth, with a current employee count of nearly 10,000 operating from four locations in India (Bengaluru, Gurugram, Kochi, and Pune) and providing a range of Advisory and Tax-related services to member firms within the KPMG network.

As part of KPMG in India, we were ranked among the top companies to work for in the country for four years in a row by LinkedIn, and recognized as one of the top three employers in the region for women, as well as for policies on Inclusion & Diversity by ASSOCHAM (The Associated Chambers of Commerce & Industry of India). 

Furthermore, as KPMG in India, we were recognized as one of the ‘Best Companies for Millennials’ at The Millennial Max Conference 2019 presented by The LNOD Roundtable, as well as ‘the Great Indian Workplace’ at the Culture Summit and Great Indian Workplace Awards 2019.

Team Overview 

KPMG’s network of Data & Analytics professionals recognizes that analytics has the power to create great value. That is why they take a business-first perspective, helping solve complex business challenges using analytics that clients can trust.

Data & Analytics professionals focus on solving complex business issues across all the key drivers of organizational value, including growth, risk, and performance, and they are committed to precision and quality in everything they do.


Roles and Responsibilities

Designation: Assistant Manager
Reporting to: Jagadish Doki
Role type: Google Cloud Data Engineer
Employment type: Full-time


 

Job Requirements

Mandatory Skills

 

  • Bachelor’s or higher degree in Computer Science or a related discipline, or equivalent (minimum 7 years of work experience).
  • At least 4 years of consulting or client service delivery experience on Google Cloud Platform (GCP).
  • At least 4 years of experience developing data ingestion, data processing, and analytical pipelines for big data, relational databases such as Cloud SQL, and data warehouse solutions such as BigQuery.
  • Extensive experience providing practical direction on the use of GCP-native services.
  • Extensive hands-on experience implementing data ingestion, ETL, and data processing using GCP services: Google Cloud Storage (GCS), Dataflow, Cloud Functions, Cloud Composer, BigQuery, Cloud SQL, Pub/Sub, IoT Core, Dataproc, Dataprep, Bigtable, Firestore, etc.
  • At least 4 years of hands-on experience with GCP and big data technologies such as Java, Python, SQL, GCS, Apache Beam, PySpark, Spark SQL, and Dataproc, and with streaming technologies such as Pub/Sub and Dataflow.
  • Well versed in DevSecOps and CI/CD deployments.
  • Experience with cloud migration methodologies and processes, including tools such as Cloud Dataflow and Database Migration Service.
  • At least 4 years of RDBMS experience.
  • Experience using big data file formats and compression techniques.
  • Experience working with developer tools such as Cloud Build, IntelliJ IDEA, Git, Jenkins, etc.
  • Experience with private and public cloud architectures, their pros and cons, and migration considerations.
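To give candidates a flavor of the streaming skills listed above, here is a minimal, purely illustrative Python sketch of fixed-window event-time aggregation, the core idea behind windowed counts in Dataflow/Apache Beam. The function and sample data are hypothetical; a real pipeline would use the Beam SDK rather than an in-memory dict.

```python
from collections import defaultdict

def fixed_window_counts(events, window_secs):
    """Group (event_time, key) pairs into fixed event-time windows and
    count occurrences per (window_start, key) -- a simplified, in-memory
    sketch of the windowed aggregation Dataflow/Beam performs at scale."""
    counts = defaultdict(int)
    for event_time, key in events:
        # Assign each event to the window containing its event time.
        window_start = (event_time // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical click events: (unix_timestamp, user_id)
events = [(0, "a"), (30, "a"), (59, "b"), (60, "a"), (125, "b")]
print(fixed_window_counts(events, 60))
# -> {(0, 'a'): 2, (0, 'b'): 1, (60, 'a'): 1, (120, 'b'): 1}
```

Note that events are windowed by event time (when the click happened), not processing time; handling late-arriving data on top of this is where Beam's watermark and trigger machinery comes in.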

 

Primary Roles and Responsibilities 

 

A Google Cloud Data Engineer is responsible for designing, building, and maintaining the data infrastructure for an organization using Google Cloud Platform (GCP) services. This includes creating data pipelines, integrating data from various sources, and implementing data security and privacy measures. The Google Cloud Data Engineer will also be responsible for monitoring and troubleshooting data flows and optimizing data storage and processing for performance and cost efficiency.
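To make the extract-transform-load responsibilities above concrete, the following is a small, self-contained Python sketch of a staged pipeline. All names and the sample data are hypothetical; in practice the extract step would read from GCS or Pub/Sub and the load step would run a BigQuery load job or streaming insert.

```python
import csv
import io

def extract(raw_csv):
    """Extract: parse raw CSV rows (standing in for reading from GCS)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: clean and type records, dropping rows that fail validation."""
    out = []
    for r in rows:
        try:
            out.append({"user": r["user"].strip().lower(),
                        "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # a real pipeline would route these to a dead-letter store
    return out

def load(rows):
    """Load: return a summary; in production, a BigQuery write would go here."""
    return {"row_count": len(rows),
            "total_amount": sum(r["amount"] for r in rows)}

raw = "user,amount\n Alice ,10.5\nbob,not-a-number\nCAROL,4.5\n"
print(load(transform(extract(raw))))
# -> {'row_count': 2, 'total_amount': 15.0}
```

Keeping each stage a pure function, as above, is what makes pipelines like this easy to test, monitor, and re-run, which is most of the day-to-day troubleshooting work described in this role.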

Preferred Skills

  • DevOps on Google Cloud Platform (GCP).
  • Experience developing and deploying ETL solutions on Google Cloud. 
  • Familiarity with Google Cloud Data Catalog for metadata management, data governance, and data lineage.
  • Knowledge of Google Cloud IAM (Identity and Access Management), including access controls and security on Google Cloud.
  • Alignment with Google’s vision and roadmap for the latest tools and technologies in the market.
  • Knowledge of Google Cloud’s Vertex AI and machine learning capabilities, and supporting industry use cases.
  • Role-based Google Cloud certifications (e.g., Professional Data Engineer, Professional Cloud Architect, Associate Cloud Engineer, Professional Machine Learning Engineer).
  • Hands-on working knowledge of Google Looker Studio (formerly Google Data Studio) to create reports and dashboards and generate insights for business users.

 

Other Information

Number of interview rounds: 2
Mode of interview: Virtual
Job location: Bangalore
Clean room policy (specific to business): NA

Culture

  • Corporate Social Responsibility programs
  • Maternity and paternity leave
  • Opportunities to network and connect
  • Discounts on products and services

Note: Benefits/Perks listed above may vary depending on the nature of your employment with KPMG and the country where you work.

  1. © 2021 KPMG Global Services Private Limited, a company incorporated under the laws of India and a member firm of the KPMG global organization of independent member firms affiliated with KPMG International Limited (“KPMG International”), a private English company limited by guarantee. All rights reserved. The KPMG name and logo are trademarks used under license by the independent member firms of the KPMG global organization.
  2. The term “KGS” refers to the KGS Platform of Indian delivery entities, which consist of KPMG Global Services Private Limited (“KGSPL”), KPMG Global Services Management Private Limited (“KGSMPL”), KPMG Global Delivery Center Private Limited (“GDCPL”) and KPMG Resource Centre Private Limited (“KRCPL”), unless the specific private limited entity is specifically noted.
  3. Use in these materials of the term “Our”, or “Us” means KGSPL, KGSMPL, GDCPL and/or KRCPL as the case may be and all four such legal entities are referred to collectively as “KGS”.
  4. Note that Audit services for the clients of the US and Canadian firms are delivered from the GDCPL and for UK (and other firms except US and Canada) Audit services are delivered from KRCPL.

KPMG is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations and ordinances.


Desired Skills and Experience (Must Have):

  • Proficient experience designing, building, and operationalizing large-scale enterprise data solutions using at least four GCP services among Dataflow, Dataproc, Pub/Sub, BigQuery, Cloud Functions, Cloud Composer, GCS, Bigtable, Data Fusion, and Firestore.
  • Proficient hands-on programming experience in SQL, Python, and Spark/PySpark.
  • Proficient in building production-level ETL/ELT data pipelines from data ingestion to consumption.
  • Data engineering knowledge (such as data lakes, data warehouses, integration, and migration).
  • Excellent communicator (written and verbal, formal and informal).
  • Experience using software version control tools (Git/Bitbucket/CodeCommit).
  • Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
  • Ability to multi-task under pressure and work independently with minimal supervision.
  • Must be a team player who enjoys working in a cooperative and collaborative team environment.

Desired Skills and Experience (Good to Have):

  • GCP certification preferred.
  • Additional cloud experience in AWS or Azure preferred.

Qualification

  • Completed undergraduate degree with outstanding academic credentials (preferably a technical undergraduate degree, e.g., Computer Science or Engineering).

