Senior Engineer (SA1 DES - Azure Data Engineer)

Bangalore, Karnataka, India

KPMG India


Roles & responsibilities

Role Overview: The Senior Associate 1 - "Azure Data Engineer with Gen AI/Fabric skills" will be part of the GDC Technology Solutions (GTS) team, working in a technical role in the Audit Data & Analytics domain that requires developing expertise in KPMG proprietary D&A (Data and Analytics) tools and audit methodology. He/she will be part of the team responsible for extracting and processing datasets from client ERP systems (SAP/Oracle/Microsoft Dynamics) or other sources, providing insights to Audit and internal teams through data warehousing, ETL, and dashboarding solutions, and will be involved in developing solutions using a variety of tools and technologies.

The Senior Associate 1 - “Azure Data Engineer” will be predominantly responsible for:

Data Engineering (Primary Skills)

· Azure data engineering expertise with hands-on experience across all major Azure resources, including development and debugging of programs in Databricks.
· Utilize Azure Databricks notebooks to build and manage data transformations, create tables, and ensure data quality and consistency; leverage Unity Catalog for data governance and for maintaining a unified data view across the organization (an illustrative sketch follows this list).
· Design, develop, and maintain scalable and efficient data pipelines to process large datasets from various sources using Azure Data Factory (ADF).
· Integrate data from multiple data sources and ensure data consistency, quality, and accuracy, leveraging Azure Data Lake Storage (ADLS).
· Develop and manage data warehouses to store and organize large volumes of structured and unstructured data using Azure Synapse Analytics and other relevant Azure services.
· Design and implement ETL (Extract, Transform, Load) processes on the Azure cloud platform to ensure seamless data flow across systems.
· Work experience with Microsoft Fabric is an added advantage.
· Enthusiasm to learn, adapt, and integrate Gen AI into business processes, with experience working on Azure AI services.
· Optimize data storage and retrieval processes to enhance system performance and reduce latency.
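
As a rough, hypothetical sketch of the kind of notebook work described above (not a KPMG-specific implementation), the snippet below reads raw ERP extracts from an ADLS container, applies a simple PySpark cleanup, and writes a Delta table registered in Unity Catalog. The storage path, column names, and catalog/schema/table names are placeholder assumptions.

```python
# Minimal Azure Databricks notebook sketch (PySpark) -- illustrative only.
# `spark` is the SparkSession pre-provided in Databricks notebooks.
# The ADLS path and Unity Catalog names below are hypothetical placeholders.
from pyspark.sql import functions as F

RAW_PATH = "abfss://raw@exampleadls.dfs.core.windows.net/erp/invoices/"  # hypothetical container

# Read raw ERP extracts (e.g. CSV exports) from the data lake
raw_df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(RAW_PATH)
)

# Basic cleanup: standardise column names, cast amounts, parse dates, drop duplicates
clean_df = (
    raw_df
    .withColumnRenamed("Invoice Amount", "invoice_amount")
    .withColumn("invoice_amount", F.col("invoice_amount").cast("decimal(18,2)"))
    .withColumn("posting_date", F.to_date("Posting Date", "yyyy-MM-dd"))
    .dropDuplicates(["Invoice Number"])
)

# Persist as a Delta table registered in Unity Catalog (catalog.schema.table are placeholders)
(
    clean_df.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("audit_catalog.erp_silver.invoices")
)
```

In practice a transformation like this would typically be parameterised and orchestrated from Azure Data Factory or a Databricks job rather than run ad hoc.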

Power BI (Secondary Skills)

· Design and develop visuals, reports, and dashboard solutions in Power BI with row-level security (RLS) and other features as per business needs.
· Hands-on experience connecting to various data sources to load and transform data, and implementing complex joins across multiple tables using Power Query.
· Highly skilled in writing DAX expressions to implement complex business calculations and data modeling needs.
· Capable of developing standard data models such as star or snowflake schemas with multiple fact and dimension tables (see the sketch after this list).
· Strong understanding of data structures; data analysis skills in identifying trends, patterns, and other data issues within a dataset; good storytelling with the given data.
· Prepare technical specifications and documentation for solutions.
· Understand business processes and translate business logic into visual representations quickly and accurately; perform end-to-end data validations.
· Be proactive and engaged in bringing new ideas and solutions.
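
The reporting work above is done with DAX and Power Query inside Power BI; purely as a hedged illustration of the star-schema and end-to-end validation ideas mentioned, the sketch below checks referential integrity between a hypothetical fact table and its dimension tables in PySpark. All table and column names are invented for the example.

```python
# Illustrative star-schema referential-integrity check (PySpark) -- names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-validation").getOrCreate()

# Hypothetical fact and dimension tables of a star schema
fact_sales = spark.table("audit_catalog.reporting.fact_sales")
dim_customer = spark.table("audit_catalog.reporting.dim_customer")
dim_date = spark.table("audit_catalog.reporting.dim_date")

def orphan_count(fact, dim, key):
    """Count fact rows whose key has no matching row in the dimension table."""
    return fact.join(dim, on=key, how="left_anti").count()

checks = {
    "customer_key": orphan_count(fact_sales, dim_customer, "customer_key"),
    "date_key": orphan_count(fact_sales, dim_date, "date_key"),
}

for key, orphans in checks.items():
    status = "OK" if orphans == 0 else "FAIL"
    print(f"{status}: {orphans} fact rows with no matching {key} in its dimension")
```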

Job Requirements

Technical Skills

Primary Skills:

· Minimum 4-6 years of experience in Data Engineering.
· Proficiency in SQL and in Python/PySpark notebook development.
· Strong knowledge of ETL tools and processes.
· Hands-on experience with Azure Databricks, Azure Data Factory (ADF), and Azure Data Lake Storage (ADLS).
· Comprehensive knowledge of Azure cloud services.
· Experience with Databricks notebooks for building transformations and creating tables.

Secondary Skills:

· Expertise in Power BI, including DAX and Power Query.

Enabling Skills

· Excellent analytical, problem-solving, and troubleshooting abilities.
· Critical thinking: able to look at numbers, trends, and data and reach new conclusions based on findings.
· Attention to detail and a good team player.
· Quick learning ability and adaptability.
· Willingness and capability to deliver within tight timelines.
· Effective communication skills.
· Flexibility with work timings and willingness to work across different projects/technologies.
· Collaborate with business stakeholders to understand data requirements and deliver solutions.

Education Requirements

· B.Tech/B.E./MCA (Computer Science / Information Technology)

Category: Engineering Jobs

Tags: Azure Computer Science Data analysis Databricks Data governance Data pipelines Data quality Data Warehousing Engineering ETL Generative AI Oracle Pipelines Power BI PySpark Python Snowflake SQL Unstructured data

Perks/benefits: Flex hours

Region: Asia/Pacific
Country: India
