Senior Engineer (SA1 PD - PBI, Databricks, MS BI)

Bangalore, Karnataka, India

KPMG India



Roles & responsibilities

Role Overview: The Senior Associate 1 - "Data Engineer with Power BI expertise" will be part of the GDC Technology Solutions (GTS) team, working in a technical role in the Audit Data & Analytics domain that requires developing expertise in KPMG's proprietary D&A (Data & Analytics) tools and audit methodology. He/she will join the team responsible for extracting and processing datasets from client ERP systems (SAP, Oracle, Microsoft Dynamics) and other sources, delivering insights to Audit and internal teams through data warehousing, ETL, and dashboarding solutions, and developing solutions using a variety of tools and technologies.

The Senior Associate 1 - “Data Engineer with Power BI expertise” will be predominantly responsible for:

Data Engineering

·Design, develop, and maintain scalable and efficient data pipelines to process large datasets from various sources using Azure Data Factory (ADF) and Databricks.
·Design, implement, and optimize end-to-end data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data.
·Integrate data from multiple sources and ensure data consistency, quality, and accuracy, leveraging Azure Data Lake Storage (ADLS).
·Develop and manage data warehouses to store and organize large volumes of structured and unstructured data using Azure Synapse Analytics and other relevant Azure services.
·Design and implement ETL (Extract, Transform, Load) processes to ensure seamless data flow across systems.
·Design and maintain data models, schemas, and database structures to support analytical and operational use cases.
·Optimize data storage and retrieval mechanisms for performance and scalability, and to reduce latency.
·Evaluate and implement data storage solutions, including relational databases, NoSQL databases, data lakes, and cloud storage services.
·Build and maintain integrations with internal and external data sources and APIs.
·Implement RESTful APIs and web services for data access and consumption.
·Ensure compatibility and interoperability between different systems and platforms.
·Use Databricks notebooks to build and manage data transformations, create tables, and ensure data quality and consistency, leveraging Unity Catalog for data governance and a unified data view across the organization (see the sketch after this list).
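To make the Databricks portion of the role concrete, here is a minimal sketch of the kind of notebook transformation described in the last item above. It assumes a Databricks environment (where a SparkSession is available) with Unity Catalog enabled; the ADLS path and the audit_catalog.erp.gl_entries_clean table name are hypothetical placeholders, not actual KPMG systems.

```python
# Sketch of a Databricks notebook cell: ingest raw ERP extracts from ADLS,
# apply basic quality rules, and register the result as a governed Delta
# table in Unity Catalog. All paths and names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # pre-provided in Databricks

# Read raw journal-entry extracts landed in ADLS (placeholder path).
raw = spark.read.parquet(
    "abfss://raw@examplelake.dfs.core.windows.net/erp/gl_entries/"
)

# Consistency/quality rules: drop rows missing keys, normalize types,
# and remove duplicate line items.
clean = (
    raw
    .filter(F.col("document_id").isNotNull() & F.col("posting_date").isNotNull())
    .withColumn("posting_date", F.to_date("posting_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["document_id", "line_item"])
)

# Persist as a Delta table under Unity Catalog (catalog.schema.table),
# making it discoverable and governed across the workspace.
(
    clean.write.format("delta")
    .mode("overwrite")
    .saveAsTable("audit_catalog.erp.gl_entries_clean")
)
```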

Power BI

·Design and develop visuals, reports, and dashboard solutions in Power BI with row-level security (RLS) and other features, as per business needs.
·Connect to various data sources to load and transform data, implementing complex joins across multiple tables using Power Query.
·Write DAX expressions to implement complex business calculations and data modeling requirements.
·Develop standard data models such as star or snowflake schemas with multiple fact and dimension tables (see the sketch after this list).
·Apply a strong understanding of data structures and solid data analysis skills to identify trends, patterns, and other data issues within a dataset, and tell a clear story with the given data.
·Prepare technical specifications and documentation for solutions.
·Understand business processes, translate business logic into visual representations quickly and accurately, and perform end-to-end data validations.
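As a small illustration of the star-schema modeling mentioned above (a generic sketch, not KPMG's actual model), the following pandas snippet splits a flat transactions extract into one dimension table and one fact table; all table and column names are hypothetical.

```python
# Sketch: derive a star schema (one fact table plus a customer dimension)
# from a flat transactions extract. All column names are hypothetical.
import pandas as pd

flat = pd.DataFrame({
    "invoice_id":    ["INV-1", "INV-2", "INV-3"],
    "customer_id":   ["C001", "C002", "C001"],
    "customer_name": ["Acme Ltd", "Globex", "Acme Ltd"],
    "region":        ["South", "West", "South"],
    "amount":        [1200.0, 450.0, 800.0],
})

# Dimension: one row per customer, carrying descriptive attributes.
dim_customer = (
    flat[["customer_id", "customer_name", "region"]]
    .drop_duplicates("customer_id")
    .reset_index(drop=True)
)

# Fact: measures plus foreign keys only; descriptive columns live
# in the dimension table.
fact_sales = flat[["invoice_id", "customer_id", "amount"]]

# In Power BI, fact_sales.customer_id would relate to
# dim_customer.customer_id (one-to-many), and DAX measures
# (e.g. a SUM over amount) would be defined on the fact table.
print(dim_customer)
print(fact_sales)
```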

Be proactive and engaged in bringing new ideas and solutions.

Job Requirements

Technical Skills

·Minimum 4 to 6 years of experience in data engineering or related roles, with a strong background in designing and implementing data pipelines and infrastructure, and in Power BI development.
·Proficiency in SQL, Python, or other programming languages such as Java or Scala.
·Hands-on experience with cloud platforms such as AWS, Azure, or GCP.
·Experience with data modeling, ETL tools, and workflow orchestration frameworks.
·Expertise in Power BI, including DAX and Power Query.
·Hands-on experience with Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), and Databricks.
·Experience with big data technologies such as Hadoop, Spark, or Kafka.
·Familiarity with data warehousing concepts and technologies (e.g., Redshift, BigQuery, Snowflake).
·Strong SQL skills and experience working with relational and non-relational databases.
·Experience with Databricks notebooks for building transformations and creating tables.

Enabling Skills

·Excellent analytical, problem-solving, and troubleshooting abilities.
·Critical thinking: able to look at numbers, trends, and data and draw new conclusions from the findings.
·Attention to detail and a good team player.
·Quick learning ability and adaptability.
·Willingness and ability to deliver within tight timelines.
·Effective communication skills.
·Flexibility in work timings and willingness to work on different projects and technologies.

Collaborate with business stakeholders to understand data requirements and deliver solutions

Education Requirements

·B.Tech/B.E./MCA (Computer Science / Information Technology).
·Minimum 4 to 6 years of experience in data engineering or related roles, with a strong background in designing and implementing data pipelines and infrastructure, and in Power BI development.

Perks/benefits: Flex hours

Region: Asia/Pacific
Country: India
