Senior Consultant | Azure Data Engineer | Bengaluru | Engineering

Bengaluru, IN

Deloitte

Insights into our services in Audit, Consulting, Financial Advisory, Risk Advisory, and Tax, as well as the many industries we serve.



We are seeking an experienced Azure Data Engineer to join our team as a Senior Consultant. In this role, you will focus on developing and implementing robust data solutions using Microsoft Azure technologies. You will work closely with cross-functional teams to deliver high-quality data platforms that drive business value for our clients.


Key Responsibilities


1. Azure Data Solution Development

 - Develop scalable, high-performance data solutions using Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake Storage

 - Implement and optimize data pipelines for batch and real-time data processing

 - Write efficient and maintainable code for data transformation and integration processes (see the sketch below)
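
To make this kind of work concrete, here is a minimal PySpark sketch of a batch ingest-and-transform step, assuming a Databricks environment with access to an ADLS Gen2 account; the storage account, container, and column names are hypothetical, not part of this role's actual codebase.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Databricks notebook `spark` already exists; getOrCreate() simply reuses it.
spark = SparkSession.builder.getOrCreate()

# Hypothetical ADLS Gen2 landing zone; in practice the path would be a pipeline parameter.
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/orders/"

# Batch ingest: read raw CSV files landed by Azure Data Factory.
orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(raw_path)
)

# Basic transformation: typed dates, a derived measure, and deduplication by business key.
curated = (
    orders
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .withColumn("net_amount", F.col("amount") - F.col("discount"))
    .dropDuplicates(["order_id"])
)

# Persist to the curated zone as Delta for downstream Synapse/BI consumption.
curated.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplestorage.dfs.core.windows.net/sales/orders/"
)
```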


2. Data Integration and Transformation

 - Develop complex ETL/ELT processes to integrate data from various sources

 - Implement advanced data quality checks and data cleansing procedures (see the sketch after this list)

 - Create and optimize data models to support analytical and operational needs
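
As an illustration of the data-quality work listed above, the following sketch applies rule-based checks and routes failing rows to a quarantine location; the rules, tables, and paths are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical curated orders table produced by an upstream step.
orders = spark.read.format("delta").load(
    "abfss://curated@examplestorage.dfs.core.windows.net/sales/orders/"
)

# Declarative data-quality rules: each rule is a boolean column expression.
rules = {
    "order_id_not_null": F.col("order_id").isNotNull(),
    "net_amount_non_negative": F.col("net_amount") >= 0,
    "order_date_present": F.col("order_date").isNotNull(),
}

# A row is valid only if every rule evaluates to true; NULL results count as failures.
passes_all = F.lit(True)
for rule in rules.values():
    passes_all = passes_all & rule
passes_all = F.coalesce(passes_all, F.lit(False))

valid = orders.filter(passes_all)
rejected = orders.filter(~passes_all)

# Quarantine rejected rows for investigation instead of silently dropping them.
rejected.write.format("delta").mode("append").save(
    "abfss://quarantine@examplestorage.dfs.core.windows.net/sales/orders/"
)
```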


3. Performance Optimization

 - Tune and optimize SQL queries and data processing jobs for maximum efficiency

 - Implement partitioning, indexing, and caching strategies to improve data access performance

 - Optimize Spark jobs and Databricks notebooks for large-scale data processing (see the sketch below)
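
A short sketch of the optimization techniques above (broadcast joins, caching, and partition pruning), reusing the same hypothetical Delta tables as earlier examples:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.read.format("delta").load(
    "abfss://curated@examplestorage.dfs.core.windows.net/sales/orders/"
)
# Small dimension table: a good candidate for a broadcast join.
customers = spark.read.format("delta").load(
    "abfss://curated@examplestorage.dfs.core.windows.net/sales/customers/"
)

# Broadcasting the small dimension avoids shuffling the large fact table.
enriched = orders.join(F.broadcast(customers), on="customer_id", how="left")

# Cache the joined result because two outputs below are derived from it.
enriched.cache()

# Output 1: fact table partitioned by date, so queries filtering on order_date
# benefit from partition pruning instead of scanning the full table.
enriched.write.format("delta").mode("overwrite").partitionBy("order_date").save(
    "abfss://serving@examplestorage.dfs.core.windows.net/sales/orders_enriched/"
)

# Output 2: a pre-aggregated daily revenue table for reporting workloads.
(enriched.groupBy("order_date")
    .agg(F.sum("net_amount").alias("revenue"))
    .write.format("delta").mode("overwrite").save(
        "abfss://serving@examplestorage.dfs.core.windows.net/sales/daily_revenue/"
    ))

enriched.unpersist()
```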


4. Data Pipeline Automation

 - Develop automated data pipelines using Azure Data Factory and other Azure services

 - Implement error handling, logging, and monitoring within data pipelines

 - Create reusable components and parameterized templates for data workflows (see the sketch below)
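
The sketch below shows the error handling, logging, and parameterization patterns referenced above as a standalone PySpark job that an orchestrator (for example, an Azure Data Factory activity) could invoke; the argument names and paths are hypothetical.

```python
import argparse
import logging

from pyspark.sql import SparkSession

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("orders_load")


def run_load(source_path: str, target_path: str) -> None:
    spark = SparkSession.builder.getOrCreate()
    log.info("Starting load from %s", source_path)
    try:
        df = spark.read.format("delta").load(source_path)
        log.info("Read %d rows", df.count())
        df.write.format("delta").mode("overwrite").save(target_path)
        log.info("Load into %s completed", target_path)
    except Exception:
        # Log and re-raise so the orchestrator marks the run as failed and can
        # apply its retry and alerting policy.
        log.exception("Load failed")
        raise


if __name__ == "__main__":
    # Parameters are supplied by the orchestrator at run time.
    parser = argparse.ArgumentParser()
    parser.add_argument("--source-path", required=True)
    parser.add_argument("--target-path", required=True)
    args = parser.parse_args()
    run_load(args.source_path, args.target_path)
```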


5. Collaboration and Technical Leadership

 - Work closely with data scientists and analysts to implement data preparation and feature engineering processes

 - Provide technical guidance and mentorship to junior developers

 - Contribute to technical design discussions and architectural decisions


Qualifications


- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field

- 5+ years of experience in data engineering, with at least 3 years focused on Azure data technologies

- Strong proficiency in Azure data services, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake Storage

- Expert-level skills in SQL and experience with SQL performance tuning

- Advanced proficiency in Python, Scala, or PySpark for data processing and transformation

- Extensive experience with big data technologies like Hadoop, Spark, and Hive

- Strong understanding of data warehousing concepts and dimensional modeling

- Experience with version control systems (e.g., Git) and CI/CD pipelines for data projects

- Excellent problem-solving skills and attention to detail

- Strong communication skills and ability to work effectively in a team environment

- Azure certifications (e.g., Azure Data Engineer Associate) are highly desirable


Additional Skills


- Experience with Delta Lake and data lakehouse architectures

- Familiarity with stream processing technologies (e.g., Azure Stream Analytics, Kafka)

- Knowledge of data governance and data quality management practices

- Experience with Azure Synapse Analytics and its integration with other Azure services

- Familiarity with Agile development methodologies

- Understanding of machine learning concepts and experience implementing ML pipelines is a plus





