Senior Data Engineer (Databricks)
Pune - Maharashtra
Allata
A global strategic tech innovation consulting company with locations in Dallas, Phoenix, Boise, Argentina, and India.
Allata is a global consulting and technology services firm with offices in the US, India, and Argentina. We help organizations accelerate growth, drive innovation, and solve complex challenges by combining strategy, design, and advanced technology. Our expertise covers defining business vision, optimizing processes, and creating engaging digital experiences. We architect and modernize secure, scalable solutions using cloud platforms and top engineering practices.
Allata also empowers clients to unlock data value through analytics and visualization and leverages artificial intelligence to automate processes and enhance decision-making. Our agile, cross-functional teams work closely with clients, either integrating with their teams or providing independent guidance, to deliver measurable results and build lasting partnerships.
We are seeking a skilled Data Engineer with strong expertise in Databricks and Delta Live Tables to guide and drive the evolution of our client's data ecosystem.
The ideal candidate shows strong technical leadership and owns hands-on execution, leading the design, migration, and implementation of robust data solutions while mentoring team members and collaborating with stakeholders to achieve enterprise-wide analytics goals.
Key Responsibilities
~ Collaborate in defining the overall architecture of the solution; design, build, and maintain reusable data products using Databricks, Delta Live Tables (DLT), PySpark, and SQL (illustrative sketches follow this section).
~ Migrate existing data pipelines to modern frameworks and ensure scalability and efficiency.
~ Develop the data infrastructure, pipeline architecture, and integration solutions while actively contributing to hands-on implementation.
~ Build and maintain scalable, efficient data processing pipelines and solutions for data-driven applications.
~ Monitor and ensure adherence to data security, privacy regulations, and compliance standards.
~ Troubleshoot and resolve complex data-related challenges and incidents in a timely manner.
~ Stay at the forefront of emerging trends and technologies in data engineering and advocate for their integration when relevant.
Required Skills & Qualifications
~ Proven expertise in Databricks, Delta Live Tables, SQL, and PySpark for processing and managing large data volumes.
~ Strong experience in designing and implementing dimensional models and medallion architecture.
~ Strong experience in designing and migrating existing Databricks workspaces and models to Unity Catalog-enabled workspaces.
~ Strong experience creating and managing group Access Control Lists (ACLs) and compute and governance policies in Databricks Unity Catalog.
~ Hands-on experience with modern data pipeline tools (e.g., AWS Glue, Azure Data Factory) and cloud platforms (e.g., Databricks).
~ Knowledge of cloud data lakes (e.g., Databricks Delta Lake, Azure Storage, and/or AWS S3).
~ Demonstrated experience applying DevOps principles to data engineering projects, using version control and CI/CD for IaC and code base deployments (e.g., Azure DevOps, Git).
~ Strong experience with batch and streaming data processing techniques and file compaction strategies.
Nice-to-Have Skills
~ Familiarity with architectural best practices for building data lakes.
~ Hands-on experience with additional Azure services, including Message Queues, Service Bus, Cloud Storage, Virtual Cloud, Serverless Compute, and CloudSQL.
~ Experience with OOP languages and frameworks.
~ Experience with BI tools (e.g., Power BI, Tableau) and deploying data models.
~ Experience with Databricks Unity Catalog, i.e., configuring and managing data governance and access controls in a Delta Lake environment.
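To give a concrete sense of the DLT and medallion-architecture work described above, here is a minimal, illustrative Delta Live Tables sketch of a bronze-to-silver flow in PySpark. The landing path, table names, columns, and expectation rule are hypothetical placeholders, not a description of any existing client pipeline.

```python
# Illustrative DLT pipeline sketch: bronze -> silver medallion flow.
# All paths, table names, and columns below are hypothetical examples.
import dlt
from pyspark.sql import functions as F

LANDING_PATH = "/mnt/landing/orders"  # hypothetical raw-file location


@dlt.table(comment="Bronze: raw order events ingested incrementally with Auto Loader.")
def orders_bronze():
    # `spark` is provided by the DLT runtime inside a pipeline notebook.
    return (
        spark.readStream.format("cloudFiles")          # Auto Loader incremental ingestion
        .option("cloudFiles.format", "json")
        .load(LANDING_PATH)
    )


@dlt.table(comment="Silver: cleaned, typed orders ready for dimensional modelling.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # data-quality expectation
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .select("order_id", "customer_id", "order_ts", "amount")
    )
```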
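Likewise, the Unity Catalog governance and file-compaction items might look roughly like the sketch below. The catalog, schema, table, and group names are made up for illustration only.

```python
# Illustrative Unity Catalog governance and Delta compaction commands.
# Catalog, schema, table, and group names are hypothetical examples.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Grant a group read access through the catalog -> schema -> table hierarchy.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `data_analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.sales TO `data_analysts`")
spark.sql("GRANT SELECT ON TABLE main.sales.orders_silver TO `data_analysts`")

# Compact small files and co-locate a frequently filtered column.
spark.sql("OPTIMIZE main.sales.orders_silver ZORDER BY (customer_id)")
```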
At Allata, we value differences.
Allata is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Allata makes employment decisions without regard to race, color, creed, religion, age, ancestry, national origin, veteran status, sex, sexual orientation, gender, gender identity, gender expression, marital status, disability or any other legally protected category.
This policy applies to all terms and conditions of employment, including but not limited to, recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.