DE&A - Core - Cloud Data Engineering - Databricks

Mumbai, Maharashtra, India

Zensar

Zensar is a global organization which conceptualizes, builds, and manages digital products through experience design, data engineering, and advanced analytics for over 200 leading companies. Our solutions leverage industry-leading platforms to...



Work closely with end-users and Data Analysts to understand the business and their data requirements
Carry out ad hoc data analysis and ‘data wrangling’ using Synapse Analytics and Databricks
Build dynamic, metadata-driven data ingestion patterns using Azure Data Factory and Databricks (see the sketch after this list)
Build and maintain the Enterprise Data Warehouse (using Data Vault 2.0 methodology)
Build and maintain business focused data products and data marts
Build and maintain Azure Analysis Services databases and cubes
Share support and operational duties within the wider engineering and data teams
Work with the Architecture and Engineering teams to deliver on these projects, and ensure that supporting code and infrastructure follow the best practices outlined by these teams
Help define test criteria to establish clear conditions for success and ensure alignment with business objectives.
Manage user stories and acceptance criteria through to production and into day-to-day support
Assist in the testing and validation of new requirements and processes to ensure they meet business needs
Stay up-to-date with industry trends and best practices in data engineering
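
For illustration, a minimal sketch of what a metadata-driven ingestion loop on Databricks could look like; the control table and column names below are hypothetical examples, not taken from the role description:

```python
# Minimal sketch of a metadata-driven ingestion loop on Databricks (PySpark).
# The control table "etl_control.ingestion_config" and its columns are
# hypothetical names used for illustration only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Each row of the control table describes one source-to-target load:
# a source path, a file format, and a target table.
for row in spark.table("etl_control.ingestion_config").collect():
    df = (
        spark.read
        .format(row["file_format"])       # e.g. "parquet" or "csv"
        .load(row["source_path"])         # e.g. an ADLS Gen2 abfss:// path
    )
    (
        df.write
        .format("delta")
        .mode("append")
        .saveAsTable(row["target_table"])  # e.g. "bronze.sales"
    )
```

In practice the same control table can be read by an Azure Data Factory pipeline to parameterise which Databricks jobs run, so new sources are onboarded by adding rows of metadata rather than new code.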

 

Core skills and knowledge

Excellent data analysis and exploration using T-SQL
Strong SQL programming (stored procedures, functions)
Extensive experience with SQL Server and SSIS
Knowledge and experience of data warehouse modelling methodologies (Kimball, dimensional modelling, Data Vault 2.0; see the hub-load sketch after this list)
Experience in Azure – one or more of the following: Data Factory, Databricks, Synapse Analytics, ADLS Gen2 
Experience in building robust and performant ETL processes
Experience building and maintaining Analysis Services databases and cubes (both multidimensional and tabular)
Experience using source control and Azure DevOps (ADO)
Understanding and experience of deployment pipelines
Excellent analytical and problem-solving skills, with the ability to think critically and strategically.
Strong communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels.
Always act with integrity and embrace the philosophy of treating customers fairly
Analytical, with the ability to arrive at solutions that fit current and future business processes
Effective written and verbal communication
Organisational skills: the ability to manage and coordinate their own work effectively
Ownership and self-motivation
Delivery focus
Assertive, resilient and persistent
Team oriented
Deals well with pressure and is highly effective at multi-tasking and juggling priorities
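
As context for the Data Vault 2.0 methodology referenced above, a minimal sketch of a hub load in PySpark; all table and column names are hypothetical examples:

```python
# Minimal sketch of a Data Vault 2.0 hub load in PySpark. The table and column
# names (staging.customer, raw_vault.hub_customer, customer_id) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# A hub holds one row per distinct business key, keyed by a hash of that key.
candidates = (
    spark.table("staging.customer")
    .select("customer_id")
    .dropDuplicates(["customer_id"])
    .withColumn("hub_customer_hk", F.sha2(F.col("customer_id").cast("string"), 256))
    .withColumn("load_date", F.current_timestamp())
    .withColumn("record_source", F.lit("staging.customer"))
)

# Insert only business keys that are not already present in the hub.
existing = spark.table("raw_vault.hub_customer").select("hub_customer_hk")
new_keys = candidates.join(existing, "hub_customer_hk", "left_anti")

new_keys.write.format("delta").mode("append").saveAsTable("raw_vault.hub_customer")
```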
Other attributes that would be helpful, but are not essential for the role:

Deeper programming ability (C#, .NET Core)
Experience building ‘infrastructure-as-code’ deployment pipelines
Asset Finance knowledge
Vehicle Finance knowledge
ABL and Working Capital knowledge
Any financial services and banking experience

