Azure Data Engineer
Bengaluru Campus 31
Haleon
We are a world-leading consumer health company with a clear purpose to deliver better everyday health with humanity. Hello. We’re Haleon. A new world-leading consumer health company. Shaped by all who join us. Together, we’re improving everyday health for billions of people. By growing and innovating our global portfolio of category-leading brands – including Sensodyne, Panadol, Advil, Voltaren, Theraflu, Otrivin, and Centrum – through a unique combination of deep human understanding and trusted science. What’s more, we’re achieving it in a company that we’re in control of. In an environment that we’re co-creating. And a culture that’s uniquely ours. Care to join us. It isn’t a question.
With category-leading brands such as Sensodyne, Voltaren and Centrum, built on trusted science and human understanding, and combined with our passion, knowledge and expertise, we’re uniquely placed to do this and to grow a strong, successful business.
This is an exciting time to join us and help shape the future. It’s an opportunity to be part of something special.
About the role
You will be working on the latest data platform technologies, and a part of a high-performing team fully committed to employee growth and development. If you’re a skilled engineer who wants to build using the latest technology and acquire new skills in data and analytics, then come join our team.
You will be integrating components, designing frameworks, and building reusable data solutions for Haleon’s Enterprise Data & Analytics Platform. You will enable faster acquisition, ingestion, and curation of analytics-ready datasets.
This is a very hands-on development role which involves developing and delivering code from origin through to production, plus working in partnership with 3rd party development service providers to help ensure that code comes in on time, to quality, and in line with the overall ecosystem being established.
The Data Engineer will directly contribute to the extensive and varied build and deployment activities involved in establishing the new platform, then continue to work on the already significant and growing pipeline of future buildouts of platform services on the Enterprise Data & Analytics Platform.
Key responsibilities
Development
• Hands-on, sleeves-up development and delivery of code from origin through to production, plus working in partnership with 3rd party development service providers to help ensure that performant, quality code is written in line with the overall ecosystem being established.
• The Data Engineer will directly contribute to the extensive and varied build and deployment activities involved in establishing the new products/platform, then continue to work on the already significant and growing pipeline of future buildouts of platform services on the Enterprise Data & Analytics Platform.
Qualifications and Essential Skills
• MS/BS degree in Computer Science, Engineering or Data Science, or 3 to 6 years of equivalent experience, with preference given to experience and a proven track record. The ideal candidate will have an impressive hands-on work history in an advanced, recognized, and innovative environment.
• Data engineering experience and seasoned coding skills in the relevant languages: PySpark, SQL, Python, etc.
• Experience with the Azure data and analytics stack: Azure Databricks, Azure Data Factory (SHIR, etc.), Synapse, Key Vault, Logic Apps, Unity Catalog (UC), ADLS Gen2, etc.
• Experience with Agile delivery frameworks and tools, within a highly structured and robust delivery management and reporting methodology: SAFe, Jira, Confluence, Azure DevOps (Pipelines, self-hosted agents, etc.), GitHub, delivery breakdown and estimation, etc.
• Fully conversant with big-data processing approaches and “schema-on-read” methodologies. Preference for a deep understanding of Spark, Databricks and Delta Lake, and for applying them to solve data science and machine learning business problems.
• Fluency with data, platform, and reliability engineering patterns for operational resilience, and with co-contribution development patterns.
• Ability to develop and optimise Spark code for large volumes of data.
• Familiarity with deploying enterprise analytics solutions at scale, with applicable services: administration, qualification, and user access provisioning.
• Experience articulating business value of analytics projects and progressing solutions from MVP to scaled-up production solutions.
• Production experience delivering CI/CD pipelines across Azure and vendor products.
• Knowledge of data modelling and Purview, and their application.
• Ability to work in close partnership with groups across the IT organization (security, compliance, infrastructure, etc.) and business stakeholders in the commercial organizations.
• Ability to develop and maintain productive working relationships with suppliers and specialist technology providers to assemble and maintain a distinctive and flexible mix of capabilities against future requirements.
• The ideal candidate possesses great communication skills and the ability to communicate inherently complicated technical concepts to non-technical stakeholders at all levels.
Good To Have Skills
• Knowledge of GitHub Actions, Power BI, Power Apps, Azure Synapse and Scala.
• Experience with visualization tools and their application: developing reports, dashboards and KPI scorecards.
Delivery
• Ensure project goals are achieved on time and in alignment with stakeholders’ expectations.
• Ability to work on complex projects and in a distributed environment.
• Escalate when necessary and in a timely manner.
• Work in close collaboration with other team members in the Enterprise Data & Analytics Platform team, to ensure Development/Delivery aspects are well represented in the project’s requirements and deliverables.
Methodology
• Incorporate agile ways of working into the delivery process, utilising DABL (Discovery, Alpha, Beta, Launch).
• Individuals will work as part of product-centric delivery team(s) that will focus on delivering value independently while fully embracing integrated DevOps approaches.
Ownership
• Take ownership of delivery/development projects and help steer them to completion.
Governance
• Maintain governance that allows projects and stakeholders to manage overall project performance and programme risks, accounting for the global nature of some of the programmes.
Forward looking
• Remain flexible towards technology approaches to ensure we are taking advantage of new technologies.
• Keep abreast of industry developments in analytics and be able to interpret how these would impact services and present new opportunities.
Quality, Risk & Compliance
• Ensure all risk and issues associated with owned projects are recorded and managed in the appropriate Risk & Issue logs in a timely manner.
• Ensure all Risks and Issues have clear action/mitigation/contingency plans defined, with named action owners and timelines for completion.
Technical Architecture
• Be conversant with technical architecture to contribute to design discussions in partnership with the Delivery/Development Lead and dedicated Analytics & Data Architect.
Care to join us. Find out what life at Haleon is really like at www.haleon.com/careers/
At Haleon we embrace our diverse workforce by creating an inclusive environment that celebrates our unique perspectives, generates curiosity to create unmatched understanding of each other, and promotes fair and equitable outcomes for everyone. We're striving to create a climate where we celebrate our diversity in all forms by treating each other with respect, listening to different viewpoints, supporting our communities, and creating a workplace where your authentic self belongs and thrives. We believe in an agile working culture for all our roles. If flexibility is important to you, we encourage you to explore with our hiring team what the opportunities are.
As you apply, we will ask you to share some personal information, which is entirely voluntary. We want to have an opportunity to consider a diverse pool of qualified candidates and this information will assist us in meeting that objective and in understanding how well we are doing against our inclusion and diversity ambitions. We would really appreciate it if you could take a few moments to complete it. Rest assured, Hiring Managers do not have access to this information and we will treat your information confidentially.
Haleon is an Equal Opportunity Employer. All qualified applicants will receive equal consideration for employment without regard to race, color, national origin, religion, sex, pregnancy, marital status, sexual orientation, gender identity/expression, age, disability, genetic information, military service, covered/protected veteran status or any other federal, state or local protected class.
Please note that if you are a US Licensed Healthcare Professional or Healthcare Professional as defined by the laws of the state issuing your license, Haleon may be required to capture and report expenses Haleon incurs, on your behalf, in the event you are afforded an interview for employment. This capture of applicable transfers of value is necessary to ensure Haleon’s compliance to all federal and state US Transparency requirements.