Assistant Manager Digital
Asia Pacific-India-Karnataka-Bangalore
Kenvue
Everyday care is a powerful catalyst in making you feel better, inside and out. Learn about the iconic brands, products, people, and history that make up Kenvue.
Description
LEAD ENGINEER (DIGITAL)
Location: Bangalore
Skills Required: Azure Data Factory, Databricks, SQL, Azure MLOps, Azure DevOps, Python, PySpark, Scala
Good scripting and programming skills.
Experience: 5 to 9 years
About Kenvue:
Kenvue is the world’s largest pure-play consumer health company by revenue. Built on more than a century of heritage, our iconic brands, including Aveeno®, Johnson’s®, Listerine®, and Neutrogena®, are science-backed and recommended by healthcare professionals around the world. At Kenvue, we believe in the extraordinary power of everyday care, and our teams work every day to put that power in consumers’ hands and earn a place in their hearts and homes.
Position Summary
- Subject Matter Expert for Azure Data, Analytics & AI; experience with Azure architecture design is preferred. Good understanding of migration and modernization strategies and approaches.
- Help the business migrate and/or build Data, Analytics & AI workloads throughout their Azure journey toward production, expanding Azure infrastructure usage.
- Partner with Azure engineering subject matter experts, including project managers and business teams, to scope and build customer-facing content, modules, tools, and proofs of concept.
- Stay current on the latest Azure/cloud innovations in order to conduct experiments, drive product improvement, and act as an Azure subject matter trainer for other engineers.
- Develop, customize, and manage integration tools, databases, warehouses, and analytical systems using data-related instruments/instances.
- Create and run complex queries and automation scripts for operational data processing.
- Test the reliability and performance of each part of a system and cooperate with the testing team.
- Deploy data models into production environments. This entails feeding the model data stored in a warehouse or coming directly from sources, configuring data attributes, managing compute resources, setting up monitoring tools, etc. (a minimal deployment sketch follows this list).
- Set up tools to view data, generate reports, and create visuals.
- Monitor the overall performance and stability of the system; adjust and adapt the automated pipeline as data, models, and requirements change.
- Mentor and train colleagues where necessary by helping them learn and improve their skills, and innovate and iterate on best practices.
- Additional responsibilities may be added based on organizational and project priorities.
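For illustration only, and not part of the original posting, the sketch below shows one way the model-deployment duty above might look on Azure Databricks: a model registered in MLflow is loaded as a Spark UDF and applied to a warehouse table, with scores written back for downstream use. The model URI, table names, and columns are hypothetical placeholders.

    # Hypothetical sketch: batch-score a warehouse table with a registered MLflow model.
    # The model URI, table names, and columns are placeholders, not from this posting.
    import mlflow
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("batch-scoring").getOrCreate()

    # Load a registered model as a Spark UDF (MLflow pyfunc flavor).
    predict_udf = mlflow.pyfunc.spark_udf(spark, model_uri="models:/churn_model/Production")

    # Read input features from the warehouse and score them.
    features = spark.table("analytics.customer_features")
    feature_cols = [c for c in features.columns if c != "customer_id"]
    scored = features.withColumn("score", predict_udf(*feature_cols))

    # Persist scores for downstream reporting and monitoring.
    scored.write.mode("overwrite").saveAsTable("analytics.customer_scores")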
Tasks/Duties/Responsibilities
- Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products.
- Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance (a lightweight monitoring sketch follows this list).
- Implement best practices around systems integration, security, performance, and data management.
- Empower the business by creating value through increased adoption of the data, data science, and business intelligence landscape.
- Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions.
- Partner with the business, IT, and technical subject matter experts to ensure execution of enterprise-wide data engineering products & platform development.
- Implement DevOps, DataOps and Agile methodologies to improve KPIs like cycle times, consistency, and quality.
- Develop and optimize procedures to productionize data science models.
- Manage and support large-scale experimentation by data scientists.
- Design and prototype new approaches and build solutions at scale.
- Research and cultivate state-of-the-art data engineering methodologies.
- Establish documentation practices for learnings and knowledge transfer.
- Design and audit reusable packages or libraries.
- Understanding of cloud architecture principles and best practices.
- Experience in designing end-to-end solutions that meet business requirements and adhere to scalability, reliability, and security standards.
- Familiarity with version control systems such as Bitbucket, CI servers such as Jenkins, and DevOps practices for CI/CD pipelines.
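As referenced in the monitoring item above, the sketch below illustrates, purely as an assumption about how such a framework might start, a PySpark job that captures simple data-quality KPIs: row count and per-column null rates, persisted to a metrics table. The table names and the 5% threshold are hypothetical; a production framework would add orchestration and alerting.

    # Hypothetical data-quality KPI sketch in PySpark; table names and thresholds are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("pipeline-quality-kpis").getOrCreate()

    df = spark.table("sales.orders_curated")  # placeholder pipeline output table
    total_rows = df.count()

    # Per-column null rate as a simple operational KPI.
    null_rates = {
        c: df.filter(F.col(c).isNull()).count() / max(total_rows, 1)
        for c in df.columns
    }

    # Persist the metrics so dashboards and alerts can consume them.
    metrics = [(c, float(rate), total_rows) for c, rate in null_rates.items()]
    spark.createDataFrame(metrics, "column_name string, null_rate double, row_count long") \
        .write.mode("append").saveAsTable("ops.pipeline_quality_metrics")

    # Fail the run if a critical column exceeds the assumed 5% null threshold.
    if null_rates.get("order_id", 0.0) > 0.05:
        raise ValueError("order_id null rate exceeded threshold")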
Required Qualifications
Required Knowledge, Skills and Abilities:
- Excellent understanding of the ETL cycle.
- Experience in the architecture and design of services on Azure (Azure Data Factory, SQL, Databricks) is an added advantage.
- Azure DevOps, Azure MLOps.
- Azure data engineering certification is a must.
- Experience using Python/PySpark and/or Scala for data engineering.
- Understanding of data types and handling of different data models.
- Experience with Spark, Kafka, Flask, and Python is desirable (a minimal Flask serving sketch follows this list).
- Experience with Microsoft Azure or AWS data management tools such as Azure Data Factory, Data Lake, Synapse, and Databricks, and with deployment of web applications on Azure, is a must.
- Good scripting and programming skills.
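Because the list above names Flask and web-application deployment on Azure, here is a minimal, purely illustrative sketch of a Flask scoring service; the model artifact path, endpoint, and payload schema are assumptions, and on Azure such a service would typically be containerized and run on App Service or AKS.

    # Hypothetical minimal Flask scoring service; model path and payload schema are placeholders.
    import pickle

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # Load a pre-trained model artifact at startup (placeholder path).
    with open("model.pkl", "rb") as fh:
        model = pickle.load(fh)

    @app.route("/predict", methods=["POST"])
    def predict():
        # Expect a JSON body such as {"features": [[1.0, 2.0, 3.0]]}.
        payload = request.get_json(force=True)
        prediction = model.predict(payload["features"])
        return jsonify({"prediction": prediction.tolist()})

    if __name__ == "__main__":
        # Local development server; production would run behind a WSGI server on App Service/AKS.
        app.run(host="0.0.0.0", port=8000)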
Preferred Minimum Education: Bachelor of Engineering
Other: Azure Data Engineering Certification
Preferred Area of Study: Data Engineering, Databricks
Preferred Related Industry Experience (if applicable): Manufacturing & Logistics, IT Service
Primary Location
Asia Pacific-India-Karnataka-Bangalore
Job Function
Digital Product Development