Senior Data Engineer

Asia Pacific-India-Karnataka-Bangalore

Kenvue

Everyday care is a powerful catalyst in making you feel better, inside and out. Learn about the iconic brands, products, people, and history that make up Kenvue.


Senior Data Engineer-2407027395W

Description

 

Who We Are

At Kenvue, we realize the extraordinary power of everyday care. Built on over a century of heritage and rooted in science, we’re the house of iconic brands - including NEUTROGENA®, AVEENO®, TYLENOL®, LISTERINE®, JOHNSON’S® and BAND-AID® - that you already know and love. Science is our passion; care is our talent. Our global team is made up of 22,000 diverse and brilliant people who are passionate about insights and innovation and committed to delivering the best products to our customers. With expertise and empathy, being a Kenvuer means having the power to impact the lives of millions of people every day. We put people first, care fiercely, earn trust with science and solve with courage – and have brilliant opportunities waiting for you! Join us in shaping our future – and yours.

 

What You Will Do 

The Senior Data Engineer is responsible for Software Development Engineering and performs work in the following areas:

·        Data Engineering & Data Modelling: Forge trusted partnerships and be an integral part of the Data Engineering, Data Architecture & Platforms, and Data Science teams to adopt, scale, and build data products

·        Data Warehouse Development & Administration: Design, develop, implement, and maintain a data warehouse that integrates data from various sources/systems within the organization

·        Develop strategies for data acquisition, archive recovery, and database implementation

·        Manage data migrations/conversions and troubleshoot data processing issues

·        Focus on execution & delivery of highly reliable, high-quality data pipelines aimed at maximizing business value through data products

·        Build and improve next-generation data & analytics capabilities within a DevSecOps framework

·        Work closely with Business Analytics leaders to understand the needs of the business, clearly articulating the story of value created through data & technology

·        Drive prioritization and implementation of the most appropriate combination of data engineering methodologies & frameworks to ensure optimal scalability, flexibility, and performance of platforms, products & solutions

·        Conduct requirements gathering and analysis to understand the domain of the software problem and/or functionality, the interfaces between hardware and software, and the overall software characteristics

·        Use programming, scripting, and/or database languages to write software code

·        Support software testing, deployment, maintenance, and evolution activities by correcting programming errors, responding to scope changes, and coding software enhancements

Key Responsibilities

Main accountabilities include:

·        Designing, building, and maintaining data products and pipelines

·        Developing and deploying data models

·        Monitoring and troubleshooting data systems

·        Working with product owners, data scientists, and other stakeholders to understand business needs and build data solutions that meet those needs

·        Keeping up with the latest trends in data engineering

·        Providing training on data engineering concepts and techniques to other members of the team

·        Writing documentation for data pipelines and data models

·        Participating in data governance initiatives

What We Are Looking For

Required Qualifications

·        Typically requires a minimum of 5 years of related experience with a Bachelor's degree in Computer or Software Engineering, or a minimum of 3 years of related experience with a Master's degree in Computer or Software Engineering

·        Demonstrated strength in examining issues, driving resolution & influencing effectively across the organization. Ability to challenge the status quo in technology & architecture.

·        Superb interpersonal & communication skills (oral and written), including the ability to explain digital concepts and technologies to business leaders, as well as business concepts to technologists.

·        3-5 years of progressive experience developing full-stack data frameworks: ETL/ELT, data analysis, compute & storage, data pipelines, orchestration

·        Minimum of 3 years of hands-on experience with cloud architecture (Azure, GCP, AWS), cloud-based databases (Synapse, Databricks, Snowflake, Redshift), and various data integration techniques (API, stream, file) using DBT, SQL/PySpark, and Python.

·        3+ years of experience implementing data pipelines that enable data products through data mesh/fabric concepts

·        Ability to define data pipelines that address data challenges: differing granularity, data gaps, sophisticated matching logic, multi-language content, structured & unstructured data, and harmonization & normalization of data.

·        Proven track record leading multiple high-profile projects with demanding deadlines, changing requirements, and working with defined resources. Ability to estimate required effort and prioritize work items appropriately.

·        Thrives in a diverse company culture that celebrates the uniqueness of our employees and is committed to inclusion. We are proud to be an equal opportunity employer.

Qualifications

 

Desired Qualifications

·        Conversant with Data & Analytics product management, Azure, SQL, Data Catalog

·        Experience with unstructured data processing and real-time data pipelines

·        Preferably from the Retail or CPG industry

 

Primary Location

 Asia Pacific-India-Karnataka-Bangalore

Job Function

 Architecture