Data Engineer
Latin America-Brazil-São Paulo-São José dos Campos
Kenvue
Everyday care is a powerful catalyst in making you feel better, inside and out. Learn about the iconic brands, products, people, and history that make up Kenvue.
Description
Kenvue is currently recruiting for:
Data Engineer
This position reports to a Technical Data Engineer Leader and is based in São José dos Campos/SP or São Paulo/SP, Brazil.
Who We Are
At Kenvue, we realize the extraordinary power of everyday care. Built on over a century of heritage and rooted in science, we’re the house of iconic brands - including NEUTROGENA®, AVEENO®, TYLENOL®, LISTERINE®, JOHNSON’S® and BAND-AID® - that you already know and love. Science is our passion; care is our talent. Our global team is made up of 22,000 diverse and brilliant people who are passionate about insights and innovation and committed to delivering the best products to our customers. With expertise and empathy, being a Kenvuer means having the power to impact the lives of millions of people every day. We put people first, care fiercely, earn trust with science and solve with courage – and have brilliant opportunities waiting for you! Join us in shaping our future – and yours.
What You Will Do
We are looking for a talented Mid-Level Data Engineer to join our innovative team. The ideal candidate will have a solid understanding of data modeling, good proficiency in Python, experience with Snowflake, SQL, and Databricks, and a drive to build and maintain scalable data pipelines.
Key Responsibilities
· Assist in designing, implementing, and maintaining data pipelines and ETL processes to meet business needs.
· Contribute to the development and optimization of data models for both operational and analytical purposes.
· Collaborate with cross-functional teams to gather and understand data requirements.
· Use Python for data manipulation and pipeline development under guidance from senior engineers.
· Work with Snowflake for data storage and retrieval, and assist in query optimization.
· Write SQL queries for data extraction, transformation, and loading.
· Participate in the implementation and optimization of data workflows using Databricks.
· Monitor data pipeline performance and assist in troubleshooting issues.
· Ensure data quality and consistency across various sources.
Qualifications
What We Are Looking For
Required Qualifications
· Bachelor’s Degree in Computer Science, Information Technology, or a related field.
· 2–4 years of experience in data engineering or related roles.
· Solid understanding of data modeling concepts and best practices.
· Proficiency in Python programming for data engineering tasks.
· Familiarity with Snowflake as a cloud data warehousing solution.
· Strong SQL skills for data querying and manipulation.
· Basic experience with Databricks for data processing.
Desired Qualifications
· Good problem-solving and analytical thinking abilities.
· Effective communication skills to collaborate with team members and stakeholders.
· Basic knowledge of data pipeline orchestration tools (e.g., Airflow).
· Understanding of cloud platforms (AWS, Azure, or GCP) is a plus.
What’s In It For You
· Competitive Benefit Package
· Paid Company Holidays, Paid Vacation, Volunteer Time, Summer Fridays & More!
· Learning & Development Opportunities
· Employee Resource Groups
· This list could vary based on location/region
“The hiring company may identify potential employee moves based on succession and/or development planning. All candidates need to apply through the formal bidding process”
Kenvue is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, age, national origin, or protected veteran status, and will not be discriminated against on the basis of disability.
Primary Location
Latin America-Brazil-São Paulo-São José dos Campos
Job Function
Engineering (IT)