Data Engineer III - Rx Systems
Louisville, KY
Our Opportunity:
Chewy’s Pharmacy Operations Team is looking for a Data Engineer III to join the pack! In this role, you will apply your extensive data engineering expertise to build and maintain high-quality data pipelines that enable insights, data visualizations, and models that drive operational and business improvements. To deliver this, you will develop and implement advanced data solutions and technologies that scale to meet the needs of our evolving business. Come join a team dedicated to redefining pharmacy operations, where your work will directly influence strategic decisions and customer experiences!
What You'll Do:
- Implement the strategy, design, execution, and configuration of our evolving data stack, including customer and pet data, order transactions, operational flow data, prescription records, and other complex data sets.
- Lead the evaluation, implementation, and deployment of emerging tools and technologies to improve our productivity as a team.
- Build and monitor data pipelines for accuracy, missing data, changes, and volume to ensure all data is captured and processed accurately and on time.
- Work with cross-functional stakeholders to define and document requirements for building high-quality, impactful data products.
- Reconcile data issues and alerts between systems, identifying opportunities to innovate and continuously improve data quality.
- Develop and deliver communication and education plans on data engineering capabilities, standards, and processes.
- Code, test, and document new or modified data systems to create robust and scalable applications for reporting and data analytics.
- Own and document data pipelines, monitoring, data accuracy, and data lineage.
What You'll Need:
- Bachelor’s degree in MIS, Computer Science, Computer Engineering, or a related field.
- 3+ years of proven experience in data warehousing, modeling, and ETL pipeline development along with proficient understanding of dimensional and relational database architecture.
- 2+ years of professional Python scripting to automate data workflows, monitor systems, and continuously optimize processes.
- Advanced technical experience using SQL in a cloud environment (Snowflake preferred)
- Proficiency in building and optimizing ETL pipelines using AWS Glue, Control-M, PySpark, dbt, and other applications.
- Experience with setting up end-to-end data pipelines for new and/or changing businesses in an enterprise environment.
- Hands-on experience with cloud computing platforms such as AWS, Google Cloud, etc.
- Experience developing solutions for cloud computing services and infrastructure with AWS (S3, Athena, Glue, Lambda).
- Familiarity with Tableau, Looker, or a similar visualization/business intelligence platform.
- Proven ability to work collaboratively with data scientists, analysts, and business stakeholders to gather requirements and deliver impactful data solutions.
- Ability to effectively operate both independently and as part of a team.
- Self-motivated with strong problem-solving and self-learning skills.
- This position may require travel.
Bonus:
- Master's degree in Computer Science, Data Science, or a related field.
- Experience with dbt for transformation and testing in the ELT process.
- Proficiency with Apache Airflow or other DAG-based orchestration frameworks.
- Expertise in crafting and implementing data pipelines using multiple modern data engineering approaches and tools: Spark, PySpark, Java, Docker, cloud-native data warehouses (Snowflake, Redshift), Kafka/Confluent, etc.
- Experience with CI/CD processes and platforms.
- Experience with Oracle OSvC data structures.
- Experience crafting APIs or data services to expose data to downstream applications.
Chewy is committed to equal opportunity. We value and embrace diversity and inclusion of all Team Members. If you have a disability under the Americans with Disabilities Act or similar law, and you need an accommodation during the application process or to perform these job requirements, or if you need a religious accommodation, please contact CAAR@chewy.com.
If you have a question regarding your application, please contact HR@chewy.com.
To access Chewy's Customer Privacy Policy, please click here. To access Chewy's California CPRA Job Applicant Privacy Policy, please click here.