Data Engineer I
Bellevue, WA
Our Opportunity:
Chewy is looking for a Data Engineer I to join our growing Transportation Systems team. You will work side by side with hard-working, passionate, and motivated engineers, and you will gain hands-on data engineering experience across projects spanning data pipelines, ETL processes, and data warehouse management. The ideal candidate has a strong interest in building and maintaining cloud databases, ingesting data through a variety of methods (including API-based approaches such as SOAP and REST), and joining datasets from different cloud-based source systems in a centralized database. If you are equally passionate about supply chain, transportation, management information systems, e-commerce, and career growth, an opportunity at Chewy may be a match!
What You'll Do:
- Develop and maintain data pipelines to extract, transform, and load (ETL) data from various sources into our data lake (see the sketch after this list)
- Configure custom data pipelines within Snowflake/AWS/Databricks for data ingestion
- Maintain real-time alerting and debugging tools
- Design and implement solutions on a cloud platform using Infrastructure as Code (Terraform)
- Maintain, support, and develop within the Supply Chain - Transportation Data Mart Snowflake instance, including code build/review, auditing, performance tuning, and security
- Create and maintain technical user documentation and models for the Data Mart
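To give a flavor of the pipeline work above, here is a minimal Python sketch of an extract-transform-load step over S3. It is an illustration only: the bucket names, object keys, and shipment columns are hypothetical, and a real pipeline would finish by loading the curated file into the warehouse (for example with a Snowflake COPY or Snowpipe ingest step).

```python
"""Minimal ETL sketch: pull a CSV from S3, apply a light transform, and write a
curated copy back to S3 for a downstream warehouse load. Names are hypothetical."""
import csv
import io

import boto3  # AWS SDK for Python


def extract(bucket: str, key: str) -> list[dict]:
    """Read a CSV object from S3 into a list of row dicts."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(body)))


def transform(rows: list[dict]) -> list[dict]:
    """Example transform: keep delivered shipments and normalize the carrier code."""
    return [
        {**row, "carrier": row["carrier"].strip().upper()}
        for row in rows
        if row.get("status") == "DELIVERED"
    ]


def load(rows: list[dict], bucket: str, key: str) -> None:
    """Write the transformed rows back to S3 as CSV, ready for a warehouse ingest step."""
    if not rows:
        return
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=out.getvalue().encode("utf-8"))


if __name__ == "__main__":
    raw = extract("example-transport-raw", "shipments/2024-01-01.csv")
    load(transform(raw), "example-transport-curated", "shipments/2024-01-01.csv")
```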
What You'll Need:
- Bachelor of Science in Computer Science, Engineering, Information Systems, or a related field, or equivalent experience.
- Excellent verbal and written communication skills and the ability to explain details of complex concepts to non-expert partners in a simple and understandable way.
- Strong knowledge of SQL and relational databases
- Python programming skills for data processing and pipeline development
- Experience with ETL (Extract, Transform, Load) processes
- Basic knowledge of data warehousing concepts
- Familiarity with cloud platforms, especially AWS services such as S3 and Lambda, and orchestration tools like Apache Airflow (a minimal DAG sketch follows this list)
- Version control experience (Git)
- Current permanent U.S. work authorization is required
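For context on the orchestration side of the role, a minimal Apache Airflow DAG might look like the sketch below. This assumes Airflow 2.4 or later; the DAG id, schedule, and task callables are placeholders, not an actual Chewy pipeline.

```python
"""Minimal Airflow DAG sketch chaining extract -> transform -> load tasks."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw files from the source system")


def transform():
    print("clean and join the raw data")


def load():
    print("load curated data into the warehouse")


with DAG(
    dag_id="example_transportation_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```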
Bonus:
- Experience translating ambiguous customer requirements into clear problem definitions and delivering solutions against them.
- Experience designing and executing analytical projects.
- Experience with Python data libraries (pandas, numpy); a short example follows this list
- Some exposure to big data technologies (Hadoop, Spark)
- Understanding of data modeling
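As a small illustration of the pandas work mentioned above, joining and aggregating two datasets looks like the snippet below; the column names and values are made up for the example.

```python
"""Tiny pandas sketch: join shipment and carrier datasets on a shared key."""
import pandas as pd

shipments = pd.DataFrame(
    {"shipment_id": [1, 2, 3], "carrier_id": ["A", "B", "A"], "weight_lb": [12.0, 3.5, 20.1]}
)
carriers = pd.DataFrame(
    {"carrier_id": ["A", "B"], "carrier_name": ["FastShip", "EconoFreight"]}
)

# Left join keeps every shipment even if the carrier is missing from the lookup table.
joined = shipments.merge(carriers, on="carrier_id", how="left")

# Simple aggregate: total weight per carrier.
print(joined.groupby("carrier_name")["weight_lb"].sum())
```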
Chewy is committed to equal opportunity. We value and embrace diversity and inclusion of all Team Members. If you have a disability under the Americans with Disabilities Act or similar law, and you need an accommodation during the application process or to perform these job requirements, or if you need a religious accommodation, please contact CAAR@chewy.com.
If you have a question regarding your application, please contact HR@chewy.com.
Perks/benefits: 401(k) matching, career development, equity / stock options, flex hours, flex vacation, health care, insurance, medical leave, parental leave, salary bonus, unlimited paid time off