Data Engineer

GCC, India

Advance Auto Parts

Advance Auto Parts is your source for quality auto parts, advice and accessories. View car care tips, shop online for home delivery, or pick up in one of our 4000 convenient store locations in 30 minutes or less.


Job Description

WHO WE ARE

Come join our Technology Team and start reimagining the future of the automotive aftermarket. We are a highly motivated tech-focused organization, excited to be amid dynamic innovation and transformational change. Driven by Advance’s top-down commitment to empowering our team members, we are focused on delighting our Customers with Care and Speed, through delivery of world class technology solutions and products.

We value and cultivate our culture by seeking to always be collaborative, intellectually curious, fun, open, and diverse. You will be a key member of a growing and passionate group focused on collaborating across business and technology resources to drive forward key programs and projects building enterprise capabilities across Advance Auto Parts.

Job Summary:

We seek a highly skilled Data Engineer to shape and lead our data collection, modeling, and analytics infrastructure. With expertise in building cloud-native, event-driven data solutions, you will design a robust data ecosystem that empowers business insights and decision-making.

Responsibilities & Duties:

  • Bring a passion for the data engineering field.

  • Be an advocate for excellence, delivering measurable success for Advance Auto Parts stakeholders through secure, scalable, highly available cloud architecture that leverages AWS services.

  • Provide technical guidance on the AWS cloud tech stack, Snowflake, Databricks, and ETL and orchestration tools such as Talend.

  • Extensive hands-on expertise in PySpark, Snowflake, Airflow, and RDS (PostgreSQL).

  • Capable of conceptualizing, prototyping, and presenting appropriate solutions to leadership for effective decision-making.

  • Ability to assess the current environment and requirements and propose data engineering solutions.

  • Ability to build frameworks and pipelines using PySpark and Palantir Foundry.

  • Skilled in crafting comprehensive solutions across AWS or any other cloud platforms, with demonstrated hands-on experience in design and implementation.

  • Proficiency in delivering high-quality data engineering services is a must for this role.

  • Lead the entire lifecycle of project delivery, from development to production, and provide post-production warranty support.

  • Able to work with the platform team on problem-solving and troubleshooting activities.

  • Propose standards and best practices.

  • Apply creative thinking to arrive at technical solutions that advance business goals and align with corporate technology strategies.

  • Help application teams onboard their feeds into the established frameworks.

  • Follow best design practices when implementing ETL frameworks.

  • Demonstrate expertise in developing pipelines within Foundry, Databricks, and Airflow frameworks, with meticulous attention to detail in establishing efficient monitoring, logging, and auditing capabilities.

  • Capable of cultivating effective collaboration with technical teams, delivery managers, data engineers, scientists, and analysts.

Qualification:

  • Bachelor’s or master’s degree in computer science, or equivalent experience.

  • Experience with cloud solutions; retail industry experience preferred.

  • Experience implementing data transformations and data structures for data warehouses and data lakes/repositories.

  • Experience creating ETL frameworks for processing and extracting data from databases using Palantir Foundry, Snowflake, and AWS services.

  • Certifications in AWS, Azure, Snowflake, or Databricks are an added advantage for this position.

  • Proficiency in Git and DevOps pipelines is a valuable asset for this position.

  • Proficient in Python scripting for automation, shell scripting for system tasks, and containerization technologies.

  • Proven ability to be a strategic thinker to drive the necessary ownership and data governance within the organization.

  • Ability to evaluate risks and provide business recommendations and solutions in a timely manner.

  • Able to review ETL frameworks and propose optimizations and cost savings.

  • In addition to technical proficiency, the candidate should possess the ability to mentor junior team members, providing guidance and support to foster their professional growth and development within the team.

  • Understanding of and experience working in an Agile environment.

  • Able to work with cross-functional team members, both onshore and offshore, on solutions and development tasks.

  • Independent. Strong critical thinking, decision making, troubleshooting and problem-solving skills.

  • Go-getter. Possesses strong planning, execution, and multitasking skills, with a demonstrated ability to reprioritize as needed. Must be able to manage quickly changing priorities while meeting deadlines.

California Residents click below for Privacy Notice:

https://jobs.advanceautoparts.com/us/en/disclosures