Mid / Senior Data Engineer - Delivery Experience

Warsaw, Poland

Allegro

Allegro - The best prices and a guarantee of safe shopping!

Job Description

The salary range for this position is (contract of employment):

Mid role: 14 200 - 19 690 PLN in gross terms

Senior role: 18 400 - 25 410 PLN in gross terms

A hybrid work model requires 1 day a week in the office 

In the area of Delivery Experience, we are building technology that makes Allegro's deliveries easy, cost-effective, fast and predictable. Our team owns critical services along the Allegro shopping journey: predicting delivery times with statistical algorithms and machine learning, selecting the best delivery methods for each customer, and integrating with carrier companies. Delivery Experience is also one of the fastest-growing areas, where we take on new, complex projects to enhance logistics and warehousing processes.

We are looking for a Mid/Senior Data Engineer to focus on data processing and preparation, as well as the deployment and maintenance of our data projects. Join our team to deepen your skills in deploying data-driven processes and DataOps approaches, and to share those skills within the team.

We are looking for people who:

  • Have at least 3 years of experience as a Data Engineer working with large datasets

  • Have experience with cloud providers (GCP preferred)

  • Are highly proficient in SQL

  • Have a strong understanding of data modeling and cloud data warehouse (DWH) architecture

  • Have experience in designing and maintaining ETL/ELT processes

  • Are capable of optimizing cost and efficiency of data processing

  • Are proficient in Python for working with large datasets (using PySpark or Airflow; a brief example follows this list)

  • Use good practices (clean code, code review, CI/CD)

  • Have a high degree of autonomy and take responsibility for developed solutions 

  • Have English proficiency at B2 level or above

  • Like to share knowledge with other team members
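
As a rough illustration of the day-to-day Python work this involves, here is a minimal, hypothetical PySpark sketch; the bucket paths, column names and metrics are placeholders, not Allegro's actual schema:

from pyspark.sql import SparkSession, functions as F

# Hypothetical job: aggregate carrier transit times from a large Parquet dataset.
spark = SparkSession.builder.appName("delivery-times").getOrCreate()

deliveries = spark.read.parquet("gs://example-bucket/deliveries/")  # placeholder path

stats = (
    deliveries
    .where(F.col("delivered_at").isNotNull())
    .withColumn(
        "transit_hours",
        (F.col("delivered_at").cast("long") - F.col("shipped_at").cast("long")) / 3600,
    )
    .groupBy("carrier", "delivery_method")
    .agg(
        F.avg("transit_hours").alias("avg_transit_hours"),
        F.expr("percentile_approx(transit_hours, 0.95)").alias("p95_transit_hours"),
    )
)

stats.write.mode("overwrite").parquet("gs://example-bucket/delivery_time_stats/")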

Nice to have:

  • Experience with Azure, cross-cloud data transfers, and multi-cloud architecture

What will your responsibilities be?

  • You will be actively responsible for developing and maintaining processes for handling large volumes of data

  • You will be streamlining and developing the data architecture that powers analytical products, working alongside a team of experienced analysts

  • You will be monitoring and enhancing the quality and integrity of the data

  • You will manage and optimize costs related to our data infrastructure and data processing on GCP, as sketched below
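
One common cost-control practice on this stack is estimating a query's scan volume before running it. Here is a minimal sketch using the google-cloud-bigquery client; the project, dataset and table names are placeholders:

from google.cloud import bigquery

client = bigquery.Client()  # assumes GCP credentials are already configured

# Dry run: BigQuery reports how many bytes the query would scan, without executing it.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
query = """
    SELECT carrier, AVG(transit_hours) AS avg_transit_hours
    FROM `example-project.logistics.deliveries`  -- placeholder table
    WHERE delivery_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY carrier
"""
job = client.query(query, job_config=job_config)
print(f"Query would process {job.total_bytes_processed / 1e9:.2f} GB")

Filtering on a partitioned column (here, hypothetically, delivery_date) is what keeps the scanned bytes, and therefore the on-demand cost, down.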

What we offer

  • A hybrid work model that you will agree on with your leader and the team

  • We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)

  • A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)

  • 16" or 14" MacBook Pro with M1 processor and, 32GB RAM or a corresponding Dell with Windows (if you don’t like Macs) and other gadgets that you may need

  • Hackathons, team tourism, a training budget and an internal educational platform, MindUp (including training courses on work organization, communication, motivation, and various technologies and subject-matter topics)

  • English classes, paid for by us, related to the specific nature of your job

Why you would like to work with us:

  • Big Data is not an empty slogan for us, but a reality - you will be working on really big datasets (petabytes of data)

  • You will have a real impact on the direction of product development and technology choices. We use the latest and best available technologies, selected according to our own needs

  • Our tech stack includes GCP, BigQuery, (Py)Spark and Airflow (a minimal pipeline sketch follows this list)

  • We are a close-knit team that works well together

  • You will have the opportunity to work within a team of experienced engineers and big data specialists who are eager to share their knowledge, including publicly through allegro.tech
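
To make the stack concrete, here is a minimal, hypothetical Airflow DAG sketch (TaskFlow API, assuming Airflow 2.4+); the task bodies are placeholders rather than real Delivery Experience jobs:

from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def delivery_stats_pipeline():
    """Hypothetical daily pipeline recomputing delivery-time statistics."""

    @task
    def extract() -> str:
        # In practice this might launch a (Py)Spark job or land raw data in GCS.
        return "gs://example-bucket/deliveries/"

    @task
    def transform(path: str) -> str:
        # Placeholder for the aggregation step (e.g. the PySpark job sketched earlier).
        return "gs://example-bucket/delivery_time_stats/"

    @task
    def publish(path: str) -> None:
        # Placeholder for loading results into BigQuery.
        print(f"Stats published to {path}")

    publish(transform(extract()))


delivery_stats_pipeline()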

Apply to Allegro and see why it is #dobrzetubyć (#goodtobehere)
