Senior Data Engineer - Advertising

Warsaw, Poland

Allegro

Allegro - The best prices and a guarantee of safe shopping!

Job Description

About the Team

Allegro is among the largest and most advanced technology companies in Poland. The Allegro Advertising business has been growing approximately twice as fast as the Allegro platform and about three times faster than the advertising market in Poland over the past few years. At Allegro, we are shaping the future of e-commerce by offering not only cutting-edge technologies but also an exceptional environment for professional growth - spanning both technical and soft skills.

The Brand-First Analytics team, operating within Advertising Analytics, sits at the intersection of technology and business. It supports strategic advertising initiatives by providing data-driven insights, analyses, tools, and unique analytical solutions for brands advertising on Allegro.

What will your work involve?

In this role, you will be responsible for developing and optimizing large-scale data processing workflows that support strategic Brand-First Advertising initiatives at Allegro. Your tasks will include:

  • Building, managing, and improving data processing pipelines - You will independently design, operate, and optimize workflows for handling large volumes of data, ensuring that analytical solutions stay efficient and scalable (see the illustrative sketch after this list).

  • Close collaboration with Data Analysts (Mid/Senior/Expert) and Data Engineers - You will work daily with experienced professionals, co-developing advanced analytical products for brands advertising on Allegro and supporting the internal needs of the Advertising Analytics team.

  • Developing data architecture - You will design, expand, and refine the data architecture that powers analytical products.

  • Monitoring data quality and consistency - You will be responsible for maintaining data integrity, reliability, and effectiveness by continuously monitoring and improving data quality and consistency.

  • Optimizing data processing costs in Google Cloud Platform - You will help manage cloud resources efficiently and optimize costs associated with data processing workflows.

  • Automating and scaling analytical processes - You will enhance operational efficiency by developing solutions that optimize workflows for both our team and key stakeholders.

  • Supporting the growth of Advertising business unit initiatives - You will work closely with business teams including Sales, Strategy, Special Projects, and Brand & Display Business Project Management.

  • Collaborating with cross-functional teams - Your analytical solutions will play a key role in shaping advertising strategies for brands and improving data analysis workflows.

  • Senior-level mentoring and support for Data Engineers in Advertising - You will drive best practices, ensure strategic alignment of solutions, and facilitate knowledge sharing within the team, supporting competency development and enhancing your leadership skills.

  • Engaging in company-wide analytics scaling initiatives - You will have the opportunity to make an impact and contribute to the growth and optimization of analytics processes across the organization.
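
Purely as an illustration of what orchestration on this stack can look like - the project, cluster, and bucket names below are hypothetical, and the sketch assumes Airflow 2.x with the Google provider installed rather than Allegro's actual setup - a daily DAG submitting a PySpark job to Dataproc might be shaped like this:

    # Illustrative sketch only: all project, cluster, and bucket names are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

    PROJECT_ID = "example-ads-analytics"   # hypothetical GCP project
    REGION = "europe-central2"
    CLUSTER_NAME = "ads-batch-cluster"     # hypothetical Dataproc cluster

    PYSPARK_JOB = {
        "reference": {"project_id": PROJECT_ID},
        "placement": {"cluster_name": CLUSTER_NAME},
        "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/brand_metrics.py"},
    }

    with DAG(
        dag_id="brand_advertising_daily_metrics",
        schedule="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
        tags=["advertising", "example"],
    ) as dag:
        # Submit the daily PySpark aggregation job to Dataproc.
        compute_metrics = DataprocSubmitJobOperator(
            task_id="compute_brand_metrics",
            project_id=PROJECT_ID,
            region=REGION,
            job=PYSPARK_JOB,
        )

A production DAG would of course add retries, sensors, and quality gates; the sketch only shows the shape of the orchestration layer.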

Why should you work with us?

  • Big Data is not an empty buzzword here; it’s our reality - several petabytes of data. You will work on truly large datasets.

  • We use modern, proven technologies such as PySpark and Airflow, running on Google Cloud Platform (see the sketch after this list).

  • You will have a real impact on the development of data processing workflows, with considerable independence in choosing solutions and their architecture.

  • You will have the opportunity to collaborate with Data Engineers from within and outside the team who are eager to share their knowledge.

  • Your solutions will be essential for the development of data processing within Brand Advertising at Allegro.

  • You will design solutions that enable the optimization and scaling of data processing, driving operational efficiency.

  • You will enjoy significant freedom in selecting solutions and designing the architecture of data processing pipelines.

  • We are a supportive team that values a positive work environment and strong collaboration, fostering a spirit of teamwork and unity.
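
To make the PySpark-on-GCP stack mentioned above concrete, here is a minimal, hypothetical aggregation job of the kind such pipelines typically run - the bucket paths and column names are illustrative assumptions, not real Allegro data:

    # Illustrative sketch only: paths and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("brand_metrics_example").getOrCreate()

    # Read one day of (hypothetical) ad impression events from Parquet on GCS.
    impressions = spark.read.parquet("gs://example-bucket/impressions/dt=2024-01-01/")

    # Aggregate to daily, per-brand metrics.
    daily_brand_metrics = (
        impressions
        .groupBy("brand_id")
        .agg(
            F.count("*").alias("impressions"),
            F.sum("click").alias("clicks"),
            F.sum("cost").alias("spend"),
        )
        .withColumn("ctr", F.col("clicks") / F.col("impressions"))
    )

    # Write the results back for downstream analytical products.
    daily_brand_metrics.write.mode("overwrite").parquet(
        "gs://example-bucket/metrics/daily_brand/dt=2024-01-01/"
    )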

What we offer:

  • A hybrid work model. Well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)

  • A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)

  • Company-paid English classes tailored to the specific nature of your job

  • Work in a team you can always count on – we have top-class specialists and experts in their fields on board

  • Training budget and an internal educational platform, MindUp (with courses on work organization, communication, motivation, and a range of technologies and domain topics)

  • If you want to learn more, check it out for yourself

We are looking for people who:

  • Have at least 5 years of experience as a Data Engineer processing and analyzing large datasets (SQL, Python, PySpark, Airflow), with a strong focus on designing and optimizing ETL processes.

  • Have strong SQL skills and can optimize queries for both performance and cost efficiency (see the sketch after this list).

  • Are proficient in Python, effectively process data using PySpark and Airflow, and follow best practices in data engineering.

  • Have experience working in cloud environments (GCP, AWS, or Azure) and understand database algorithms and structures.

  • Can independently manage their scope of work, propose improvements, and successfully plan and execute projects in line with the roadmap.

  • Take full ownership of meeting deadlines and achieving business objectives, while effectively planning and estimating the time required to complete tasks and managing priorities.

  • Collaborate effectively with various stakeholders and analysts, ensuring impactful solutions while fostering team growth.

  • Seek opportunities to optimize processes and propose innovative solutions that increase operational efficiency.
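
As a small illustration of the performance-and-cost point above - the dataset, table, and column names are hypothetical, while the technique itself is standard BigQuery usage - a dry run can estimate how many bytes a query would scan before it is executed:

    # Illustrative sketch only: dataset and table names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()

    # Filtering on the partition column keeps the scanned volume (and cost) down.
    query = """
        SELECT brand_id, SUM(cost) AS spend
        FROM `example-project.ads.impressions`
        WHERE event_date = '2024-01-01'  -- partition filter limits bytes scanned
        GROUP BY brand_id
    """

    # A dry run estimates bytes processed without executing or billing the query.
    job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    dry_run = client.query(query, job_config=job_config)
    print(f"Estimated bytes scanned: {dry_run.total_bytes_processed:,}")

    # Run it for real only if the estimate is acceptable (e.g. under ~1 TB).
    if dry_run.total_bytes_processed < 10**12:
        for row in client.query(query).result():
            print(row.brand_id, row.spend)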

An additional advantage will be:

  • Domain knowledge in advertising and marketing (especially Retail Media & E-Commerce) and experience working on the brand side in industries such as FMCG, Retail, Finance & Banking, or Telecom.

  • Familiarity with basic ML concepts and the ability to prepare data pipelines for machine learning models.

  • Experience with dbt, data quality systems, and Docker (a minimal quality-check sketch follows this list).
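
As a minimal sketch of the data quality angle - a hand-rolled check shown for illustration only, not dbt or any specific data quality system used at Allegro, with hypothetical paths and columns - a pipeline step might validate a table before publishing it:

    # Illustrative sketch only: a hand-rolled quality gate with hypothetical paths and columns.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dq_check_example").getOrCreate()

    metrics = spark.read.parquet("gs://example-bucket/metrics/daily_brand/dt=2024-01-01/")

    # Basic checks: no null brand ids, no duplicate brands, no negative spend.
    null_brands = metrics.filter(F.col("brand_id").isNull()).count()
    duplicate_brands = metrics.count() - metrics.select("brand_id").distinct().count()
    negative_spend = metrics.filter(F.col("spend") < 0).count()

    errors = []
    if null_brands:
        errors.append(f"{null_brands} rows with NULL brand_id")
    if duplicate_brands:
        errors.append(f"{duplicate_brands} duplicated brand_id rows")
    if negative_spend:
        errors.append(f"{negative_spend} rows with negative spend")

    # Fail the pipeline step (and alert) rather than publish inconsistent data.
    if errors:
        raise ValueError("Data quality checks failed: " + "; ".join(errors))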

Apply to Allegro and see why it is #dobrzetubyć (#goodtobehere)

 
