Senior Data Engineer

Kraków, Lesser Poland Voivodeship, Poland - Remote

Softeta

From top-tier IT consultancy to cutting-edge custom software development, we provide the expertise to fuel your growth.



We’re looking for an experienced data engineer with strong skills in overhauling and maintaining large-scale data infrastructure.

You will build the large datasets that empower our data scientists, analysts and less technical stakeholders. You will create new data pipelines and tables and overhaul existing ones so that they are stable, accurate and well monitored. You will challenge ‘table creep’ and push data users in the business towards a single source of truth, and you will deliver adjusted or new datasets that are understandable and ergonomic for data scientists and analysts.

You will help us make our data infrastructure elegant, robust, stable and well-maintained. You will spar with other data engineers on the best path for this - your opinion and experience matter.

Responsibilities:

  • Create and maintain pipeline architectures in Airflow and dbt (see the sketch after this list);
  • Assemble large and/or complex datasets that meet business requirements;
  • Improve our own processes and infrastructure for scale, delivery and automation;
  • Maintain and improve our data warehouse structure so that it is fit for purpose;
  • Adjust methods, queries and techniques to suit our very large data environment;
  • Adopt best-practice coding and review processes;
  • Communicate technical details and edge cases in the data to specialist and non-specialist stakeholders;
  • Notice, investigate, resolve and communicate about anomalies in the data;
  • Develop and maintain brief, relevant documentation for data products.
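For context on the first responsibility above, here is a minimal sketch of the kind of Airflow + dbt orchestration it describes. The DAG id, schedule and dbt project path are illustrative assumptions, not details from this posting.

    # Minimal sketch: an Airflow DAG that runs and tests a dbt project.
    # DAG id, schedule and project path are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_build",        # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Build the dbt models, then run the dbt tests against them.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/analytics",  # assumed path
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/analytics",
        )
        dbt_run >> dbt_test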

Requirements

  • Advanced degree in computer science, mathematics, data science or a related field;
  • At least 5 years of proven work experience as a data engineer;
  • Proficiency with Airflow, dbt, Dataflow or similar tools;
  • Strong knowledge of data structures and data modeling;
  • CI/CD pipeline and MLOps experience is very advantageous;
  • Experience with very large data sets is advantageous;
  • Experience with cloud data platforms is essential; specific experience with GCP/BigQuery is advantageous;
  • Good communication and presentation skills;
  • Strong collaboration skills;
  • Ability to work independently and as part of a team.

Benefits

  • Diverse and technically challenging projects;
  • Flexible working hours and a hybrid or remote workplace model;
  • Agile/Scrum working environment;
  • Technical equipment of your choice.

