Data Engineer (Romania)

Remote job

Applications have closed

Xebia

Leading global technology consultancy providing strategy, software engineering, advanced training, and managed services to help businesses thrive in the AI-enabled digital era.


Hello, let’s meet!

We are Xebia - a place where experts grow. For nearly two decades, we've been developing digital solutions for clients across many industries and locations around the globe. Among the brands we've worked with are UPS, McLaren, Aviva, Deloitte, and many more.

We're passionate about Cloud-based solutions – so much so that we partner with the three largest Cloud providers in the business: Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP). We even became the first AWS Premier Consulting Partner in Poland.

We were formerly known as PGS Software. In 2021, we joined Xebia Group – a family of interlinked companies driven by the desire to make a difference in the world of technology.

Xebia stands for innovation, talented team members, and technological excellence. Xebia means worldwide recognition and thought leadership, which regularly gives us the opportunity to work on global, innovative projects.

Our mission can be captured in one word: Authority. We want to be recognized as the authority in our field of expertise.

What makes us stand out? It's the little details: our attitude, our dedication to knowledge, and our belief in people's potential, with an emphasis on every team member's development. Naturally, these things are hard to present on paper – so make sure to visit us and see them with your own eyes!

Now, we've talked a lot about ourselves – but we'd love to hear more about you.

Send us your resume to start the conversation and join #Xebia.

You will be:

  • Working with teams at a globally recognized American apparel brand, a symbol of rugged individuality and casual style,
  • Responsible for at-scale infrastructure design, build, and deployment with a focus on distributed systems,
  • Building and maintaining architecture patterns for data processing, workflow definitions, and system-to-system integrations using Big Data and Cloud technologies (see the sketch after this list),
  • Evaluating and translating technical designs into workable technical solutions/code and technical specifications on par with industry standards,
  • Driving the creation of reusable artifacts,
  • Establishing scalable, efficient, automated processes for data analysis, data model development, validation, and implementation,
  • Working closely with analysts/data scientists to understand impact to the downstream data models,
  • Writing efficient and well-organized software to ship products in an iterative, continual release environment,
  • Contributing to and promoting good software engineering practices across the team,
  • Communicating clearly and effectively to technical and non-technical audiences,
  • Defining data retention policies,
  • Monitoring performance and advising on any necessary infrastructure changes,
  • Responsible for dashboard development (Tableau, Power BI, Qlik, etc.),
  • Responsible for data analytics model development (R, Python, Spark).
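
To give a flavor of the data processing work described above, here is a minimal, illustrative PySpark sketch. All paths, table names, and columns are hypothetical and merely stand in for the client's actual data model:

```python
# Illustrative only: a minimal PySpark batch transform of the kind this
# role involves. All paths and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_rollup").getOrCreate()

# Ingest raw order events (hypothetical source location)
orders = spark.read.parquet("s3://raw-zone/orders/")

# Aggregate into a daily model ready for downstream analysts
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("net_amount").alias("net_revenue"),
    )
)

# Publish for reporting and dashboard consumption (hypothetical target)
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-zone/orders_daily/"
)
```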

Requirements

  • 5+ years’ experience as a software developer/data engineer,
  • Experience with Big Data technologies and the AI/ML lifecycle,
  • University or advanced degree in engineering, computer science, mathematics, or a related field,
  • Strong hands-on experience with Databricks using PySpark and Spark SQL (Unity Catalog, workflows, optimization techniques),
  • Experience with at least one cloud provider solution: Azure, AWS, or GCP (preferred),
  • Strong experience working with relational SQL databases,
  • Strong experience with an object-oriented/functional scripting language: Python,
  • Working knowledge of data transformation tools (dbt preferred),
  • Ability to work on the Linux platform,
  • Strong knowledge of data pipeline and workflow management tools (Airflow preferred; see the sketch below),
  • Working knowledge of GitHub and the Git toolkit,
  • Expertise in standard software engineering methodology, e.g. unit testing, code reviews, design documentation,
  • Experience creating data pipelines that appropriately prepare data for ingestion and consumption,
  • Experience maintaining and optimizing databases/filesystems for production use in reporting and analytics,
  • Ability to work in a collaborative environment and to interact effectively with both technical and non-technical team members,
  • Good verbal and written communication skills (English),
  • Experience with e-commerce, retail, or supply chains is welcome,
  • Cooperation with US West Coast-based teams is part of the job – expect overlap with 9 am PDT (18:00 CET) up to twice a week.
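
Since Airflow is the preferred orchestrator, here is a minimal, hypothetical sketch of the kind of DAG this role would own. The DAG name, tasks, and schedule are made up, and the `schedule` argument assumes Airflow 2.4+:

```python
# Illustrative only: a minimal three-step ETL DAG. Everything here is
# hypothetical - real pipelines would call Databricks, dbt, etc.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("extracting raw data")    # placeholder: pull from a source system


def transform():
    print("transforming data")      # placeholder: clean and model the data


def load():
    print("loading curated data")   # placeholder: publish for reporting


with DAG(
    dag_id="daily_sales_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3                  # linear dependency chain
```

In a real pipeline the Python callables would typically be replaced by Databricks or dbt operators, but the dependency structure would stay the same.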

Work from the European Union region and a work permit are required.

Recruitment Process

CV Review – HR Call – Interview – Client Interview – Decision
