Data Engineer

Warsaw, Mazowieckie, Poland

capital.com

The investment app for smart investors. Financial instruments, online CFD trading, financial courses, and investment opportunities – all literally in one hand.


Apply now Apply later

We are a leading trading platform that is ambitiously expanding to the four corners of the globe. Our top-rated products have won prestigious industry awards for their cutting-edge technology and seamless client experience. We deliver only the best, so we are always in search of the best people to join our ever-growing talent team. 
As a Data Engineer, you will be responsible for designing, implementing, and optimizing batch and streaming data pipelines that ensure efficient and reliable data flow throughout the company. You will support our existing data infrastructure while also extending the toolset with new technologies. You’ll collaborate closely with different teams to propose improvements and innovations that enhance data accessibility and performance.

Responsibilities:

  • Data Pipeline Development: Build, implement, and maintain batch and streaming data pipelines that support the company’s data requirements.
  • Toolset Extension: Continuously expand the data engineering toolkit by integrating new technologies and methodologies that improve performance and scalability.
  • Stack Support: Provide support and maintenance for the current data infrastructure, ensuring that all systems remain operational and efficient.
  • Continuous Improvement: Propose and implement new technologies and improvements to optimize the data engineering process.
  • Collaboration: Work with cross-functional teams to gather requirements and translate them into technical specifications for the data pipeline.

Requirements:

  • 2+ years of experience in data engineering, focusing on building and maintaining data pipelines.
  • Strong expertise in Python and working with data orchestration tools like Airflow.
  • Experience with SQL databases such as PostgreSQL and Redshift.
  • Familiarity with AWS cloud services and container orchestration tools like Docker and Kubernetes.
  • Experience with Spark is a plus.
  • Ability to collaborate with various teams and propose innovative solutions.

What you get in return:

  • You will join a company that cares about work-life balance
  • Annual Bonus based on the performance review cycle
  • Generous Annual Leave Policy
  • Medical insurance and pension fund, with additional benefit packages depending on location
  • Hybrid working model (3 days from our modern office and 2 days fully remotely)
  • Comprehensive workation policy with 30 additional remote working days per year
  • Two additional days of paid leave per year to dedicate to volunteering efforts
Be a key player at the forefront of the digital assets movement, propelling your career to new heights! Join a dynamic and rapidly expanding company that values and rewards talent, initiative, and creativity. Work alongside one of the most brilliant teams in the industry.


Category: Engineering Jobs

Tags: Airflow AWS Data pipelines Docker Engineering Kubernetes Pipelines PostgreSQL Python Redshift Spark SQL Streaming

Perks/benefits: Career development Medical leave Salary bonus

Regions: Remote/Anywhere Europe
Country: Poland
