Data Engineer with French

Bucharest Orhideea, Romania

Thales

From Aerospace, Space and Defence to Security & Transportation, Thales helps its customers create a safer world by giving them the tools they need to perform critical tasks.



Location: Bucharest, Romania

The people we all rely on to make the world go round – they rely on Thales. Thales relies on its employees to invent the future: right here, right now.

Present in Romania for over 40 years, Thales is expanding its presence in the country by growing its Digital capabilities and by developing a Group Engineering Competence Centre (ECC). Operating from Bucharest, Thales delivers solutions in a number of core businesses, from ground transportation, space and defence, to security and aeronautics.
Several professional opportunities have arisen. If you are looking for the solidity of a global Group at the forefront of innovation, combined with the agility of a human-scale structure that is attentive to the personal development of its employees and offers opportunities for evolution in an international environment, then this is the place for you!

We are looking for Data Engineers with a high degree of autonomy and strong technical skills. They must be capable of understanding the execution of an existing project as well as developing based on our specifications. The project operates in a pseudo SAFe environment, employing agile practices. The existing teams are already working across multiple sites. Proficiency in French is essential.

The TSN team within this Big Data scope currently consists of approximately 15 people (plus business analyst and management support), working under a customer project manager.

The Data Engineers will be responsible for performing a range of activities, including:

  • Participating in the design of data ingestion or data valorization (enrichment) processes in PySpark based on business rules.
  • Designing scripts or encapsulating scripts in PySpark.
  • Implementing Oozie pipelines for scheduling processing steps.
  • Versioning their work using Git.
  • Mastering, or at least being familiar with, the principles of CI/CD delivery-chain implementation.
  • Being knowledgeable about the principles and requirements associated with Big Data (component encapsulations, queue calls, Spark sessions, Big Data CI/CD, etc.).
  • Preparing and executing unit tests (UT), including the creation of data sets as needed, and utilizing ALM if they work with the tool.
  • Documenting their work through deliverables in Git and/or Confluence and/or .doc/.xls formats.
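As an illustration of the rule-based valorization and unit testing described above — assuming a hypothetical business rule that classifies a numeric reading against a valid range (the rule, thresholds, and function names are illustrative, not taken from the project specification) — the core logic can be kept as a plain Python function, so it is unit-testable with small hand-built data sets before being wrapped in a PySpark expression or UDF:

```python
# Hypothetical business rule: classify a reading as "valid", "out_of_range",
# or "missing". In a PySpark job this logic would typically be expressed with
# F.when(...) column expressions or registered as a UDF; keeping it as a plain
# function makes it easy to cover with unit tests (UT) on small data sets.

MIN_VALUE = 0.0    # illustrative thresholds, not from the specification
MAX_VALUE = 100.0


def classify_reading(value):
    """Return the label a business rule would assign to one reading."""
    if value is None:
        return "missing"
    if MIN_VALUE <= value <= MAX_VALUE:
        return "valid"
    return "out_of_range"


def build_test_dataset():
    """Hand-crafted (reading, expected_label) pairs for unit testing."""
    return [
        (42.0, "valid"),
        (-1.0, "out_of_range"),
        (100.5, "out_of_range"),
        (None, "missing"),
    ]


if __name__ == "__main__":
    for value, expected in build_test_dataset():
        assert classify_reading(value) == expected
    print("all unit tests passed")
```

In the actual PySpark pipeline, the same rule would be applied column-wise on a DataFrame; isolating it as above is one common way to keep the unit tests independent of a Spark session.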

The Big Data clusters are Kerberized, running the Hortonworks distributions HDP 3.1.5 and HDF 3.5.1 (migration to Cloudera in progress). The main technical components used include:

  • Spark/Python
  • HDFS
  • Hive
  • HBase
  • Oozie
  • YARN
  • Atlas
  • Jenkins
  • Kerberos
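To give a flavour of how these components fit together — Oozie scheduling a Spark step on YARN, as mentioned in the responsibilities — here is a minimal, illustrative workflow definition. All names, paths, and parameters are placeholders, not values from the project:

```xml
<!-- Minimal illustrative Oozie workflow: a single Spark action, then end.
     Names and properties (ingest-wf, ingest.py, ${jobTracker}, ${nameNode})
     are placeholders, not project values. -->
<workflow-app name="ingest-wf" xmlns="uri:oozie:workflow:0.5">
    <start to="spark-ingest"/>
    <action name="spark-ingest">
        <spark xmlns="uri:oozie:spark-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <master>yarn</master>
            <mode>cluster</mode>
            <name>ingest-step</name>
            <jar>ingest.py</jar>
        </spark>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Ingestion failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

On a Kerberized cluster such as the one described, the workflow would additionally carry the appropriate credentials configuration for Hive/HBase access.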

Benefits

  • Training opportunities
  • 24 paid vacation days
  • Attractive salary
  • Medical care
  • Restaurant tickets
  • International environment

At Thales we provide CAREERS, not only jobs. With 80,000 employees in 68 countries, Thales's mobility policy enables thousands of employees each year to develop their careers at home and abroad, in their existing areas of expertise or by branching out into new fields. Together we believe that embracing flexibility is a smarter way of working. Great journeys start here, apply now!