CIFRE PhD

France, Ile-de-France, Hauts-de-Seine (92)

Applications have closed

Framatome

Framatome's teams design and build nuclear power plants and are present at every stage of the process, across all reactor technologies.


General information

Legal entity

At Framatome, a subsidiary of EDF, we design and supply equipment, services, fuel, and instrumentation and control systems for nuclear power plants worldwide.
Every day, our 18,000 employees enable our customers to produce a low-carbon energy mix that is ever cleaner, safer, and more economical.
Our teams also develop solutions for the defense, nuclear medicine, and space sectors.

Present in some twenty countries, Framatome brings together the expertise of passionate men and women convinced that nuclear power is an energy of the future.

As a responsible company, we take action to train and support early professional experiences (Happy Trainees label), to include all talents, including people with disabilities, to promote professional equality and gender diversity across our professions (94/100 on the gender equality index), and to support work-life balance.

To follow our news, find us on www.framatome.com, LinkedIn, Instagram, and X.

Reference

2024-18684  

Publication date

02/10/2024

Job description

Job family

TA - STUDIES - DESIGN & ENGINEERING - TAH - Core design and reactor operation

Job title

CIFRE PhD - Uncertainty Quantification in Neural Networks for Critical Applications F/H

Contract

CIFRE

Salary range

€35k to €40k

BU description

DTI

Mission description

This CIFRE PhD deals with uncertainty quantification in neural networks for critical applications.

Neural networks are statistical learning models capable of processing complex data and learning non-linear relationships between inputs and outputs of interest.

However, neural networks, like all learning algorithms, are subject to uncertainty, particularly in the presence of noisy data, incomplete data, or data that differs from the data used to train the algorithm. This uncertainty can manifest itself in a number of ways, including misclassifications, erroneous predictions, or low confidence scores.

Uncertainty quantification (UQ) is the process of estimating the uncertainty associated with a measurement or estimate. It is widely used in statistical learning and data science, where accounting for the uncertainty of both the data and the model is important. UQ can therefore be applied to estimate the uncertainty of a neural network. This uncertainty can then be used to improve the robustness of the network to noisy and incomplete data, or to make more informed decisions.
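One common way to obtain such uncertainty estimates is an ensemble: train several models and use the spread of their predictions as the uncertainty. A minimal sketch using only the standard library, with perturbed toy linear "members" standing in for independently trained networks (all names, values, and data here are illustrative, not part of the project):

```python
import random
import statistics

random.seed(0)

# Toy "ensemble": perturbed linear models standing in for
# independently trained neural networks (purely illustrative).
def make_member(noise: float):
    a = 2.0 + random.gauss(0.0, noise)  # perturbed slope
    b = 1.0 + random.gauss(0.0, noise)  # perturbed intercept
    return lambda x: a * x + b

ensemble = [make_member(noise=0.1) for _ in range(20)]

def predict_with_uncertainty(x: float):
    """Ensemble mean = prediction; ensemble spread = uncertainty estimate."""
    preds = [m(x) for m in ensemble]
    return statistics.mean(preds), statistics.stdev(preds)

mean, std = predict_with_uncertainty(3.0)  # true function would give 7.0
```

A low spread signals that the members agree; a large spread flags inputs on which the prediction should not be trusted blindly.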

Available techniques offer different ways of estimating uncertainty in neural networks, and the choice of method depends on the specific application and the available resources. Research in this field is ongoing, and new techniques may emerge in the future.

Multiple approaches exist for estimating the uncertainty of neural network predictions. However, many questions remain open before these methods can be used for critical applications such as aeronautics, nuclear, medical, or autonomous driving, i.e. applications whose robustness can have a significant impact on human life. Indeed, it is important both to assess the quality of a model's predictions and to ensure that its uncertainty estimates are reliable; in other words, one must know whether the prediction and its uncertainty can be trusted. This question becomes even harder when the algorithm is fed data far removed from the training set. We speak of distribution shift when the distribution of the new data is (partially) different from that of the training data, and of out-of-distribution (OOD) data when the new data has nothing in common with the training data. These cases arise when the algorithm is deployed in "real-world" scenarios that may differ from the setting established during the training phase.
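A simple baseline for flagging such out-of-distribution inputs is a distance-based score: inputs far from every training point are treated as suspect. A minimal standard-library sketch with illustrative 2-D data (the data, names, and threshold-free score are assumptions for illustration, not the project's method):

```python
import math

# Illustrative training points in a 2-D feature space.
train = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 0.5)]

def ood_score(x, k: int = 3) -> float:
    """Mean Euclidean distance to the k nearest training points.
    A large score means the input is far from the training distribution."""
    dists = sorted(math.dist(x, p) for p in train)
    return sum(dists[:k]) / k

in_dist = ood_score((0.5, 0.4))    # close to the training cloud -> low score
far_out = ood_score((10.0, 10.0))  # unlike anything seen -> high score
```

In practice such scores are computed in a learned feature space rather than raw inputs, and a threshold calibrated on held-out data decides when to abstain.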

The available scientific literature mentions a few avenues of research into uncertainty estimation for in-distribution data (the conformal approach, for example). However, further research is still needed for out-of-distribution data that is not known in advance, for which current UQ methods are not very robust.
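The conformal approach mentioned above can be sketched as split conformal prediction: calibrate a quantile of absolute residuals on held-out data, then widen every point prediction by that quantile. A toy standard-library sketch (the model and data are illustrative assumptions):

```python
import math
import random

random.seed(0)

model = lambda x: 2.0 * x  # stand-in point predictor (illustrative)

# Held-out calibration set, assumed exchangeable with the training data.
calib = [(x, 2.0 * x + random.gauss(0.0, 0.5)) for x in range(100)]

# Nonconformity scores: absolute residuals on the calibration set.
scores = sorted(abs(y - model(x)) for x, y in calib)

alpha = 0.1  # target 90% marginal coverage
n = len(scores)
# Conformal quantile: the ceil((n + 1) * (1 - alpha))-th smallest score.
q = scores[min(n - 1, math.ceil((n + 1) * (1 - alpha)) - 1)]

def predict_interval(x):
    """[f(x) - q, f(x) + q]; the coverage guarantee holds only for
    exchangeable (in-distribution) data, which is exactly the limitation
    this PhD targets."""
    return model(x) - q, model(x) + q

lo, hi = predict_interval(5.0)  # interval around the point prediction 10.0
```

Under distribution shift the exchangeability assumption breaks and the interval's coverage guarantee no longer holds, which motivates the research directions above.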

This PhD is funded through the CIFRE scheme, with joint supervision by Framatome and CentraleSupélec (Centre for Visual Computing laboratory).

Profile

· Strong background in applied mathematics, data science, computer science, machine learning, and deep learning; skills in embedded systems and nuclear physics would be appreciated;

· Proficiency in programming languages such as Python or C/C++;

· Good analytical and problem-solving skills, with a strong passion for discovery and cutting-edge research;

· Effective communication skills in English, and the ability to work both independently and as part of a multidisciplinary team;

· Experience with machine learning applied to physics and the PyTorch framework would be ideal.

Job location

France, Ile-de-France, Hauts-de-Seine (92)

Site

La Défense

BU

DTI - DTIP

Candidate criteria

Minimum level of education required

Bac+5 (Master's level)

Minimum level of experience required

Recent graduate

Employment level

Cadre (professional status)

Additional information

Position subject to an administrative background check

Yes

Position subject to authorization under export controls

No
