Senior Data Engineer 80-100% (f/m/d)

Bendern (EBH)

LGT

For more than 100 years, LGT has been the global private bank for wealthy clients who want to invest in a forward-looking and sustainable manner.

LGT Private Banking is a leading international private bank and independent family-owned business. Our thinking is long-term, and we strive to innovate. In recent years, we have grown strongly, both in Europe and in Asia. One factor is crucial to our continued success: digitalization. It is a key strategic priority for the coming years. We want to combine the best of the analog and digital worlds in interdisciplinary and agile teams. Our goal is to create a truly personal customer experience through state-of-the-art services.

Job Description

As part of our strategic corporate development and Data Management & Analytics initiative, we are looking for a committed and talented Data Engineer (f/m/d) who is passionate about designing, developing and optimizing LGT's Data Management and Analytics Platform as a core capability for LGT's digital transformation.

You will work in the Data Analytics and Information Management department. Our mission is to make data, information and insights useful for all areas of LGT: connected, secure and sustainable. From this department we drive the strategic direction for data, information, business intelligence and artificial intelligence throughout LGT. We design, develop and maintain data-driven solutions and provide insights to our internal and external clients, leveraging the potential of data and new technologies for the good of LGT and our clients.

As part of a young, highly motivated team and in cooperation with internal and external partners, you will help build the data infrastructure and knowledge base for intelligent data and information management, reporting services and AI solutions based on LGT's Data Platform.

What are your duties?

In this position, you will take on the following tasks:

  • Work with Product Owners, Data & IT Architects, Data Engineers and Data Scientists to architect, design, develop, implement, and deploy data solutions.
  • Design, develop, operate and maintain interfaces and data pipelines.
  • Research, evaluate, and recommend process improvements, including automated systems for knowledge capture, transformation, and presentation of information.
  • Assist with the definition of integration, mapping, migration, and conversion strategies for new and existing knowledge content and data sources, including robust change control and versioning procedures.
  • Ensure data quality.
  • Support goal-oriented and efficient controlling and reporting.
  • Produce and maintain documentation for developed solutions.

Requirements

  • Bachelor’s degree or higher education in computer science, data science, software engineering or a comparable discipline.
  • 5 years of experience in data engineering.
  • Strong knowledge of building graph-based solutions and data services, preferably in a banking environment.
  • Sound knowledge of application, data & infrastructure architecture and design patterns, business analysis, software development, maintenance and improvement.
  • Ability to produce scalable and robust production-quality code incorporating testing, evaluation, and monitoring.
  • Advanced knowledge of data format standards and database management systems (e.g. Oracle, graph databases).
  • Advanced knowledge of configuration management tools (e.g. Helm, Ansible) and scripting languages such as Python and/or Bash.
  • Ability to work independently and take ownership of and accountability for assigned tasks, driving execution through to completion.
  • Excellent problem-solving and analytical skills, with the ability to apply critical judgment together with sound analysis and troubleshooting.
  • Strong written and verbal communication skills.
  • The following are an advantage:
    • Experience in program/project management, preferably with agile methods.
    • Understanding of authentication methods and security concepts.
    • Experience with PL/SQL.
    • Knowledge of streaming technologies such as Kafka.
    • Knowledge of containerization technologies such as Docker and Kubernetes.
    • Knowledge of Avaloq.
    • Knowledge of SPARQL or Cypher.
    • Knowledge of Extract/Transform/Load (ETL) solutions.

Please note that we cannot consider applications submitted via recruitment agencies for this position.

Contact Information

We look forward to receiving your online application.

For any further information, please do not hesitate to contact us.

LGT Financial Services AG

Human Resources

Corina Hohl