Data Operations Engineer 100% (f/m/d)

Zurich

Julius Baer

Julius Baer is the international reference in wealth management, based on a solid Swiss heritage



At Julius Baer, we celebrate and value the individual qualities you bring, enabling you to be impactful, to be entrepreneurial, to be empowered, and to create value beyond wealth. Let’s shape the future of wealth management together.

As a Data Operations Engineer, you manage daily data operations on our on-premises data lake environment. You collaborate closely with the development team and take care of platform data-related issues.

YOUR CHALLENGE

  • Prepare datasets and data pipelines, support the business, and troubleshoot data issues
  • Collaborate closely with Program Management and stakeholders
  • Develop data pipelines and data structures for Data Analysts, and test them to ensure they are fit for use
  • Maintain and model JSON-based schemas and metadata for re-use across the organization (with central tools)
  • Resolve and troubleshoot data-related issues and queries
  • Cover all processes from enterprise reporting to data science (incl. MLOps)
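To illustrate the pipeline, schema, and fit-for-use testing duties above, here is a minimal sketch in plain Python. All names, the schema, and the record shape are hypothetical; in practice this logic would run on the platform's Spark/NiFi stack rather than in pure Python:

```python
# Hypothetical JSON-based record schema, of the kind kept in central metadata tools.
TRADE_SCHEMA = {
    "trade_id": int,
    "currency": str,
    "amount": float,
}

def validate(record: dict, schema: dict) -> bool:
    """Fit-for-use check: every schema field is present and has the right type."""
    return all(
        field in record and isinstance(record[field], ftype)
        for field, ftype in schema.items()
    )

def prepare_dataset(raw_records: list) -> tuple:
    """Split incoming records into clean rows and rejects kept for troubleshooting."""
    clean, rejects = [], []
    for rec in raw_records:
        (clean if validate(rec, TRADE_SCHEMA) else rejects).append(rec)
    return clean, rejects

raw = [
    {"trade_id": 1, "currency": "CHF", "amount": 100.0},
    {"trade_id": "oops", "currency": "CHF", "amount": 50.0},  # wrong type -> reject
]
clean, rejects = prepare_dataset(raw)
print(len(clean), len(rejects))  # -> 1 1
```

Routing rejects to a separate output rather than dropping them silently is what makes the "data troubleshooting" part of the role tractable.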

YOUR PROFILE

Competencies:

  • Hands-on Big Data experience using common components (Hive, Spark, Presto/Trino, NiFi, MinIO, K8s, Kafka, S3 storage)
  • Experience in stakeholder management in heterogeneous business/technology organizations
  • Experience in banking or financial services, including handling sensitive data across regions
  • Experience in large data migration projects with on-prem Data Lakes a plus
  • Hands-on experience in Data Science Workbench platforms (e.g. Knime, Cloudera, Dataiku)
  • Track record in Agile project management and methods (e.g., Scrum, SAFe)

Education and skills requirements:

  • Knowledge of reference architectures, especially concerning integrated, data-driven landscapes and solutions
  • Expert SQL skills, preferably in mixed environments (i.e. classic DWH and distributed)
  • Working automation and troubleshooting experience in Python using Jupyter Notebooks or common IDEs
  • Data preparation for reporting/analytics and visualization tools (e.g. Tableau, Power BI, or Python-based)
  • Applying a data quality framework within the architecture
  • Good knowledge of German is beneficial, excellent command of English is essential
  • Higher education (e.g. a “Fachhochschule” degree in “Wirtschaftsinformatik” / business informatics)
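As an illustration of the SQL and data-quality expectations listed above, a typical profiling query counts rows, NULLs, and duplicates in one pass. The sketch below uses Python's built-in sqlite3 as a stand-in for the actual DWH or distributed engines, and the table and column names are invented:

```python
import sqlite3

# In-memory database stands in for a real DWH or distributed SQL engine.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE positions (account_id TEXT, balance REAL)")
conn.executemany(
    "INSERT INTO positions VALUES (?, ?)",
    [("A1", 100.0), ("A1", 100.0), ("A2", None)],  # one duplicate, one NULL
)

# Data-quality profiling in a single query: total rows, NULL rate, duplicate rows.
total_rows, null_balances, dup_rows = conn.execute("""
    SELECT COUNT(*) AS total_rows,
           SUM(CASE WHEN balance IS NULL THEN 1 ELSE 0 END) AS null_balances,
           COUNT(*) - COUNT(DISTINCT account_id || '|' || COALESCE(balance, ''))
               AS dup_rows
    FROM positions
""").fetchone()

print(total_rows, null_balances, dup_rows)  # -> 3 1 1
```

Queries of this shape port with minor dialect changes between classic DWH SQL and distributed engines such as Hive or Trino, which is the kind of "mixed environment" fluency the profile asks for.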

We are looking forward to receiving your full job application through our online application tool. Further interesting job opportunities can be found on our Career site.

Is this not quite what you are looking for? Set up a job alert by creating a candidate account here.

