Data Engineer

Budapest, Hungary

Job Description Summary

The Data Engineer is part of the team responsible for building, maintaining, and optimizing the global BI and Analytics platform that supports all GOS service lines. In addition to helping design, implement, and support the data artifacts in the database and data lake, they will develop, test, and maintain data extract, load, and transformation (ELT/ETL) pipelines and data flows to ingest data from multiple heterogeneous sources. They will also use their analysis, design, and development skills to create and enhance data ingestion processes and to prepare data sets for benchmarking, trend analysis, and data science.

Job Description

Responsibilities:

  • Support the technical leads in building, implementing, and maintaining the technical data infrastructure that is the foundation of the BI platform, and support global data strategy initiatives,
  • Develop, test, and maintain Azure Data Factory pipelines and data flows to manage the extract, transform, and load of data from a variety of data sources to the data lake and data warehouse,
  • Use T-SQL to build, maintain, and enhance database objects (tables, views, procedures, etc.) in Azure SQL Database,
  • Use Azure Databricks for complex data cleansing, preparation, and analysis to meet the functional and non-functional needs of the business,
  • Work closely with Business and Data Analysts to understand and document source-to-target mapping specifications and to build data ingestion flows to the specified requirements,
  • Help to develop and enhance the Triana data warehouse as part of planned and ad-hoc development activities,
  • Contribute to high-/low-level designs and technical specifications of existing and new data integration processes to the required business and technical standard,
  • Support data science and other experimental initiatives by using the appropriate tool set, including Databricks, ADF, Python, and SQL, to wrangle and prepare large and/or complex structured and semi-structured data (an illustrative sketch follows this list),
  • Perform complex data analysis, exploring ways to improve data quality and extract value from raw data,
  • Work with Product Owners, Analysts and Account Teams to assist with data-related technical issues and propose suitable solutions, both short and long term,
  • Provide both verbal and written updates to management on the progress of work,
  • Promote effective and consistent communication – both within and outside of the team,
  • Work closely with a multi-disciplinary team in an Agile / Scrum environment, ensuring quality deliverables at the end of each sprint,
  • Contribute to a mindset of innovation & constant improvement for best practices and business standards in data, analytics and development, helping to turn raw data into actionable insights,
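
For illustration, the following is a minimal PySpark sketch of the kind of Databricks cleansing and preparation work described above; the paths, column names, and rules are hypothetical stand-ins, not artifacts of the actual platform:

    # Minimal, hypothetical sketch of a Databricks-style PySpark cleansing job.
    # All paths, column names, and rules below are invented for illustration.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Ingest raw, semi-structured source data from the data lake (hypothetical path).
    raw = spark.read.json("/mnt/datalake/raw/service_requests/")

    # Standardise types, trim strings, and drop records missing a business key.
    clean = (
        raw
        .withColumn("request_date", F.to_date("request_date", "yyyy-MM-dd"))
        .withColumn("site_code", F.upper(F.trim("site_code")))
        .dropna(subset=["request_id"])
        .dropDuplicates(["request_id"])
    )

    # Aggregate a curated data set for benchmarking and trend analysis.
    trends = (
        clean.groupBy("site_code", F.year("request_date").alias("year"))
             .agg(F.count("*").alias("request_count"))
    )

    # Land the curated output for downstream BI and data science use.
    trends.write.mode("overwrite").parquet("/mnt/datalake/curated/request_trends/")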

Knowledge & Experience:

  • In-depth experience (2-3 years) with at least one of the following technologies, plus broad practical experience (1+ years) with at least one other technology on this list:
    • Database design & development in Azure SQL Database and/or SQL Server 2019+,
    • Developing data integration processes using Azure Data Factory,
    • Preparing and wrangling data using Databricks (ideally) or similar, e.g. PySpark notebooks in Synapse or Jupyter Notebooks,
    • Experience in at least one language commonly used in data cleansing, wrangling and statistical analysis: Python, Scala, R, etc.
  • Good knowledge of using T-SQL to query data, build reports, troubleshoot common data issues and summarise/aggregate data to derive new insights,
  • Good understanding of the principles of data warehouse design and data integration processes to extract, transform, cleanse and prepare data for BI reporting and advanced analytics,
  • Routine use of source control (Git, SVN, TFS, etc.) as part of regular development activities,
  • Experience gathering, analysing, and documenting system requirements and technical specifications for new requirements, as well as reverse engineering existing solutions,
  • Proven experience troubleshooting and resolving data and code issues: debugging, identifying bugs, and driving them to closure by working closely with the rest of the team (a small illustrative sketch follows this list),
  • Able to demonstrate strong communication skills: documenting business processes in the form of process maps and step-by-step user manuals, data analysis, definition of workarounds, etc.,
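
As a small illustration of the troubleshooting skills above, here is a hedged Python sketch that profiles an extract for common quality issues (null values and duplicate business keys); the table and column names are invented for the example:

    # Hypothetical data-quality profiling helper; all names are invented.
    import pandas as pd

    def profile_quality(df: pd.DataFrame, key: str) -> pd.DataFrame:
        """Summarise common data issues per column: null counts and duplicate keys."""
        summary = pd.DataFrame({
            "null_count": df.isna().sum(),
            "null_pct": (df.isna().mean() * 100).round(2),
        })
        summary.loc[key, "duplicate_keys"] = df[key].duplicated().sum()
        return summary

    # Toy extract with one duplicate key and one missing value.
    df = pd.DataFrame({
        "request_id": [1, 2, 2, 4],
        "site_code": ["BUD", None, "BUD", "LON"],
    })
    print(profile_quality(df, key="request_id"))

The same checks would typically be expressed in T-SQL (e.g. GROUP BY ... HAVING COUNT(*) > 1) when the data already sits in Azure SQL Database.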

Nice to have

  • Test-driven development using common testing frameworks such as pytest, nUnit, or tSQLt (a minimal pytest sketch follows this list)
  • Experience of migrations-based Database Lifecycle Management, e.g. Flyway, Liquibase, etc.
  • An understanding of software engineering principles including design patterns; proper use of version control; branching & merging; partially or fully automated release builds, CI/CD etc.
  • Experience with Data Lake House technologies including Delta Tables, Unity Catalog, Delta Live, Spark Structured Streaming, Photon, etc.
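
On the test-driven development point, here is a minimal pytest sketch; the transformation under test is a hypothetical stand-in, not part of the actual platform:

    # Hypothetical example of test-first development with pytest.
    # normalise_site_code is an invented stand-in for a real transformation.
    import pytest

    def normalise_site_code(raw: str) -> str:
        """Trim whitespace and upper-case a site code; reject empty input."""
        code = raw.strip().upper()
        if not code:
            raise ValueError("site code must not be empty")
        return code

    def test_trims_and_uppercases():
        assert normalise_site_code("  bud ") == "BUD"

    def test_rejects_empty_input():
        with pytest.raises(ValueError):
            normalise_site_code("   ")

In a TDD workflow the two tests would be written first, and the function implemented until they pass.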

Personal Qualities:

  • Excellent communication in English (verbal and written),
  • Curious, analytical mind, driven to solve sometimes complex problems,
  • Committed, able to work to deadlines, prioritise, and juggle multiple tasks,
  • Thoughtful, collaborative, and proactive, ready to engage with stakeholders across the globe,
  • High attention to detail and extremely organized, diligent, and focused,
  • Able to communicate effectively at all levels, including the ability to negotiate and resolve conflict; build & maintain effective working relationships with internal and external personnel,
  • Commitment to quality, strong sense of ownership and a thorough approach to work,
  • Demonstrates flexibility and the ability to deliver; a good team player, also able to work independently,
  • Ability to understand the mechanics and dependencies of complex technical systems and architectures.

What we offer:

  • Competitive compensation and benefit package
  • Great learning and development opportunities
  • Modern, award-winning office with a view of the Danube
  • Central location, excellent public transport
  • Youthful and supportive work environment
  • Additional holidays to compensate for Hungarian public holidays falling on a weekend
  • A steadily growing, 100+ year-old international company

Company: Cushman & Wakefield