Data Engineer

Zagreb, Croatia


Company Description

Ergomed is a rapidly expanding, full-service, mid-sized CRO specialising in Oncology and Rare Disease.

Since its foundation in 1997, the company has grown organically and steadily by making strategic investments and landmark acquisitions, and now has operations in Europe, North America and Asia.

Our company allows for employee visibility (you have a voice!), creative contribution and realistic career development.

We have nurtured a truly international culture here at Ergomed.

We value employee experience, well-being and mental health, and we acknowledge that a healthy work-life balance is a critical factor in employee satisfaction, which in turn nurtures an environment in which high-quality client service can be delivered.

Come and join us on this exciting journey to make a positive impact on patients' lives.

Job Description

The Data Engineer will play a pivotal role in building and operationalizing the data needed for the company's enterprise data management, analytics and business intelligence initiatives, following industry-standard practices and tools.

The bulk of the Data Engineer's work will be in building, managing and optimizing data integration pipelines, and then moving those pipelines into production for key analytics consumers (business domain owners, business/data analysts, product owners, and decision makers at the operational, tactical and strategic levels), or for any group that needs curated insights for data-informed problem solving across the enterprise.
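As a purely illustrative sketch of the kind of pipeline work described above (a minimal example under assumed tooling, not a description of Ergomed's actual stack), an Airflow DAG orchestrating an extract-then-load pipeline might look like this; the DAG id, task names and extract/load functions are hypothetical placeholders:

# Minimal Airflow DAG sketch (Airflow 2.4+ syntax); all names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_source_data():
    # Placeholder: pull data from a hypothetical source system.
    print("extracting source data")


def load_to_warehouse():
    # Placeholder: load curated data into a hypothetical warehouse table.
    print("loading curated data")


with DAG(
    dag_id="example_integration_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_source_data)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load  # extract runs before load

Moving such a pipeline into production then typically means adding scheduling, monitoring and alerting around it so that downstream analytics consumers can rely on the curated output.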

Qualifications

Education

A master's degree in computer science, data science, software engineering, or a related field.

Experience

At least three years of experience in BI development, data analytics, data engineering, software engineering, or a similar role.

Expertise in data modelling, ETL development, data architecture and master data management.

Strong experience with various data management architectures, such as data warehouse, data lake and lakehouse architectures and data fabric versus data mesh concepts, and with supporting processes such as data integration, MPP engines, governance and metadata management.

Intermediate experience with Apache technologies such as Spark, Kafka and Airflow for building scalable and efficient data pipelines (see the illustrative sketch after this list).

Strong experience in designing, building and deploying data solutions that capture, explore, transform and utilize data to create data products and support data-informed initiatives. Proficiency in ETL/ELT, data replication/CDC, message-oriented data movement, API design and access, and emerging data ingestion and integration technologies such as stream data integration and data virtualization.

Basic knowledge of data science languages and tools such as R, Python, TensorFlow, Databricks, Dataiku, SAS or others.

Proficiency in the design and implementation of modern data architectures and concepts, including cloud services (e.g. AWS, OCI, Azure, GCP) and modern data warehouse tools (e.g. Snowflake, Databricks).

Strong experience with database technologies such as SQL and NoSQL systems, PostgreSQL, Oracle, Hadoop and Teradata.

Intermediate experience with popular data discovery, analytics and BI tools such as Power BI, Tableau, Qlik Sense, Looker, ThoughtSpot, MicroStrategy or others for semantic-layer-based data discovery is an advantage.

Expert problem-solving and debugging skills, including the ability to determine the source of issues in unfamiliar code or systems and to recognize and solve recurring problems.
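As a hedged illustration of the Spark and Kafka experience mentioned above (a minimal sketch under assumed settings, not a prescribed solution), a Spark Structured Streaming job that ingests a Kafka topic into files might look like this; the broker address, topic name and output paths are hypothetical, and the Spark Kafka connector package is assumed to be available on the cluster:

# Minimal PySpark Structured Streaming sketch; all endpoints and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("example-kafka-ingest").getOrCreate()

# Read a hypothetical Kafka topic as a streaming DataFrame.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "example-topic")
    .load()
)

# Kafka delivers key and value as binary; cast them to strings for downstream use.
decoded = events.select(
    col("key").cast("string").alias("key"),
    col("value").cast("string").alias("value"),
    col("timestamp"),
)

# Write the stream to a hypothetical path, checkpointing for fault tolerance.
query = (
    decoded.writeStream.format("parquet")
    .option("path", "/tmp/example-output")
    .option("checkpointLocation", "/tmp/example-checkpoint")
    .start()
)
query.awaitTermination()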

 

Additional Information

We prioritize diversity, equity, and inclusion by creating an equal opportunities workplace and a human-centric environment where people of all cultural backgrounds, genders and ages can contribute and grow.  

To succeed, we must work together with a human-first approach. Why? Because our people are our greatest strength, driving our continued success in improving the lives of those around us.

 We offer: 

  • Training and career development opportunities internally  

  • Strong emphasis on personal and professional growth 

  • Friendly, supportive working environment 

  • Opportunity to work with colleagues based all over the world, with English as the company language 

Our core values are key to how we operate, and if you feel they resonate with you then Ergomed is a great company to join!  

Quality 

Integrity & Trust  

Drive & Passion  

Agility & Responsiveness  

Belonging 

Collaborative Partnerships  

We look forward to welcoming your application. 

 
