Data Engineer

Iasi, Romania

Suvoda

IRT, eConsent, eCOA, and ePatient solutions to help you wisely guide novel science through complex clinical trials.

Department: Product Architecture

Reports to: Manager, Data Engineering

Suvoda is seeking a talented and motivated Data Engineer to support the development of our modern data platform. You’ll help build domain-oriented data products using GraphQL APIs and contribute to near real-time reporting through AWS DMS replication to Aurora PostgreSQL.
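For a sense of what a domain-oriented data product exposed over GraphQL can look like, here is a minimal sketch in Python using the open-source graphene library. The Site type, its fields, and the in-memory data are hypothetical illustrations, not Suvoda’s actual schema or stack.

```python
# Minimal sketch of a domain-owned data product exposed via GraphQL,
# built with the graphene library. All names and data are hypothetical.
import graphene


class Site(graphene.ObjectType):
    """One record in an illustrative 'clinical site' data product."""
    site_id = graphene.ID()
    country = graphene.String()
    enrolled_subjects = graphene.Int()


# In-memory stand-in for a domain team's own storage layer.
SITES = [
    {"site_id": "S-001", "country": "RO", "enrolled_subjects": 42},
    {"site_id": "S-002", "country": "DE", "enrolled_subjects": 17},
]


class Query(graphene.ObjectType):
    sites = graphene.List(Site, country=graphene.String())

    def resolve_sites(root, info, country=None):
        # Optional filter argument; in practice this would query the
        # domain's own database or service rather than a Python list.
        rows = SITES if country is None else [s for s in SITES if s["country"] == country]
        return [Site(**row) for row in rows]


schema = graphene.Schema(query=Query)

# Example query a downstream consumer might run against the data product.
result = schema.execute('{ sites(country: "RO") { siteId enrolledSubjects } }')
print(result.data)  # {'sites': [{'siteId': 'S-001', 'enrolledSubjects': 42}]}
```

In a data mesh, each domain team would own a schema like this and back its resolvers with its own storage, rather than a shared in-memory list.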

You’ll also develop and maintain ETL/ELT pipelines using AWS Glue and PySpark, ensuring efficient and scalable data processing across our platform.
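As a rough illustration of that kind of pipeline, the sketch below shows a Glue PySpark job that reads a table from the Glue Data Catalog, de-duplicates it, and writes partitioned Parquet back to S3. It only runs inside an AWS Glue job environment (where the awsglue library is available), and the database, table, column, and bucket names are hypothetical placeholders.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Standard Glue job bootstrap: resolve the job name and set up contexts.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw events from a Glue Data Catalog table (hypothetical names).
raw = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="site_events"
).toDF()

# Light transformation: de-duplicate and stamp a load date for partitioning.
curated = raw.dropDuplicates(["event_id"]).withColumn("load_date", F.current_date())

# Write curated data back to the lake as partitioned Parquet.
(curated.write
    .mode("append")
    .partitionBy("load_date")
    .parquet("s3://example-curated-bucket/site_events/"))

job.commit()
```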

Responsibilities:

  • Assist in implementing a data mesh architecture using GraphQL APIs to expose domain-owned data products.
  • Help build and maintain an AWS-based data lake using S3, Glue, Lake Formation, Athena, and Redshift (see the Athena query sketch after this list).
  • Develop and maintain ETL/ELT pipelines using AWS Glue and PySpark for batch and streaming data workloads.
  • Support AWS DMS pipelines to replicate data into Aurora PostgreSQL for near real-time analytics.
  • Follow best practices for data governance, quality, observability, and API design.
  • Collaborate with product, engineering, and analytics teams to deliver reliable data solutions.
  • Contribute to CI/CD and automation efforts for data infrastructure and pipelines.
  • Stay informed about new tools and technologies to support platform improvements.
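
For the data lake responsibility above, a common consumption path is querying curated S3 data through Athena. The sketch below uses boto3 to start a query, poll for completion, and read the first page of results; the region, database, SQL, and results bucket are hypothetical placeholders.

```python
import time

import boto3

athena = boto3.client("athena", region_name="eu-central-1")

# Submit a query against a hypothetical lake database.
response = athena.start_query_execution(
    QueryString="SELECT country, COUNT(*) AS site_count FROM site_events GROUP BY country",
    QueryExecutionContext={"Database": "analytics_lake"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/queries/"},
)
query_id = response["QueryExecutionId"]

# Athena runs queries asynchronously, so poll until a terminal state.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    # First row is the header; remaining rows hold the aggregated values.
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```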

Requirements:

  • Bachelor’s degree in a technical field such as Computer Science or Mathematics.
  • At least 2 years of experience in data engineering or a related field.
  • Familiarity with GraphQL APIs for data access.
  • Experience with AWS Glue and PySpark for ETL/ELT development.
  • Exposure to AWS data lake technologies (S3, Glue, Lake Formation, Athena, Redshift).
  • Understanding of AWS DMS and Aurora PostgreSQL for data replication.
  • Basic knowledge of data mesh principles and distributed data systems.
  • Proficiency in Python and SQL; familiarity with infrastructure-as-code tools is a plus.
  • Experience with data modeling, orchestration tools (e.g., Airflow), and CI/CD pipelines.
  • Strong problem-solving and communication skills.

Preferred Qualifications:

  • Master’s degree or relevant certifications.
  • Experience with event-driven architectures (e.g., Kafka, Kinesis).
  • Familiarity with data cataloging and metadata management tools.
  • Awareness of data privacy and compliance standards (e.g., GDPR, HIPAA).
  • Exposure to agile development and DevOps practices.

We are aware that individuals are fraudulently representing themselves as Suvoda recruiters and/or hiring managers. Suvoda will never request personal information such as your bank account number, credit card number, driver’s license, or Social Security number, nor request payment from you, during the job application or interview process. Any emails from the Suvoda recruiting team will come from a @suvoda.com email address. You can learn more about these types of fraud by referring to this FTC consumer alert.

As set forth in Suvoda’s Equal Employment Opportunity policy, we do not discriminate on the basis of any protected group status under any applicable law.

If you are based in California, we encourage you to read this important information for California residents linked here.
