Data Engineer

Abu Dhabi, United Arab Emirates

Contango

Specialists in Transformation. Your Trusted Partner for Sustainable Business Success. Contango is your strategic partner for transformative growth and sustained success. Our team excels in providing comprehensive growth...

About the Role

We are seeking a motivated and technically versatile Data Engineer to join our team. You will play a key role in delivering data platforms, pipelines, and ML enablement within a Databricks on Azure environment.

As part of a stream-aligned delivery team, you’ll work closely with Data Scientists, Architects, and Product Managers to build scalable, high-quality data solutions for clients. You'll be empowered by a collaborative environment that values continuous learning, Agile best practices, and technical excellence.

Ideal candidates have strong hands-on experience with Databricks, Python, and Azure Data Factory (ADF), and are comfortable in fast-paced, client-facing consulting engagements.

Skills and Experience Requirements

1. Technical

  • Databricks (or similar) – e.g. notebooks (Python, SQL), Delta Lake, job scheduling, cluster and workspace management, Unity Catalog, and access-control awareness
  • Cloud data engineering – ideally Azure, including storage (e.g., ADLS, S3), compute, and secrets management
  • Development languages such as Python, SQL, C#, JavaScript, etc., especially for data ingestion, cleaning, and transformation
  • ETL / ELT – including structured logging, error handling, reprocessing strategies, APIs, flat files, databases, message queues, event streaming, event sourcing, etc.
  • Automated testing (ideally TDD), pairing/mobbing, trunk-based development, continuous deployment, and Infrastructure-as-Code (Terraform)
  • Git and CI/CD for notebooks, data pipelines, and deployments

2. Integration & Data Handling

  • Experienced in delivering platforms for clients – including file transfer, APIs (REST, etc.), SQL/NoSQL/graph databases, JSON, CSV, XML, Parquet, etc.
  • Data validation and profiling – assessing incoming data quality and coping with schema drift, deduplication, and reconciliation
  • Testing and monitoring pipelines – unit tests for transformations, data checks, and pipeline observability

3. Working Style

  • Comfortable leveraging the best of lean, agile, and waterfall approaches; able to contribute to planning, estimation, and documentation, as well as to collaborative daily re-prioritisation
  • Able to explain technical decisions to teammates or clients
  • Documents decisions and keeps stakeholders informed
  • Comfortable seeking support from other teams for product, Databricks, and data architecture
  • Happy to collaborate with the Data Science team on complex subsystems

Nice-to-haves

  • MLflow or light MLOps experience (for the data science touchpoints)
  • dbt, Dagster, Airflow, or similar transformation/orchestration tools
  • Understanding of security and compliance (especially around client data)
  • Past experience in consulting or client-facing roles

Candidate Requirements

  • 5–8 years of experience (minimum 3–4 years hands-on with cloud/data engineering, 1–2 years with Databricks/Azure, and team/project leadership exposure)
  • Bachelor’s degree in Computer Science, Data Engineering, Software Engineering, or Information Systems

Disclaimer:

This job posting is not open to recruitment agencies. Any candidate profile submitted by a recruitment agency will be treated as having been received directly from the applicant. Contango reserves the right to contact the candidate directly, without incurring any obligation or liability for payment of any fees to the recruitment agency.

