Data Engineer

Al Seef, Bahrain


About Tap

Tap Payments is revolutionizing online payments across the MENA region by connecting businesses with simple, unified payment experiences. We need exceptional talent to help us on this journey.

The Data Team

Our data team is the secret sauce behind our rocket-ship growth: we unleash the power of data analytics to drive strategic decisions, stay ahead of the curve, push boundaries, and challenge the status quo.


As a Tapster you will:

  • Build and maintain efficient, scalable, and resilient batch and streaming data pipelines using Python, SQL, and orchestration tools like Apache Airflow or Prefect.
  • Design and implement ELT/ETL frameworks across distributed systems, integrating structured and unstructured data from multiple source systems (e.g., APIs, logs, databases, cloud services).
  • Own data models and transformations within cloud data platforms such as Redshift, BigQuery, or Databricks.
  • Leverage AWS-native services (e.g., S3, Lambda, Glue, Step Functions, Kinesis, DMS, MSK) or GCP/Azure/OCI equivalents for building data lakehouse and warehouse solutions.
  • Lead initiatives to modernize Tap’s data stack and support our journey toward AI/ML-driven intelligence.
  • Collaborate with analytics, product, and machine learning teams to provision clean, reliable, and timely datasets.
  • Apply CI/CD and Infrastructure-as-Code principles using tools like Terraform, GitLab CI, or AWS CDK to deploy and manage data infrastructure.
  • Enforce data governance, quality checks, lineage, and compliance frameworks using tools such as Great Expectations, Monte Carlo, or Apache Atlas.
  • Implement robust data logging, alerting, and observability strategies using tools such as Datadog, Prometheus, or CloudWatch.
  • Partner with DevSecOps and InfoSec to ensure compliance with ISO 27001, PCI-DSS, and regional data protection regulations.
  • Monitor and tune job performance, cost, and resource usage across environments.

 

What you will bring to the party:

  • 4–7 years of hands-on experience in data engineering, working in high-scale, cloud-native environments.
  • Proven track record of building and optimizing data pipelines, managing data lakes, and architecting data warehouse solutions in fast-paced technology, fintech, or digital-first organizations.
  • Bachelor’s or Master’s degree in Computer Science, Software Engineering, Information Systems, or a related quantitative discipline.
  • Deep proficiency in Python for data processing and scripting, and strong command of SQL for analytics and data modeling.
  • Experience with structured and unstructured database systems (RDBMS, NoSQL, columnar, NewSQL).
  • Experience with cloud data ecosystems like AWS (S3, Glue, Redshift, Lambda), Azure (Data Factory, Synapse), or GCP (BigQuery, Dataflow).
  • Experience with and knowledge of BI tools.
  • Familiarity with orchestration and workflow tools such as Apache Airflow or equivalents at similar scale.
  • Strong understanding of data architecture principles, dimensional modeling, and distributed data systems (e.g., Spark, Kafka, Flink).
  • Experience deploying infrastructure-as-code using Terraform, CloudFormation, or similar tools.
  • Solid grounding in data security, encryption, masking, access control, and compliance standards such as PCI-DSS, GDPR, or ISO 27001.
  • Comfort working with tools like Git, Docker, CI/CD pipelines, and containerized environments.
  • Ability to design systems with reliability, scalability, and performance in mind.


Are you ready to shape the future of payments in MENA?

Apply now


