Senior Data Engineer

Remote

Tawzef

Tawzef provides recruitment, manpower outsourcing, employer-of-record, payroll outsourcing, psychometric assessment, and HR consultancy services in Egypt.



This is a remote position.

● Design, develop, and maintain scalable, secure, and cost-effective data infrastructure on cloud platforms such as AWS, GCP, or Azure.
● Implement cloud-native solutions such as data lakes, data warehouses (e.g., Snowflake, BigQuery, Redshift), and serverless computing.
● Build, deploy, and manage real-time and batch data pipelines using tools such as Apache Airflow, Apache Kafka, or cloud-native orchestration solutions.
● Optimize ETL/ELT processes for seamless data ingestion, transformation, and integration from diverse sources.
● Ensure high performance, scalability, and cost-efficiency of cloud data solutions through tuning and capacity planning.
● Implement caching strategies and data partitioning techniques for large-scale datasets.
● Enforce best practices for data privacy, security, and regulatory compliance (e.g., GDPR, PCI DSS).
● Implement identity management, encryption, and monitoring tools to safeguard sensitive data.
● Work closely with cross-functional teams to understand business requirements and deliver cloud-based data solutions.
● Mentor junior data engineers and contribute to team knowledge sharing.
● Evaluate emerging cloud and big data technologies and recommend enhancements to our data infrastructure.
● Drive automation in deployment, testing, and monitoring using CI/CD pipelines and Infrastructure-as-Code (e.g., Terraform, CloudFormation).
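As a small illustration of the data-partitioning responsibility above, the sketch below groups records into date-keyed partitions in plain Python, mimicking the `dt=YYYY-MM-DD` layout common in data lakes. The record fields and key format are assumptions for the example, not part of this role's actual stack:

```python
from collections import defaultdict
from datetime import date

# Hypothetical sample records; field names are illustrative only.
events = [
    {"event_id": 1, "amount": 120.0, "ts": date(2024, 5, 1)},
    {"event_id": 2, "amount": 75.5,  "ts": date(2024, 5, 1)},
    {"event_id": 3, "amount": 300.0, "ts": date(2024, 5, 2)},
]

def partition_by_day(records):
    """Group records into date-keyed partitions, mimicking a
    dt=YYYY-MM-DD directory layout in a data lake."""
    partitions = defaultdict(list)
    for rec in records:
        partitions[rec["ts"].isoformat()].append(rec)
    return dict(partitions)

parts = partition_by_day(events)
for key in sorted(parts):
    print(f"dt={key}: {len(parts[key])} record(s)")
# → dt=2024-05-01: 2 record(s)
# → dt=2024-05-02: 1 record(s)
```

In production this grouping is typically handled by the warehouse or by Spark/Glue partition columns; the point is that partition keys let queries prune irrelevant data instead of scanning the whole dataset.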

Requirements

● Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
● 5+ years of data engineering experience, with at least 3 years working on cloud-based solutions.
● Proven experience in the fintech domain is a significant advantage.
● Strong expertise in cloud platforms such as AWS (S3, Redshift, Glue), GCP (BigQuery, Dataflow), or Azure (Data Factory, Synapse Analytics).
● Proficiency in programming languages like Python, Java, or Scala.
● Solid knowledge of database systems and SQL, including NoSQL technologies (e.g., DynamoDB, MongoDB).
● Experience with big data tools such as Apache Spark, Hadoop, or Databricks.
● Strong familiarity with containerization (Docker) and orchestration (Kubernetes).
● Hands-on experience with Infrastructure-as-Code tools (Terraform, CloudFormation).
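To give a concrete flavor of the SQL and ELT skills the requirements call for, here is a minimal ELT-style transform using Python's built-in `sqlite3` module as a stand-in for a warehouse; the table and column names are invented for the example:

```python
import sqlite3

# In-memory database stands in for a warehouse; schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_payments (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO raw_payments VALUES (?, ?)",
    [(1, 10.0), (1, 25.0), (2, 5.0)],
)

# Transform step: loaded raw rows are aggregated in-database
# into a per-user summary table (the "T" of ELT happens in SQL).
conn.execute(
    """
    CREATE TABLE payment_totals AS
    SELECT user_id, SUM(amount) AS total
    FROM raw_payments
    GROUP BY user_id
    """
)

for user_id, total in conn.execute(
    "SELECT user_id, total FROM payment_totals ORDER BY user_id"
):
    print(user_id, total)
# → 1 35.0
# → 2 5.0
```

The same pattern scales up directly: in BigQuery, Redshift, or Snowflake the raw table is loaded first and the aggregation runs as a scheduled SQL transform inside the warehouse.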





