CCL114 Python Engineer

Remote (Asia)

Cybernetic Controls Limited

We are a trusted UK recruitment services provider, offering expert staffing solutions across various sectors. Explore Cybernetic Controls today.



CCL114 Python Engineer

Department: Engineering

Employment Type: Full Time

Location: Remote (Asia)


Description

Overview
CCL builds high-quality automation frameworks and bespoke software solutions for clients in the finance industry, with the aim of helping organizations improve their operational processes, reduce overall risk, and increase efficiency. Another key objective of CCL is to enable greater transparency for firms and regulators. CCL is also passionate about removing waste and making things simple. We want to help inspire a new generation of robotics and automation projects across industry, focused on delivering a highly skilled virtual workforce with cognitive and robotic capabilities. The CCL team is dedicated to helping firms improve operational productivity and overcome operational challenges. Read more on the Cybernetic Controls website.

Job Summary
We are seeking an experienced Python Engineer with strong skills in PySpark, Test-Driven Development (TDD), and modern delivery practices such as CI/CD and trunk-based development. The ideal candidate is passionate about building reliable, scalable data pipelines and delivering production-ready code with a fast, iterative cadence. Experience with automated testing across the software lifecycle is essential. 

Key Responsibilities

  • Design, implement, and maintain robust data pipelines using PySpark and Python.
  • Write automated unit, integration, performance, and end-to-end tests to ensure high code quality. 
  • Follow TDD principles to develop maintainable and testable solutions. 
  • Work within a CI/CD pipeline and a trunk-based development workflow to support frequent, reliable deployments. 
  • Optimize PySpark jobs for performance, reliability, and scalability. 
  • Collaborate cross-functionally to define and deliver business-critical data solutions.

Skills, Knowledge and Expertise

Required Qualifications 
  • 3+ years of Python programming experience, ideally in a data engineering or backend context. 
  • 2+ years of experience working with PySpark in production environments. 
  • Hands-on experience with TDD and a track record of writing comprehensive automated tests. 
  • Solid experience with CI/CD pipelines, Git, and trunk-based development workflows. 
  • Strong understanding of distributed data processing, Spark job optimization, and working with structured data formats (e.g., Parquet, JSON). 
  • Agile mindset and excellent collaboration skills. 
Preferred Qualifications
  • Experience with orchestration tools such as Apache Airflow or Luigi
  • Familiarity with AWS services, especially:
      • AWS Glue for serverless ETL pipelines
      • AWS Step Functions for workflow orchestration
  • Containerization experience using Docker and Kubernetes
  • Exposure to cloud-native data platforms like AWS EMR, Azure Databricks, or GCP Dataproc
  • Knowledge of data quality monitoring, observability, and error handling in data pipelines

Benefits

  • Competitive salary package 
  • Private healthcare contribution 
  • Annual pay review 
  • Regular team socials 
  • Working within a culture of innovation and collaboration 
  • Opportunity to play a key role in a pioneering growth company
  • Company laptop provided

Regions: Remote/Anywhere Asia/Pacific
Country: Philippines
