Data Engineer
Taguig, Philippines
Cloud Bridge
Join a UK-based tech-driven company at the forefront of financial data innovation. We're looking for a skilled Data Engineer in the Philippines.
Our client is seeking a capable leader and engineer with a passion for data to join their global team. As the foundational member of their new team in Manila, you will play a pivotal role in recruiting, training, and organising a high-performing engineering function. You and your team will be responsible for designing, building, and maintaining robust systems and infrastructure for processing and analysing large and complex datasets. You will collaborate closely with data scientists, analysts, product managers, and other stakeholders to understand requirements and develop scalable solutions. This is a unique opportunity to shape the future of their data infrastructure and processes, driving impactful projects from the ground up.
Key Responsibilities:
As a Data Engineer, you will play a crucial role in developing and maintaining data pipelines, data warehouses, and cloud-based systems. Your key responsibilities will include:
- Build out the new team in Manila: lead efforts to recruit and mentor new team members, fostering a collaborative, high-performing team environment.
- Create performant batch and streaming data pipelines capable of handling large volumes of both real-time market data and batch data (Apache Airflow, Apache Kafka, Apache Flink).
- Utilize tools like Snowflake to construct reliable and scalable data warehousing solutions.
- Employ Amazon Web Services (AWS) or Google Cloud Platform (GCP) to build robust and scalable cloud-based systems.
- Improve Continuous Integration/Continuous Deployment processes using tools like Jenkins and GitLab to automate development, testing, and deployment.
- Oversee the maintenance and monitoring of various data applications, pipelines, and databases.
- Participate in daily stand-ups and team meetings to coordinate and collaborate on development efforts.
- Interact with stakeholders to design and build analytical products and services.
Experience / Competences:
- 5+ years of work experience in data engineering or a similar role.
- Bachelor's degree in computer science, engineering, mathematics, or a related technical field.
- Experience hiring, mentoring, and leading junior engineers.
- Proficiency in Python with at least one data manipulation framework (e.g. Pandas, Dask, PySpark) and SQL; knowledge of other languages like Java, C#, or C++ is advantageous.
- Experience with REST APIs and building scalable API data platforms (e.g. FastAPI).
- Experience with AWS, Snowflake, Kubernetes, and Airflow.
- Experience with Prometheus, Grafana, and CloudWatch for monitoring and alerting.
- Comfortable with Linux and command line operations.
- Understanding of ETL processes and event streaming technologies (such as Kafka, Flink, etc.).
- Strong written and verbal communication skills, with the ability to effectively interact with both business and technical teams.
- Prior experience working with financial market data is beneficial.