Sr. Data Engineer (Platform)

Taipei

Gogolook

As the industry’s leading TrustTech company, Gogolook uses its extensive database and advanced AI technology to provide services in the fields of communication fraud prevention, financial technology, and SaaS risk services.

About us

Gogolook is a leading TrustTech company established in 2012. With "Build for Trust" as its core value, it aims to create an AI- and data-driven global anti-fraud network as well as Risk Management as a Service. From multi-communication to fintech and SaaS, Gogolook creates trustworthy empowerment through technology in various fields.
A founding member of the Global Anti-Scam Alliance (GASA), Gogolook has also teamed up with institutions such as the Taiwan National Police Agency Criminal Investigation Bureau, the Financial Supervisory Service of South Korea, the Royal Thai Police, the Fukuoka city government, the Philippines Cybercrime Investigation and Coordinating Center, and the Royal Malaysia Police and state governments to fight fraud and, ultimately, to build a trustworthy communication network with the largest number database in East Asia and Southeast Asia.
In July 2023, TrustTech provider Gogolook (stock code: 6902) was listed on the Taiwan Innovation Board (TIB). In May 2025, Gogolook officially submitted an application to the Taiwan Stock Exchange for reclassification to the General Board, positioning it to become the first new-economy software company to make this transition.
Why you should join Gogolook

1. Influential products: We build meaningful products that create value for society and defend against fraud.
2. Emphasis on self-growth: We encourage participation in technical community activities and subsidize tickets for conferences and workshops, so learning is continuously supported by the company.
3. Unleash your talent: We respect everyone's professional opinions, encourage team members to discuss ideas with one another, and make awesome products together.
4. Transparent culture: We share company information openly with everyone; every member can read it, give feedback, and take part in proposals.
As a Data Platform Engineer at Gogolook, you will spearhead the design, implementation, and evolution of our enterprise data infrastructure. You will collaborate with cross-functional teams to architect solutions that ensure data availability, reliability, and scalability to meet our organization's exponential data growth. If you're passionate about implementing cutting-edge DataOps practices and driving innovation through automated data workflows, we invite you to transform our data landscape as a Data Platform Engineer.

Responsibilities

  • Architect and engineer high-performance ETL/ELT pipelines capable of processing and transforming large data volumes with minimal latency.
  • Design and optimize scalable, cross-functional data architectures that establish resilient data pipelines serving multiple product teams, business intelligence platforms, and customer-facing applications.
  • Implement robust data governance frameworks and security protocols to safeguard data privacy while ensuring compliance with global regulatory requirements.
  • Develop automated CI/CD pipelines and testing frameworks for data processes to accelerate development cycles, minimize technical debt, and enhance overall data reliability.

Minimum qualifications

  • Minimum of 3 years of hands-on software development experience in languages such as Python, Scala, Java, or Golang.
  • Expertise in optimizing data flow and processing efficiency, with experience handling data throughput of up to 10 million records per hour.
  • Hands-on experience with cloud platforms like AWS, GCP, or Azure.
  • Proficiency in developing robust data pipelines, including data collection and ETL (Extract, Transform, Load) processes.
  • Experience designing and implementing various data platform components, including data ingestion, storage, data warehousing, and data orchestration.
  • Experience deploying infrastructure as code using CloudFormation or Terraform in AWS, GCP, or Azure.
  • Experience in continuous integration and continuous deployment (CI/CD) processes.

Preferred qualifications

  • Experience with Docker, Kubernetes, or Helm charts.
  • Experience with machine learning platforms such as Kubeflow, MLflow, or Vertex AI.
  • Experience with API design and implementation.
  • Experience using streaming tools such as Kafka, AWS Kinesis, Spark Streaming, or Flink.

Perks/benefits: Career development, conferences, startup environment

Region: Asia/Pacific
Country: Taiwan
