Data Architect - Part-Time (AU Media, Home-based)

Philippines - Remote

ConnectOS

Staff Leasing and Business Offshoring Solutions in Australia


Schedule: Part-time hours, within Australian business hours

What are we looking for?

Skills Required:

  • Minimum of 3 years of experience in a similar role. 
  • 3+ years of experience in Python programming 
  • 3+ years of experience with SQL 
  • Strong understanding of data modeling, design, ETL processes, and data warehousing concepts. 
  • Experience with data visualization tools (e.g., Tableau, Power BI, Looker Studio) and proficiency in dashboard development and reporting. 
  • Experience with CI/CD pipelines for infrastructure and data workflows. 
  • Familiarity with other cloud platforms (e.g., AWS, Azure) and multi-cloud environments. 
  • Knowledge of real-time data streaming architectures using tools like Pub/Sub or Kafka. 
  • Experience with machine learning infrastructure and deploying ML models on cloud platforms. 
  • Certification in Google Cloud (e.g., Professional Cloud Architect, Professional Data Engineer); Databricks certification is also advantageous. 

Nice to Have:

  • Familiarity with machine learning and AI concepts. 
  • Stakeholder management 
  • Knowledge of best practices in data security and compliance. 
  • Databricks 
  • Docker 
  • Kubernetes 
  • Problem-solving aptitude 
  • Agile methodologies experience 
  • Excellent communication skills with the ability to translate complex technical concepts into actionable insights for non-technical stakeholders. 

What will you do?

The Cloud Data Architect will design, implement, and maintain a scalable cloud data infrastructure on Google Cloud Platform (GCP). Key responsibilities include deploying tools such as Apache Airflow on Kubernetes, provisioning Databricks, and collaborating with data engineering and analytics teams to support data workflows while optimizing performance, security, and cost efficiency.

  • Design and Architect Cloud Infrastructure: Lead the design of GCP-based infrastructure to support data pipelines, machine learning, and analytics workloads, ensuring scalability and reliability. 
  • Install and Manage Apache Airflow on Kubernetes: Set up and maintain Airflow for orchestrating data workflows in a Kubernetes environment, ensuring seamless scheduling and execution of DAGs. 
  • Provision and Manage Databricks Environments: Set up Databricks clusters and integrations with GCP services, ensuring efficient use of resources for data processing and analytics. 
  • Implement Infrastructure as Code (IaC): Use tools like Terraform or Cloud Deployment Manager to automate the provisioning and management of cloud infrastructure. 
  • Optimize Cloud Data Services: Utilize GCP’s data products, such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage, to build scalable data architectures. 
  • Collaborate with Data Engineering Teams: Work closely with data engineers to design efficient data pipelines, optimize data workflows, and support data integration and processing at scale. 
  • Ensure Security and Compliance: Implement best practices for securing data infrastructure, including IAM policies, VPC configurations, and data encryption. 
  • Performance Monitoring and Optimization: Monitor and optimize the performance of cloud infrastructure, ensuring that it meets the needs of data pipelines and analytical workloads. 
  • Cost Management: Implement cost-effective solutions and continuously optimize cloud infrastructure to reduce operational costs without compromising performance. 
  • Documentation and Best Practices: Maintain detailed documentation of the cloud architecture and establish best practices for data infrastructure management. 

JOIN CONNECTOS NOW!

ConnectOS is certified as a Great Place to Work and is a top-rated Philippines employer of choice.

Our client is Australia’s largest independent publishing business, with over 100 brands reaching 4.3 million Australians every month and between 1,000 and 5,000 talented employees dedicated to serving their audiences, advertisers and customers. They operate world-class print centres with an established client base across both newsprint and heat-set products. Their network includes 14 daily titles, such as The Canberra Times, Newcastle Herald, The Courier in Ballarat and The Examiner in Launceston.

#ConnectOS #ConnectOSCareers #TeamConnectOS

 

Equal Employment Statement

Employment decisions at ConnectOS are made without regard to age, race, color, religion, gender, disability status, sexual orientation, gender identity or expression, genetic information, or marital status. ConnectOS ensures the full confidentiality of the data it processes.
