Data Engineer (Hybrid Cloud, Reporting, and Visualization)

Quezon City, Metro Manila, Philippines


Overview:

As a Data Engineer, you will play a critical role in transforming raw data into valuable insights that drive our business decisions. You will design, develop, and maintain data pipelines and infrastructure across hybrid cloud environments, while also building robust reporting and visualization solutions.


Responsibilities:

  1. Data Pipeline Development: Design, build, and maintain scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., databases, APIs, files) into data warehouses or data lakes.
  2. Hybrid Cloud Infrastructure: Manage and optimize data infrastructure across hybrid cloud environments, leveraging cloud-native services and on-premises resources.
  3. Data Quality: Ensure data quality through implementation of data validation, cleansing, and standardization processes.
  4. Reporting and Visualization: Develop interactive reports and dashboards using tools like Power BI, Tableau, or Looker to provide actionable insights to stakeholders.
  5. Data Governance: Adhere to data governance policies and procedures, including data security, privacy, and compliance regulations.
  6. Data Modeling: Design and implement data models (e.g., dimensional, normalized) to optimize data storage and retrieval.
  7. Automation: Automate data pipelines and processes using scripting languages (e.g., Python, SQL) and automation tools.
  8. Collaboration: Work closely with data analysts, data scientists, and business users to understand their requirements and deliver relevant data solutions.

Requirements:

  1. Experience: Proven experience as a Data Engineer or similar role with a focus on data pipelines, cloud infrastructure, and reporting.
  2. Technical Skills: Strong understanding of data engineering concepts, tools, and technologies (e.g., SQL, Python, ETL tools, cloud platforms).
  3. Cloud Platforms: Experience with major cloud platforms (e.g., AWS, Azure, GCP) and their data-related services (e.g., data warehouses such as BigQuery/Redshift, data lakes, data pipelines).
  4. Data Modeling: Proficiency in data modeling techniques (e.g., dimensional, normalized) and data warehouse design.
  5. Reporting and Visualization: Expertise in using reporting and visualization tools (e.g., Looker, Power BI, Tableau) to create interactive dashboards.
  6. Problem-Solving: Ability to troubleshoot complex data-related issues and find innovative solutions.
  7. Communication: Excellent communication skills to collaborate effectively with cross-functional teams.
  8. Certifications: Relevant certifications preferred (e.g., AWS Certified Data Engineer – Associate, Microsoft Azure Data Engineer Associate, Google Cloud Professional Data Engineer).


Additional Skills (Preferred):

  1. Experience with data warehousing and data lake technologies (e.g., Google BigQuery, Amazon Redshift, Snowflake, Databricks)
  2. Knowledge of data analytics and machine learning concepts
  3. Familiarity with data governance and compliance frameworks (e.g., HIPAA, GDPR, CCPA).



