GCP-Certified Data Engineer

New York, NY (Hybrid)

Kaizen Analytix

Rapid Delivery, Continuous Improvement | We understand that analytics is much more than just the math. It’s the story behind the data. We help businesses understand what happened and show how they can learn from these outcomes to predict and...

Job Title: GCP-Certified Data Engineer

Location: New York
Job Type: Hybrid (3 days in office)
Experience Level: Senior (5+ Years)

We are seeking a GCP-Certified Data Engineer with 5+ years of hands-on experience in cloud data engineering, ideally with direct experience migrating from Snowflake to BigQuery. This role is key to modernizing and scaling our data infrastructure, ensuring robust data ingestion, transformation, and performance optimization using Google Cloud’s native tools.

Key Responsibilities:

  • Design, develop, and optimize scalable ETL/ELT pipelines using Apache Beam (Dataflow) and Pub/Sub
  • Orchestrate complex data workflows using Cloud Composer (Apache Airflow)
  • Lead or support large-scale data migrations from AWS/Snowflake to BigQuery, including schema mapping and performance tuning
  • Enhance BigQuery performance through strategic use of partitioning, clustering, and effective resource management
  • Implement rigorous data quality frameworks and validation checks, and ensure pipeline observability and monitoring
  • Partner with analytics, product, and business teams to understand data needs and deliver timely, reliable data solutions

Required Skills and Experience:

  • GCP Certified (Professional Data Engineer preferred)
  • 5+ years of experience in cloud data engineering, including real-time and batch processing
  • Strong proficiency in Python and SQL
  • Deep understanding of BigQuery, Dataflow, Pub/Sub, and Cloud Storage
  • Experience with Cloud Composer (Airflow) for orchestration
  • Prior experience with ETL/ELT migrations, particularly from Snowflake to GCP
  • Proven track record in performance optimization and managing large datasets (structured & semi-structured)
  • Familiarity with Terraform or Infrastructure as Code (IaC)
  • Experience with CI/CD for data pipelines
  • Knowledge of AWS services and multi-cloud data strategies