DES - GCP Data Engineering Architect

Hyderabad, TS, India

Sutherland

Sutherland is a business process transformation company that rethinks and rebuilds business processes for the digital age.



Company Description

Sutherland is seeking an attentive and goal-oriented person to join us as a GCP Data Engineering Architect. We are a group of driven and hard-working individuals. If you are looking to build a fulfilling career and are confident you have the skills and experience to help us succeed, we want to work with you!

Job Description

Position Summary
Lead the GCP pillar within the Data Engineering CoE, establishing technical standards, best practices, and reusable accelerators for Google Cloud Platform data implementations. This role is critical for supporting high-value client engagements, including Verizon and other GCP-focused opportunities in our pipeline.
Key Responsibilities
Develop architecture patterns and implementation accelerators for GCP data platforms
Establish best practices for BigQuery, Dataflow, Dataproc, and other GCP data services
Support pre-sales activities for GCP-based opportunities, with particular focus on Verizon
Design migration pathways from legacy systems to GCP
Create technical documentation and playbooks for GCP implementations
Mentor junior team members on GCP best practices
Work with cloud-agnostic platforms (Databricks, Snowflake) in GCP environments
Build deep expertise in enterprise-scale GCP deployments
Collaborate with other pillar architects on cross-platform solutions
Represent the company's GCP capabilities in client engagements

Qualifications

10+ years of data engineering experience, including at least 5 years focused on GCP
Deep expertise in BigQuery, Dataflow, Dataproc, and Cloud Storage
Experience implementing enterprise-scale data lakes on GCP
Strong knowledge of data integration patterns and ETL/ELT frameworks in GCP
Experience with migration from legacy systems to GCP
Google Cloud Professional Data Engineer certification required

Experience developing reusable templates and accelerators
Strong coding skills in Python and SQL, with hands-on Spark experience
Experience with Terraform, Deployment Manager, or other infrastructure as code tools
Excellent communication skills with the ability to explain complex technical concepts


Additional Information

Impact Expectations
Build credibility with Verizon and other GCP-focused clients within 60 days
Create 2-3 GCP accelerators for common implementation patterns in the first 90 days
Support pre-sales activities for GCP opportunities within the current pipeline
Establish a technical implementation framework for GCP data platforms





Region: Asia/Pacific
Country: India
