Technical Architect - Google Cloud Platform (GCP) - Data Engineering

Pune, IN

Atos

We design digital solutions from the everyday to the mission critical — in artificial intelligence, hybrid cloud, infrastructure management, decarbonization and employee experience.


Atos is a global leader in digital transformation and the European number one in Cloud, Cybersecurity and High-Performance Computing. The Group provides end-to-end Orchestrated Hybrid Cloud, Big Data, Business Applications and Digital Workplace solutions. The Group is the Worldwide Information Technology Partner for the Olympic & Paralympic Games and operates under the brands Atos, Atos Syntel, and Unify. Atos is an SE (Societas Europaea), listed on the CAC40 Paris stock index.

 
The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers, employees, and members of society at large to live, work and develop sustainably in a safe and secure information space.

Role Overview:

  
The Technical Architect specializing in Google Cloud Platform (GCP) Data Engineering designs, implements, and optimizes scalable data solutions. The role requires extensive experience with GCP services, data architecture, and cloud integration, ensuring high performance, security, and reliability.

 

Responsibilities: 

  • Design and implement GCP-based data architectures to meet business requirements.
  • Develop and optimize data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
  • Establish best practices for data governance, security, and compliance.
  • Collaborate with cross-functional teams to integrate GCP solutions with existing systems.
  • Monitor and troubleshoot GCP environments for performance and reliability.
  • Stay updated on GCP advancements and industry trends to recommend innovative solutions.
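As an illustrative sketch only (not part of the role description), the pipeline work above typically involves validating and reshaping raw events before loading them into a warehouse such as BigQuery. The `clean_event` transform below is a hypothetical example using only the Python standard library; in practice the same logic would live inside a Dataflow/Beam step or a load job.

```python
import json
from datetime import datetime, timezone

def clean_event(raw: str):
    """Parse one raw JSON event and reshape it for loading into a
    warehouse table; return None for malformed or incomplete records."""
    try:
        event = json.loads(raw)
    except json.JSONDecodeError:
        return None
    if "user_id" not in event or "ts" not in event:
        return None
    return {
        "user_id": str(event["user_id"]),
        # Normalize epoch seconds to an ISO-8601 UTC timestamp string.
        "event_time": datetime.fromtimestamp(event["ts"], tz=timezone.utc).isoformat(),
        "action": event.get("action", "unknown"),
    }

def run_pipeline(raw_events):
    """Apply the transform and drop records that failed validation."""
    return [row for row in map(clean_event, raw_events) if row is not None]

raw = [
    '{"user_id": 42, "ts": 1700000000, "action": "click"}',
    'not json',                          # malformed -> dropped
    '{"user_id": 7, "ts": 1700000100}',  # missing action -> defaulted
]
rows = run_pipeline(raw)
```

The same dead-letter pattern (drop or divert invalid records rather than failing the whole load) applies whether the transform runs in Dataflow, Cloud Functions, or a batch load script.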

 

Key Technical Skills & Responsibilities 

  • 12+ years of overall experience with GCP and data warehousing concepts, including coding, reviewing, testing, and debugging.
  • Experience as an architect on GCP implementation or migration data projects.
  • Strong understanding of data lakes and data lake architectures, including best practices for storing, loading, and retrieving data.
  • Experience developing and maintaining pipelines on GCP, with an understanding of best practices for bringing on-premises data to the cloud: file loading, compression, parallelization of loads, optimization, etc.
  • Working knowledge of and/or experience with Google Data Studio, Looker, and other visualization tools.
  • Working knowledge of Hadoop and Python/Java is an added advantage.
  • Experience in designing and planning BI solutions; debugging, monitoring, and troubleshooting BI solutions; creating and deploying reports; and writing relational and multidimensional database queries.
  • Experience in a NoSQL environment is a plus.
  • Effective communication skills; pre-sales experience required.
  • Must be good with Python and PySpark for data pipeline building.
  • Must have experience working with streaming data sources and Kafka.
  • GCP services: Cloud Storage, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, Datastore/Firestore, Dataflow, Dataproc, Data Fusion, Dataprep, Pub/Sub, Data Studio, Looker, Data Catalog, Cloud Composer, Cloud Scheduler, Cloud Functions.
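The "file loading, compression, parallelization of loads" requirement above can be sketched with standard-library Python alone. The snippet below is a minimal, hypothetical example (the function names are illustrative): it splits newline-delimited records into chunks and gzip-compresses them in parallel, producing blobs of the kind one would upload to Cloud Storage and load into BigQuery.

```python
import gzip
from concurrent.futures import ThreadPoolExecutor

def compress_chunk(lines):
    """Gzip-compress one chunk of newline-delimited records."""
    payload = "\n".join(lines).encode("utf-8")
    return gzip.compress(payload)

def prepare_uploads(records, chunk_size=1000, workers=4):
    """Split records into fixed-size chunks and compress them in
    parallel. Each compressed blob would then be uploaded as one
    object to Cloud Storage and loaded into the warehouse."""
    chunks = [records[i:i + chunk_size]
              for i in range(0, len(records), chunk_size)]
    # zlib releases the GIL during compression, so threads give real
    # parallelism here without multiprocessing overhead.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(compress_chunk, chunks))

records = [f'{{"id": {i}}}' for i in range(2500)]
blobs = prepare_uploads(records, chunk_size=1000)  # 3 blobs: 1000+1000+500
```

Chunking keeps individual load jobs small and retryable, and compressing before upload reduces network transfer time, which is usually the bottleneck when moving on-premises data to the cloud.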

 

Eligibility Criteria: 

  • Bachelor’s degree in Computer Science, Data Engineering, or a related field.
  • Proven experience as a GCP Architect or similar role.
  • Proficiency in GCP services like BigQuery, Dataflow, and Cloud Composer.
  • Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
  • Excellent problem-solving and communication skills.
  • GCP certification (e.g., Professional Data Engineer, Professional Cloud Architect).
  • Experience with machine learning and AI integration in GCP environments.

Region: Asia/Pacific
Country: India
