Data Engineer - GCP

Pune, India


Company Description

T-Systems Information and Communication Technology India Private Limited (T-Systems ICT India Pvt. Ltd.) is a proud recipient of the prestigious Great Place To Work® Certification™. As a wholly owned subsidiary of T-Systems International GmbH, T-Systems India operates across Pune, Bangalore, and Nagpur, boasting a dedicated team of 3500+ employees providing services to group customers. T-Systems offers integrated end-to-end IT solutions, driving the digital transformation of companies in all industries, including automotive, manufacturing, logistics, and transportation, as well as healthcare and the public sector. T-Systems develops vertical, company-specific software solutions for these sectors. T-Systems International GmbH is an information technology and digital transformation company with a presence in over 20 countries and a revenue of more than €4 billion. T-Systems is a world-leading provider of digital services and has over 20 years of experience in the transformation and management of IT systems. As a subsidiary of Deutsche Telekom and a market leader in Germany, T-Systems International offers secure, integrated information technology and digital solutions from a single source.

Job Description

Role: Big Data Engineer

Experience: 6 to 8 years
Location: Pune

 

JD:

  • Minimum 5 to 9 years of experience in Big Data and related data technologies
  • Expert-level understanding of distributed computing principles
  • Expert-level knowledge of and experience with Apache Beam
  • Very good knowledge of Python and SQL (Java is a plus)
  • Hands-on streaming experience (Kafka, Pub/Sub, Dataflow, Flink, etc.)
  • Hands-on experience with GCP Dataflow and BigQuery is required
  • Strong understanding of SQL queries, joins, and relational schemas
  • Experience with NoSQL databases such as HBase, Cassandra, or MongoDB
  • Experience with data modeling
  • Knowledge of ETL techniques and frameworks
  • Experience designing and implementing Big Data / distributed solutions
  • Experience with Docker (Kubernetes is a plus)
  • Practitioner of Agile methodology and related tools (e.g., Jira)
  • Excellent communication and presentation skills
  • Hands-on experience with CI/CD and Git
  • Telco network exposure is preferred

Skill Set:

Mandatory Skills (Level):

  • SQL: Expert
  • Python: Expert
  • Apache Beam: Expert
  • GCP: Expert
  • Spark: Good
  • NoSQL DB: Expert
  • Oozie / Airflow / Cloud Composer: Good
  • CI/CD / Git: Expert
  • Data Modelling: Good
  • Stakeholder Management: Good
  • Dataflow / Apache Beam pipelines: Good
  • End-to-end project development: Good
  • Data Analysis: Expert
  • Communication: Good
  • Kafka / Spark Streaming: Expert

Good-to-Have Skills (Level):

  • Docker, Kubernetes: Good
  • Network Domain: Added Advantage
  • NATCO / Telco exposure: Good

