Senior Data Engineer

India

Singtel

The Singtel Group, Asia's leading communications group, provides a diverse range of services including fixed, mobile, data, internet, TV, infocomms technology (ICT) and digital solutions.



NCS is a leading technology services firm that operates across the Asia Pacific region in over 20 cities, providing consulting, digital services, technology solutions, and more. We believe in harnessing the power of technology to achieve extraordinary things, creating lasting value and impact for our communities, partners, and people. Our diverse workforce of 13,000 has delivered large-scale, mission-critical, and multi-platform projects for governments and enterprises in Singapore and the APAC region. 

 

We’re searching for a Senior Data Engineer to be part of our diverse team of talent here at NCS India Delivery Center (IDC)!

 

About NCS India

NCS India Delivery Center (IDC) is an IT services provider for global NCS. We help integrate the core expertise of the NCS global group of companies to deliver complete business solutions in the areas of IT Consulting, Managed Infrastructure Services, Application Services and Testing. IDC provides solutions to a variety of industry verticals such as Telecommunications, Education, Taxation, Banking, Travel, Energy & Power, E-Commerce, Finance and Healthcare. Our vision is to help our global clients across key industries implement the latest technology and accelerate their digital transformation journeys.

 

 

Your role is only the beginning!

As a Senior Data Engineer, you will design, build, and maintain batch and real-time data pipelines on on-premises and cloud platforms to ingest data from various source systems. After ingestion, you will be responsible for integrating the data assets into the data model and making the data ready for consumption by end-user use cases and downstream applications.

 

What we seek to accomplish together:

  • Enhance, optimize, and maintain existing data ingestion, transformation, and extraction pipelines and assets built for reporting and analytics on cloud (Azure + Databricks) and big data (Cloudera) platforms.
  • Work with the Product Owner to understand the priorities and OKRs for the quarter, and gather detailed requirements from initiative owners or the program sponsor for the Epics planned for delivery in the quarter.
  • Build new, optimized data pipelines and assets to meet end-user requirements. The data pipelines must adhere to all architecture, design, and engineering principles.
  • Design the data pipelines and assets to meet non-functional requirements (security, reliability, performance, maintainability, scalability, and usability). Most importantly, they should be designed to keep computing costs low on the cloud.
  • Perform data wrangling, data profiling, and data analysis for new datasets ingested from source systems and derived from existing datasets, using on-premises and cloud-native tools.
  • Develop a functional understanding of the data assets by working with various SMEs, and apply the transformation rules required to build the target data assets.
  • Coordinate with other teams on planning, design, governance, engineering, and release management processes, and ensure timely and accurate delivery of data and services.

 
A little about you:

  • Bachelor's Degree in Math, Statistics, Finance, Economics, Computer Science, Information Technology, Management of Information Systems or equivalent
  • At least 7 years of experience working in Data Engineering
  • At least 2 years of experience leading a team of Data Engineers
  • Experience in building fully automated end-to-end data pipelines using on-premises or cloud-based data platforms.
  • Cloud experience – Azure-based analytics/reporting pipeline
  • Hands-on experience with scheduling tools like Airflow, Control M, etc.
  • Hands-on delivery of solutions for Reporting and Analytics use cases.
  • Hands-on with advanced SQL on Data Warehouse, Big Data, and Databricks platforms
  • Experience in data profiling, source-target mappings, ETL development, SQL optimization, testing, and implementation.
  • Experience working on cloud data warehouses
  • Knowledgeable in Big Data tools like Spark (Python/Scala), Hive, Impala, Hue, and storage (e.g., HDFS, HBase)
  • Knowledge of working with Azure Databricks.
  • Knowledgeable in CI/CD processes – Bitbucket/GitHub, Jenkins, Nexus, etc.
  • Knowledgeable in managing structured and unstructured data types like JSON, XML, Avro
  • Track record of implementing databases, data access middleware, and high-volume batch and (near) real-time processing

 

 

We are driven by our AEIOU beliefs - Adventure, Excellence, Integrity, Ownership, and Unity - and we seek individuals who embody these values in both their professional and personal lives. We are committed to our Impact: Valuing our clients, Growing our people, and Creating our future.  

 

Together, we make the extraordinary happen.  

 

Learn more about us at ncs.co and visit our LinkedIn career site.



Category: Engineering Jobs


Perks/benefits: Career development

Region: Asia/Pacific
Country: India
