Data Engineer (Mainframe + Teradata) - Senior Associate - Operate

Bangalore (SDC) - Bagmane Tech Park, India

PwC

We are a community of solvers combining human ingenuity, experience and technology innovation to help organisations build trust and deliver sustained outcomes.


Line of Service

Advisory

Industry/Sector

Not Applicable

Specialism

Managed Services

Management Level

Senior Associate

Job Description & Summary

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients.

Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve continuously improving and optimising managed services processes, tools and services.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear; you ask questions and use these moments as opportunities to grow.

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:

  • Respond effectively to the diverse perspectives, needs, and feelings of others.
  • Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems.
  • Use critical thinking to break down complex concepts.
  • Understand the broader objectives of your project or role and how your work fits into the overall strategy.
  • Develop a deeper understanding of the business context and how it is changing.
  • Use reflection to develop self-awareness, enhance strengths and address development areas.
  • Interpret data to inform insights and recommendations.
  • Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Job Profile Name: Data Mainframe Engineer

Management Level: Senior Associate

Key Skills: MF + Teradata + DataStage, AWS, Databricks, SQL, Delta Live Tables, Delta Tables, Spark, Kafka, Spark Streaming, MQ, ETL

US

Job Title: Senior Associate

Key Skills: MF + Teradata + DataStage, AWS, Databricks, SQL, Delta Live Tables, Delta Tables, Spark, Kafka, Spark Streaming, MQ, ETL

Mainframe and Teradata DataStage Associate

Minimum Degree Required: Bachelor's degree in Computer Science/IT or relevant field

Degree Preferred: Master's degree in Computer Science/IT or relevant field

Minimum Years of Experience: 6–9 years

Certifications Required: NA

Job Summary: 

We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in developing, maintaining, and optimizing ETL processes using IBM DataStage, in managing and supporting data operations on Mainframe and Teradata platforms, and in modern data platforms such as AWS, Databricks, and Apache Spark. You will play a key role in designing, developing, and maintaining robust data pipelines and ETL processes, ensuring efficient data integration and transformation across our enterprise systems. 

Key Responsibilities: 

  • Design, develop, and optimize ETL processes using IBM DataStage, ensuring seamless data integration from diverse sources. 

  • Manage and maintain data operations on Mainframe and Teradata platforms, ensuring data accuracy and system performance. 

  • Develop and deploy scalable data solutions on AWS using Databricks, Spark, and Delta Lake technologies. 

  • Implement and manage streaming data solutions using Kafka and Spark Streaming for real-time data processing. 

  • Collaborate with cross-functional teams to gather data requirements and translate them into technical specifications and solutions. 

  • Develop and manage SQL queries and Delta Live Tables to support data analytics and reporting needs. 

  • Monitor and optimize the performance of ETL jobs, database queries, and data streams to ensure system reliability and efficiency. 

  • Troubleshoot and resolve issues across data platforms, ensuring minimal downtime and data integrity. 

  • Stay current with emerging technologies and industry best practices in data engineering and recommend improvements to existing systems. 

  • Document data processes, workflows, and system configurations to ensure knowledge sharing and operational efficiency. 
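The core of the ETL responsibilities above can be sketched in miniature. The example below is purely illustrative: the table and field names are hypothetical, it uses only the Python standard library (CSV in, SQLite staging table out), and a production pipeline in this role would of course run on DataStage, Spark, or Databricks rather than code like this. It shows the extract, transform, and load stages as separate, testable steps.

```python
import csv
import io
import sqlite3

# Hypothetical source feed standing in for a mainframe/Teradata extract.
SOURCE_CSV = """account_id,balance,currency
A001,1250.50,USD
A002,310.00,USD
A003,99.99,EUR
"""

def extract(text):
    """Extract: parse the raw CSV feed into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast types and keep only USD accounts."""
    return [
        (r["account_id"], float(r["balance"]))
        for r in rows
        if r["currency"] == "USD"
    ]

def load(records, conn):
    """Load: upsert the cleaned records into a staging table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging_accounts "
        "(account_id TEXT PRIMARY KEY, balance REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO staging_accounts VALUES (?, ?)", records
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(SOURCE_CSV)), conn)
    count = conn.execute("SELECT COUNT(*) FROM staging_accounts").fetchone()[0]
    print(count)  # two USD rows loaded
```

Keeping each stage as its own function mirrors how DataStage jobs separate source, transformation, and target stages, and makes each step independently testable.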

Qualifications: 

  • Bachelor’s degree in Computer Science, Information Technology, or a related field. 

  •  years of experience with Mainframe systems, including COBOL, JCL, and VSAM. 

  •  years of experience with Teradata, including advanced SQL development and performance tuning. 

  • Strong expertise in IBM DataStage, with a proven track record of designing and implementing complex ETL processes. 

  • Proficiency in AWS cloud services and experience with Databricks for building data pipelines. 

  • Solid understanding of Apache Spark, including Spark Streaming, and experience with Kafka for real-time data processing. 

  • Experience with Delta Live Tables and Delta Tables for managing and querying large datasets. 

  • Strong problem-solving skills and the ability to work effectively in a fast-paced, collaborative environment. 

  • Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders. 

Preferred Qualifications: 

  • Experience with additional cloud platforms, such as Azure or Google Cloud. 

  • Familiarity with other ETL tools and data integration frameworks. 

  • Experience with message queuing systems such as IBM MQ. 

  • Knowledge of agile development methodologies and version control systems. 

 

Education (if blank, degree and/or field of study not specified)

Degrees/Field of Study required:

Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills

Optional Skills

Accepting Feedback, Active Listening, Analytical Thinking, Automation, Automation Framework Design and Development, Automation Solutions, Budgetary Management, Business Process Automation (BPA), Business Process Improvement, Business Process Outsourcing, Business Transformation, Communication, Continuous Process Improvement, Creativity, Data Quality Automation, Deliverable Planning, Delivery Excellence, Design Automation, Digital Transformation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity {+ 24 more}

Desired Languages (If blank, desired languages not specified)

Travel Requirements

Up to 20%

Available for Work Visa Sponsorship?

No

Government Clearance Required?

No

Job Posting End Date

April 25, 2025



Region: Asia/Pacific
Country: India
