EY - GDS Consulting - AI and DATA - Insurance Domain ETL Testing - Manager
Kochi, KL, IN, 682313
EY
We offer services that help solve our clients' most difficult challenges. At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Job Description for Lead Data Engineer QA
Rank – Manager
Location – Bengaluru/Chennai/Kerala/Kolkata
Objectives and Purpose
- The Lead Data Engineer QA will be responsible for testing business intelligence and data warehouse solutions on both on-premises and cloud platforms. We are seeking an innovative and talented individual who can create test plans, protocols, and procedures for new software. In addition, you will support the build of large-scale data architectures that provide information to downstream systems and business users.
Your key responsibilities
- Design and execute manual and automated test cases, including validation of ELT data integrity and compliance.
- Support QA test case design, including identifying opportunities for test automation and developing scripts for automated processes as needed.
- Follow quality standards, conduct continuous monitoring and improvement, and manage test cases, test data, and defect processes using a risk-based approach as needed.
- Ensure all software releases meet regulatory standards, including requirements for validation, documentation, and traceability, with particular emphasis on data privacy and adherence to infrastructure security best practices.
- Proactively foster strong partnerships across teams and stakeholders to ensure alignment with quality requirements and address any challenges.
- Implement observability within testing processes to proactively identify, track, and resolve quality issues, contributing to sustained high-quality performance.
- Establish methodology to test the effectiveness of BI and DWH projects, ELT reports, integrations, and manual and automated functionality.
- Work closely with product team to monitor data quality, integrity, and security throughout the product lifecycle, implementing data quality checks to ensure accuracy, completeness, and consistency.
- Lead the evaluation, implementation and deployment of emerging tools and processes to improve productivity.
- Develop and maintain scalable data pipelines, in line with ETL principles, and build out new integrations, using AWS native technologies, to support continuing increases in data source, volume, and complexity.
- Define data requirements, gather and mine data, and validate the efficiency of data tools in the Big Data environment.
- Implement processes and systems to provide accurate and available data to key stakeholders, downstream systems, and business processes.
- Partner with Business Analytics and Solution Architects to develop technical architectures for strategic enterprise projects and initiatives.
- Coordinate with Data Scientists to understand data requirements, and design solutions that enable advanced analytics, machine learning, and predictive modelling.
- Mentor and coach junior Data Engineers on data standards and practices, promoting the values of learning and growth.
- Foster a culture of sharing, re-use, design for scale, stability, and operational efficiency of data and analytical solutions.
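The data quality checks mentioned above (accuracy, completeness, consistency) can be sketched in plain Python. This is a hypothetical illustration, not EY's actual tooling; the record fields (policy_id, premium, start_date, end_date) are invented insurance-domain examples.

```python
# Hypothetical row-level data quality checks of the kind a QA pipeline
# might run after an ETL load. All field names are illustrative.
from datetime import date

def check_record(rec: dict) -> list:
    """Return a list of data-quality issues found in one record."""
    issues = []
    # Completeness: required fields must be present and non-empty.
    for field in ("policy_id", "premium", "start_date", "end_date"):
        if not rec.get(field):
            issues.append("missing:" + field)
    # Accuracy: premium must be a positive number.
    premium = rec.get("premium")
    if premium is not None and (not isinstance(premium, (int, float)) or premium <= 0):
        issues.append("invalid:premium")
    # Consistency: the policy start date must precede the end date.
    start, end = rec.get("start_date"), rec.get("end_date")
    if isinstance(start, date) and isinstance(end, date) and start >= end:
        issues.append("inconsistent:dates")
    return issues

records = [
    {"policy_id": "P-1", "premium": 1200.0,
     "start_date": date(2024, 1, 1), "end_date": date(2024, 12, 31)},
    {"policy_id": "P-2", "premium": -50.0,
     "start_date": date(2024, 6, 1), "end_date": date(2024, 1, 1)},
]
report = {rec["policy_id"]: check_record(rec) for rec in records}
print(report)  # P-1 passes; P-2 fails the premium and date checks
```

In practice such checks would typically be expressed in a framework (e.g., PySpark jobs or a dedicated data-quality tool) and wired into the defect-management process rather than run as a loose script.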
To qualify for the role, you must have the following:
Essential skillsets
- Bachelor’s degree in Engineering, Computer Science, Data Warehousing, or related field
- 10+ years of experience in software development, data science, data engineering, ETL, and analytics reporting development
- Understanding of project and test lifecycle, including exposure to CMMi and process improvement frameworks
- Experience designing, building, implementing, and maintaining data and system integrations using dimensional data modelling and development and optimization of ETL pipelines
- Proven track record of designing and implementing complex data solutions
- Understanding of business intelligence concepts, ETL processing, dashboards, and analytics
- Testing experience in Data Quality, ETL, OLAP, or Reports
- Knowledge in Data Transformation Projects, including database design concepts & white box testing
- Experience in cloud-based data solutions (AWS/Azure)
- Demonstrated understanding and experience using:
- Cloud-based data solutions (AWS, IICS, Databricks)
- GxP, regulatory, and risk compliance
- Cloud AWS infrastructure testing
- Python data processing
- SQL scripting
- Test processes (e.g., ELT testing, SDLC)
- Power BI/Tableau
- Scripting (e.g., Perl and shell)
- Data Engineering Programming Languages (i.e., Python)
- Distributed Data Technologies (e.g., Pyspark)
- Test Management and Defect Management tools (e.g., HP ALM)
- Cloud platform deployment and tools (e.g., Kubernetes)
- DevOps and continuous integration
- Databricks/ETL
- Understanding of database architecture and administration
- Utilizes the principles of continuous integration and delivery to automate the deployment of code changes to elevated environments, fostering enhanced code quality, test coverage, and automation of resilient test cases
- Possesses high proficiency in programming languages and services (e.g., SQL, Python, PySpark, AWS services) to design, maintain, and optimize data architectures/pipelines that fit business goals
- Strong organizational skills with the ability to manage multiple projects simultaneously and operate as a leading member across globally distributed teams to deliver high-quality services and solutions
- Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners
- Strong problem solving and troubleshooting skills
- Ability to work in a fast-paced environment and adapt to changing business priorities
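The ETL and data-transformation testing experience asked for above often comes down to source-to-target reconciliation. A minimal sketch, assuming an in-memory SQLite database and invented table names (src_claims, tgt_claims), of the kind of automated check such a suite might run:

```python
# Hypothetical source-to-target reconciliation test: compare row counts
# and a column aggregate between a staging table and its loaded target.
# Table and column names are illustrative, not from any real system.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_claims (claim_id INTEGER, amount REAL);
    CREATE TABLE tgt_claims (claim_id INTEGER, amount REAL);
    INSERT INTO src_claims VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO tgt_claims SELECT * FROM src_claims;  -- simulate the load
""")

def reconcile(conn, src, tgt, col):
    """Return (counts_match, sums_match) between source and target tables."""
    query = "SELECT COUNT(*), ROUND(SUM({c}), 2) FROM {t}"
    src_count, src_sum = conn.execute(query.format(c=col, t=src)).fetchone()
    tgt_count, tgt_sum = conn.execute(query.format(c=col, t=tgt)).fetchone()
    return src_count == tgt_count, src_sum == tgt_sum

counts_ok, sums_ok = reconcile(conn, "src_claims", "tgt_claims", "amount")
assert counts_ok and sums_ok  # the simulated load preserved all rows
```

In a real suite these assertions would live in a test framework (e.g., pytest) and run against the actual warehouse, with failures logged to a defect-management tool such as HP ALM.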
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.