TechOps-DE-CloudOps AMS-AWS DataOps Engineer-Senior
Noida, UP, IN, 201301
EY
We offer services that help solve our clients' most difficult challenges. At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
The opportunity
We are looking for a seasoned, strategically minded Senior AWS DataOps Engineer to join our growing global data team. In this role, you will take ownership of critical data workflows and work closely with cross-functional teams to support, optimize, and scale cloud-based data pipelines. You will bring leadership to data operations, contribute to architectural decisions, and help ensure the integrity, availability, and performance of our AWS data infrastructure.
Your key responsibilities
- Lead the design, monitoring, and optimization of AWS-based data pipelines using services like AWS Glue, EMR, Lambda, and Amazon S3.
- Oversee and enhance complex ETL workflows involving IICS (Informatica Intelligent Cloud Services), Databricks, and native AWS tools.
- Collaborate with data engineering and analytics teams to streamline ingestion into Amazon Redshift and lead data validation strategies.
- Manage job orchestration using Apache Airflow, AWS Data Pipeline, or equivalent tools, ensuring SLA adherence (see the illustrative Airflow sketch after this list).
- Guide SQL query optimization across Redshift and other AWS databases for analytics and operational use cases.
- Perform root cause analysis of critical failures, mentor junior staff on best practices, and implement preventive measures.
- Lead deployment activities through robust CI/CD pipelines, applying DevOps principles and automation.
- Own the creation and governance of SOPs, runbooks, and technical documentation for data operations.
- Partner with vendors, security, and infrastructure teams to ensure compliance, scalability, and cost-effective architecture.
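As a brief illustration of the orchestration responsibility above, the sketch below shows a minimal Apache Airflow DAG that triggers an AWS Glue job on a daily schedule with retries and an SLA. It is a hedged example only; the DAG id, schedule, Glue job name, and region are hypothetical placeholders, not details from this posting.

```python
# A minimal, illustrative Airflow DAG: trigger an AWS Glue job on a daily
# schedule with retries and an SLA. All names and values are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

default_args = {
    "owner": "dataops",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
    "sla": timedelta(hours=1),  # flag runs that exceed the agreed SLA window
}

with DAG(
    dag_id="daily_orders_ingestion",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    # Submit the Glue ETL job and wait for it to finish before marking success.
    run_glue_etl = GlueJobOperator(
        task_id="run_glue_etl",
        job_name="orders_etl_job",        # hypothetical Glue job name
        region_name="us-east-1",
    )
```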
Skills and attributes for success
- Expertise in AWS data services and ability to lead architectural discussions.
- Analytical thinker with the ability to design and optimize end-to-end data workflows.
- Excellent debugging and incident resolution skills in large-scale data environments.
- Strong leadership and mentoring capabilities, with clear communication across business and technical teams.
- A growth mindset with a passion for building reliable, scalable data systems.
- Proven ability to manage priorities and navigate ambiguity in a fast-paced environment.
To qualify for the role, you must have
- 5–8 years of experience in DataOps, Data Engineering, or related roles.
- Strong hands-on expertise in Databricks (see the illustrative PySpark sketch after this list).
- Deep understanding of ETL pipelines and modern data integration patterns.
- Proven experience with Amazon S3, EMR, Glue, Lambda, and Amazon Redshift in production environments.
- Experience in Airflow or AWS Data Pipeline for orchestration and scheduling.
- Advanced knowledge of IICS or similar ETL tools for data transformation and automation.
- SQL skills with emphasis on performance tuning, complex joins, and window functions.
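As a brief, hedged illustration of the Databricks and window-function skills listed above, the PySpark sketch below deduplicates order events by keeping only the latest record per key, the kind of transformation commonly expressed either in Spark or as a SQL window function. The S3 paths and column names are hypothetical placeholders, not details from this posting.

```python
# A minimal, illustrative PySpark snippet of the kind of Databricks work
# referenced above: deduplicating records with a window function.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_dedup").getOrCreate()

# Read raw order events landed in S3 (hypothetical path).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Keep only the latest event per order_id using ROW_NUMBER over a window.
latest_per_order = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())

deduped = (
    orders.withColumn("rn", F.row_number().over(latest_per_order))
          .filter(F.col("rn") == 1)
          .drop("rn")
)

# Write the curated result back to S3 for downstream loading (e.g. a Redshift COPY).
deduped.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")
```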
Technologies and Tools
Must haves
- Proficient in Amazon S3, EMR (Elastic MapReduce), AWS Glue, and Lambda
- Expert in Databricks – ability to develop, optimize, and troubleshoot advanced notebooks
- Strong experience with Amazon Redshift for scalable data warehousing and analytics
- Solid understanding of orchestration tools like Apache Airflow or AWS Data Pipeline
- Hands-on with IICS (Informatica Intelligent Cloud Services) or comparable ETL platforms
Good to have
- Exposure to Power BI or Tableau for data visualization
- Familiarity with CDI, Informatica, or other enterprise-grade data integration platforms
- Understanding of DevOps and CI/CD automation tools for data engineering workflows
- SQL familiarity across large datasets and distributed databases
What we look for
- Enthusiastic learners with a passion for data operations practices and tooling.
- Problem solvers with a proactive approach to troubleshooting and optimization.
- Team players who can collaborate effectively in a remote or hybrid work environment.
- Detail-oriented professionals with strong documentation skills.
What we offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.