AWS Data Engineer

Hyderabad, India

DATAECONOMY

Enabling businesses to monetize data at speed with cutting-edge technology services and solutions: Big Data management, cloud enablement, data science, and more.



We are seeking a highly skilled and experienced Senior Data Engineer to lead the end-to-end development of complex models for compliance and supervision. The ideal candidate will have deep expertise in cloud-based infrastructure, ETL pipeline development, and the financial domain, with a strong focus on building robust, scalable, and efficient solutions.

 

Key Responsibilities:

• Model Development: Lead the development of advanced models using AWS services such as EMR, Glue, and Glue Notebooks.

• Cloud Infrastructure: Design, build, and optimize scalable cloud infrastructure solutions.

• ETL Pipeline Development: Create, manage, and optimize ETL pipelines using PySpark for large-scale data processing.

• CI/CD Implementation: Build and maintain CI/CD pipelines for deploying and maintaining cloud-based applications.

• Data Analysis: Perform detailed data analysis and deliver actionable insights to stakeholders.

• Collaboration: Work closely with cross-functional teams to understand requirements, present solutions, and ensure alignment with business goals.

• Agile Methodology: Operate effectively in agile or hybrid agile environments, delivering high-quality results within tight deadlines.

• Framework Development: Enhance and expand existing frameworks and capabilities to support evolving business needs.

• Documentation and Communication: Create clear documentation and present technical solutions to both technical and non-technical audiences.

 



Requirements

Required Qualifications:

• 5+ years of experience with Python programming.

• 5+ years of experience with cloud infrastructure, particularly AWS.

• 3+ years of experience with PySpark, including use with EMR or Glue Notebooks.

• 3+ years of experience with Apache Airflow for workflow orchestration.

• Solid experience with data analysis in fast-paced environments.

• A strong understanding of capital markets and financial systems, or prior experience in the financial domain, is a must.

• Proficiency with cloud-native technologies and frameworks.

• Familiarity with CI/CD practices and tools such as Jenkins, GitLab CI/CD, or AWS CodePipeline.

• Experience with notebooks (e.g., Jupyter, Glue Notebooks) for interactive development.

• Excellent problem-solving skills and the ability to handle complex technical challenges.

• Strong communication and interpersonal skills for collaborating across teams and presenting solutions to diverse audiences.

• Ability to thrive in a fast-paced, dynamic environment.

Benefits

Standard Company Benefits