Sr. Data Engineer

Bangalore, Karnataka, IN, 560100


 

Work Your Magic with us!  

 

Ready to explore, break barriers, and discover more? We know you’ve got big plans – so do we! Our colleagues across the globe love innovating with science and technology to enrich people’s lives with our solutions in Healthcare, Life Science, and Electronics. Together, we dream big and are passionate about caring for our rich mix of people, customers, patients, and planet. That's why we are always looking for curious minds that see themselves imagining the unimaginable with us.  

 

United As One for Patients, our purpose in Healthcare is to help create, improve and prolong lives. We develop medicines, intelligent devices and innovative technologies in therapeutic areas such as Oncology, Neurology and Fertility. Our teams work together across 6 continents with passion and relentless curiosity to help patients at every stage of life. Joining our Healthcare team means becoming part of a diverse, inclusive and flexible working culture, presenting great opportunities for personal development and career advancement across the globe.

 

Job Summary:

We are seeking a highly skilled AWS, Snowflake, and AI Engineer to join our dynamic team. In this role, you will be responsible for designing, implementing, and optimizing data solutions in the cloud using AWS and Snowflake. Additionally, you will develop and integrate AI models to drive innovation and efficiency across various business processes. The ideal candidate is a strategic thinker with hands-on experience in cloud platforms, data engineering, and AI technologies.

Key Responsibilities:

  • AWS Architecture & Management:
    • Design, deploy, and manage cloud infrastructure on AWS.
    • Implement best practices for security, scalability, and cost optimization.
    • Monitor and troubleshoot AWS environments to ensure optimal performance.
  • Data Engineering with Snowflake:
    • Design and implement data pipelines using Snowflake.
    • Develop and maintain ETL processes for data ingestion and transformation.
    • Optimize Snowflake database performance for data analytics and reporting.
    • Integrate Snowflake with other AWS services and third-party tools.
  • AI Model Development & Integration:
    • Design, develop, and deploy AI models to solve business problems.
    • Integrate AI solutions into existing applications and workflows.
    • Collaborate with data scientists and business stakeholders to ensure AI models align with business objectives.
    • Continuously evaluate and improve AI model performance.
  • Collaboration & Stakeholder Management:
    • Work closely with cross-functional teams, including data scientists, analysts, and business units, to understand and address data and AI needs.
    • Provide technical leadership and mentorship to junior engineers.
    • Communicate complex technical concepts to non-technical stakeholders.
  • Documentation & Compliance:
    • Maintain comprehensive documentation of cloud architecture, data pipelines, and AI models.
    • Ensure compliance with industry regulations and company policies related to data security and privacy.

Qualifications:

  • Experience:
    • 7+ years of experience in cloud engineering, with a focus on AWS.
    • 7+ years of experience working with Snowflake and data engineering.
    • 5+ years of experience in AI/ML model development and deployment.
    • 5+ years of experience with Python and PySpark.
    • Experience managing Informatica Cloud (IICS) is a plus.
    • Strong expertise in writing complex SQL queries and stored procedures.
    • Experience coding, testing, implementing, and maintaining medium- to highly complex ETL code/solutions and scripts to build and maintain automated processes.
    • Good knowledge of Unix shell scripting.
    • Good understanding of and experience with standard software development methodologies.
    • Excellent communication skills.
  • Technical Skills:
    • Strong proficiency in AWS services (e.g., EC2, S3, Lambda, RDS, Redshift).
    • Expertise in Snowflake data warehousing, including SnowSQL and Snowpipe.
    • Proficiency in AI/ML frameworks.
    • Strong programming skills in Python, PySpark, and SQL.
    • Experience with CI/CD pipelines, containerization (Docker, Kubernetes), and version control (Git/Azure DevOps).
    • Knowledge of data governance, security, and compliance best practices.
  • Soft Skills:
    • Excellent problem-solving and analytical skills.
    • Strong communication and interpersonal skills.
    • Ability to work independently and as part of a team.
    • Strong organizational skills and attention to detail.


What we offer: We are curious minds that come from a broad range of backgrounds, perspectives, and life experiences. We celebrate all dimensions of diversity and believe that it drives excellence and innovation, strengthening our ability to lead in science and technology. We are committed to creating access and opportunities for everyone to develop and grow at their own pace. Join us in building a culture of inclusion and belonging that impacts millions and empowers everyone to work their magic and champion human progress!
 
Apply now and become a part of our diverse team!

 
