Senior Data Operations Engineer
IND-MH Pune
Medtronic
Medtronic is a global leader in healthcare technology and the related services and solutions. We work with our partners to jointly take on the immense challenges of healthcare... At Medtronic you can begin a life-long career of exploration and innovation, while helping champion healthcare access and equity for all. You'll lead with purpose, breaking down barriers to innovation in a more connected, compassionate world.
A Day in the Life
Our Global Diabetes Capability Center in Pune is expanding to serve more people living with diabetes globally. Our state-of-the-art facility is dedicated to transforming diabetes management through innovative solutions and technologies that reduce the burden of living with diabetes. We are a mission-driven leader in medical technology and solutions with a legacy of integrity and innovation. Join our new Minimed India Hub as a Senior Data Operations Engineer.
Responsibilities may include the following and other duties may be assigned:
Manage large projects or processes that span multiple collaborative teams, both within and beyond Digital Technology.
Develop and maintain robust, scalable data pipelines using GitHub, AWS, Databricks, Azure, etc.
Develop and optimize ETL/ELT processes to ensure efficient data flow between various systems and platforms.
Implement CI/CD pipelines for data workflows, ensuring seamless integration and deployment of data solutions.
Automate data quality checks, monitoring, and alerting systems to maintain data integrity and reliability (a minimal sketch of such a check follows this list).
Collaborate with Data Scientists, Data Engineers, and other stakeholders to understand data requirements and implement appropriate solutions.
Optimize data storage and processing for cost-effectiveness and performance across cloud platforms.
Implement and maintain data security and compliance measures across all platforms.
Implement and manage automated workflows using GitHub Actions for code integration, testing, and deployment of data pipelines and related tools.
Design and maintain GitLab CI/CD pipelines to automate build, test, and deployment processes for data engineering projects, ensuring consistency across environments.
Operate autonomously to define, describe, diagram, and document the role and interaction of the high-level technological and human components that combine to provide cost-effective and innovative solutions to meet evolving business needs.
Promote, guide, and govern good architectural practice through the application of well-defined, proven technology and human-interaction patterns and through architecture mentorship.
Demonstrate a strong aptitude for problem solving.
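To illustrate the kind of automation described in the responsibilities above, here is a minimal sketch of an automated data quality check with SNS alerting. It is an illustration only, not part of this posting: the dataset path, column names, and SNS topic ARN are hypothetical, and a production pipeline would typically run such a check from a scheduled CI/CD job.

```python
# Minimal sketch of an automated data quality check with alerting.
# All resource names (dataset path, columns, SNS topic) are illustrative placeholders.
import boto3
import pandas as pd

S3_PATH = "s3://example-bucket/curated/orders.parquet"  # hypothetical dataset (S3 paths need s3fs/pyarrow)
SNS_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:data-quality-alerts"  # hypothetical alert topic


def run_quality_checks() -> list[str]:
    """Return human-readable failure messages; an empty list means all checks passed."""
    df = pd.read_parquet(S3_PATH)
    failures = []
    if df.empty:
        failures.append("Dataset is empty")
    if df["order_id"].isnull().any():
        failures.append("Null values found in order_id")
    if df["order_id"].duplicated().any():
        failures.append("Duplicate order_id values found")
    return failures


def alert(failures: list[str]) -> None:
    """Publish a summary of failed checks to an SNS topic for on-call alerting."""
    boto3.client("sns").publish(
        TopicArn=SNS_TOPIC_ARN,
        Subject="Data quality check failed",
        Message="\n".join(failures),
    )


if __name__ == "__main__":
    issues = run_quality_checks()
    if issues:
        alert(issues)
        raise SystemExit(1)  # non-zero exit fails the CI job that runs this check
```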
Required Knowledge and Experience:
6+ years of experience in DevOps or DataOps roles, with a focus on data pipeline automation.
Strong proficiency in at least one scripting language (e.g., Python, Bash) and one infrastructure-as-code tool (e.g., Terraform, CloudFormation).
Extensive experience with AWS services such as S3, RDS, EC2, Lambda, SNS, SQS, Glue, Redshift, Kinesis, MSK, and CloudWatch.
Experience with container orchestration platforms (e.g., Kubernetes, ECS) and CI/CD tools (e.g., GitHub, GitHub Actions, GitLab, GitLab CI).
A few years of experience working with Databricks, including Delta Lake, Spark, and MLflow (a minimal Delta Lake sketch follows this list).
Familiarity with data governance and compliance requirements (e.g., GDPR)
Excellent problem-solving skills and ability to optimize complex data workflows.
Experience with real-time data streaming technologies (e.g., Kafka, Kinesis); a minimal Kinesis example follows this list.
Knowledge of machine learning operations (MLOps) and experience integrating ML models into data pipelines.
Familiarity with data visualization tools (e.g., Power BI) and their integration with data platforms.
Certifications in relevant cloud platforms (AWS Certified DevOps Engineer, Azure DevOps Engineer, Databricks Certified Engineer).
Experience with graph databases and data lineage tools.
Contribution to open-source projects or data engineering communities.
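As a small illustration of the Databricks and Delta Lake experience listed above, the following PySpark sketch performs an idempotent upsert (merge) into a Delta table. The table location, staging path, and join key are hypothetical placeholders rather than details from this role.

```python
# Minimal sketch of an idempotent upsert (merge) into a Delta table with PySpark.
# Paths and the join key are illustrative placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

TARGET_PATH = "s3://example-bucket/delta/devices"  # hypothetical Delta table location

# Incoming batch of changed rows, e.g. produced by an upstream ETL step.
updates = spark.read.parquet("s3://example-bucket/staging/devices/")

target = DeltaTable.forPath(spark, TARGET_PATH)
(
    target.alias("t")
    .merge(updates.alias("s"), "t.device_id = s.device_id")
    .whenMatchedUpdateAll()      # update existing rows in place
    .whenNotMatchedInsertAll()   # insert rows seen for the first time
    .execute()
)
```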
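For the real-time streaming requirement, here is a minimal sketch of publishing JSON records to an Amazon Kinesis data stream with boto3. The stream name, region, and record shape are invented for illustration.

```python
# Minimal sketch of publishing JSON records to an Amazon Kinesis data stream.
# The stream name, region, and record fields are illustrative placeholders.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")


def publish_reading(device_id: str, glucose_mgdl: float) -> None:
    """Send a single sensor reading to a (hypothetical) ingestion stream."""
    kinesis.put_record(
        StreamName="device-readings",  # hypothetical stream
        Data=json.dumps({"device_id": device_id, "glucose_mgdl": glucose_mgdl}).encode("utf-8"),
        PartitionKey=device_id,  # keeps one device's events ordered within a shard
    )


if __name__ == "__main__":
    publish_reading("dev-001", 112.0)
```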
Physical Job Requirements
The above statements are intended to describe the general nature and level of work being performed by employees assigned to this position, but they are not an exhaustive list of all the required responsibilities and skills of this position.
Medtronic offers a competitive salary and flexible benefits package.
A commitment to our employees lives at the core of our values. We recognize their contributions. They share in the success they help to create. We offer a wide range of benefits, resources, and competitive compensation plans designed to support you at every career and life stage.
We lead global healthcare technology and boldly attack the most challenging health problems facing humanity by searching out and finding solutions.
Our Mission — to alleviate pain, restore health, and extend life — unites a global team of 90,000+ passionate people.
We are engineers at heart, putting ambitious ideas to work to generate real solutions for real people. From the R&D lab, to the factory floor, to the conference room, every one of us experiments, creates, builds, improves, and solves. We have the talent, diverse perspectives, and guts to engineer the extraordinary.
Learn more about our business, mission, and our commitment to diversity here
Tags: Architecture AWS Azure CloudFormation Databricks Data governance DataOps Data pipelines Data quality Data visualization DevOps EC2 ECS ELT Engineering ETL GitHub GitLab Healthcare technology Kafka Kinesis Kubernetes Lambda Machine Learning MLFlow ML models MLOps Open Source Pipelines Power BI Python R R&D Redshift Security Spark Terraform Testing
Perks/benefits: Career development Competitive pay Equity / stock options Flex hours Health care