Python Data Integration Engineer

Bengaluru, India


Company Description

At Intuitive, we are united behind our mission: we believe that minimally invasive care is life-enhancing care. Through ingenuity and intelligent technology, we expand the potential of physicians to heal without constraints.

As a pioneer and market leader in robotic-assisted surgery, we strive to foster an inclusive and diverse team, committed to making a difference. For more than 25 years, we have worked with hospitals and care teams around the world to help solve some of healthcare's hardest challenges and advance what is possible.

Intuitive has been built by the efforts of great people from diverse backgrounds. We believe great ideas can come from anywhere. We strive to foster an inclusive culture built around diversity of thought and mutual respect. We lead with inclusion and empower our team members to do their best work as their most authentic selves.

Passionate people who want to make a difference drive our culture. Our team members are grounded in integrity, have a strong capacity to learn, the energy to get things done, and bring diverse, real world experiences to help us think in new ways. We actively invest in our team members to support their long-term growth so they can continue to advance our mission and achieve their highest potential.

Join a team committed to taking big leaps forward for a global community of healthcare professionals and their patients. Together, let's advance the world of minimally invasive care.

Job Description

Are you a seasoned data engineer with a passion for hands-on technical work? Do you thrive in an environment that values innovation, collaboration, and cutting-edge technologies? We are seeking an experienced Data Integration Engineer to spearhead our data integration strategies and initiatives. The ideal candidate will possess deep technical expertise in Python programming, Snowflake data warehousing, AWS cloud services, Kubernetes (EKS), CI/CD methodologies, Apache Airflow, dbt, Kafka for real-time data streaming, and API development. This role is pivotal in driving the architecture, development, and maintenance of scalable, efficient data pipelines and integrations that support our analytics and business intelligence platforms.

 

Role and Responsibilities: 

As the Data Integration Engineer, you will play a key role in shaping the future of our data integration engineering initiatives while remaining actively involved in the technical aspects of projects. Your responsibilities will include:

  • Hands-On Contribution: Stay hands-on with data integration engineering tasks, including data pipeline development, extract-load (EL) processes, and data integration. Be the go-to expert for complex technical challenges.

  • Integrations Architecture: Design and implement scalable and efficient data integration architectures that meet business requirements. Ensure data integrity, quality, scalability, and security throughout the pipeline. 

  • Tool Proficiency: Leverage your expertise in Snowflake, SQL, Apache Airflow, AWS, APIs, CI/CD, dbt, and Python to architect, develop, and optimize data solutions. Stay current with emerging technologies and industry best practices.

  • Data Quality: Monitor data quality and integrity, implementing data governance policies as needed. 

  • Cross-Functional Collaboration: Collaborate with data science, data warehousing, analytics, and other cross-functional teams to understand data requirements and deliver actionable insights. 

  • Performance Optimization: Identify and address performance bottlenecks within the data infrastructure. Optimize data pipelines for speed, reliability, and efficiency. 

  • Project Management: Oversee end-to-end project delivery, from requirements gathering to implementation. Ensure projects are delivered on time and within scope. 

 

 

Qualifications

Bachelor's degree in Computer Science, Engineering, or a related field. An advanced degree is a plus.

4 years of hands-on experience in Python programming.

3 years of experience in data engineering, including strong SQL skills.

Preferred Skills: 

Familiarity with cloud platforms such as AWS or Azure.

Demonstrated experience in designing and developing RESTful APIs. 

Experience with Snowflake, AWS, Kubernetes (EKS), CI/CD practices, Apache Airflow, and dbt.

Experience in full-stack development.

Excellent analytical, problem-solving, and decision-making abilities. 

Strong communication skills, with the ability to articulate technical concepts to non-technical stakeholders. 

A collaborative mindset, with a focus on team success. 

If you are a results-oriented Data Integration Engineer with a strong background in Python programming and SQL, we encourage you to apply. Join us in building data solutions that drive business success and innovation.

 

 

Additional Information

Intuitive is an Equal Employment Opportunity Employer. We provide equal employment opportunities to all qualified applicants and employees, and prohibit discrimination and harassment of any type, without regard to race, sex, pregnancy, sexual orientation, gender identity, national origin, color, age, religion, protected veteran or disability status, genetic information or any other status protected under federal, state, or local applicable laws.

We will consider for employment qualified applicants with arrest and conviction records in accordance with fair chance laws.
