Sr Data Analytics & Full Stack Engineer
Bengaluru, India
Intuitive
Company Description
At Intuitive, we are united behind our mission: we believe that minimally invasive care is life-enhancing care. Through ingenuity and intelligent technology, we expand the potential of physicians to heal without constraints.
As a pioneer and market leader in robotic-assisted surgery, we strive to foster an inclusive and diverse team, committed to making a difference. For more than 25 years, we have worked with hospitals and care teams around the world to help solve some of healthcare's hardest challenges and advance what is possible.
Intuitive has been built by the efforts of great people from diverse backgrounds. We believe great ideas can come from anywhere. We strive to foster an inclusive culture built around diversity of thought and mutual respect. We lead with inclusion and empower our team members to do their best work as their most authentic selves.
Passionate people who want to make a difference drive our culture. Our team members are grounded in integrity, have a strong capacity to learn, the energy to get things done, and bring diverse, real world experiences to help us think in new ways. We actively invest in our team members to support their long-term growth so they can continue to advance our mission and achieve their highest potential.
Join a team committed to taking big leaps forward for a global community of healthcare professionals and their patients. Together, let's advance the world of minimally invasive care.
Job Description
Primary Function of Position:
We are seeking a Sr Data Analytics & Full Stack Engineer to lead the development of interactive, data-driven applications using Streamlit, integrated with modern data platforms like Snowflake and dbt. This role blends deep analytics engineering capabilities with strong Python full stack development skills to deliver self-service tools that power operations, planning, and decision-making.
This role involves working closely with data analysts, data scientists, and other stakeholders to build tools that simplify access to insights, improve data quality, and accelerate automation.
Role & Responsibilities:
Essential Job Duties:
- Build and maintain high-performance data pipelines using Python, dbt, Airflow, or Spark to transform and model data for analytics consumption.
- Develop curated, version-controlled datasets with clearly defined metrics and business logic.
- Ensure data quality through testing, monitoring, and anomaly detection systems.
- Partner with analysts and stakeholders to translate reporting requirements into data solutions.
- Collaborate with Enterprise Analytics teams to ensure data security and compliance with organizational policies.
- Document data processes, systems, and methodologies for internal and external stakeholders.
- Design and develop user-centric web apps using Streamlit to support data exploration, reporting, and workflow automation.
- Build modular, scalable, and reusable components for interactive visualizations, data editing, approval flows, and predictive models.
- Integrate with Snowflake, Smartsheet, S3, and REST APIs to build end-to-end data apps.
- Working hours will overlap with morning PST time zone to allow for hand-off and review meetings with team and stakeholders.
Qualifications
Required Skills and Experience:
- 5+ years of experience in analytics engineering, data engineering, or full stack development
- Proficiency in SQL, data modeling, and working with cloud data warehouses (Snowflake preferred).
- Advanced Python skills and experience building data tools with Streamlit
- Experience with RESTful API development using Python frameworks (Flask, FastAPI)
- Experience with dbt, Airflow, or similar tools for data transformation and orchestration.
- Strong understanding of Git, CI/CD workflows, and containerization with Docker.
- Ability to work in a customer-facing role, with proven verbal and written communication skills to effectively communicate with technical and non-technical stakeholders.
Preferred Qualifications:
- Knowledge of data visualization tools like Tableau or Power BI.
- Familiarity with Streamlit cloud deployment, streamlit-authenticator, and multi-user session handling.
- Background in healthcare manufacturing, supply chain, or operations analytics.
- Understanding of data governance and best practices in data management.
Required Education and Training
- Minimum of a bachelor's or master's degree in computer science, information technology, or a related field.
Additional Information
Intuitive is an Equal Employment Opportunity Employer. We provide equal employment opportunities to all qualified applicants and employees, and prohibit discrimination and harassment of any type, without regard to race, sex, pregnancy, sexual orientation, gender identity, national origin, color, age, religion, protected veteran or disability status, genetic information or any other status protected under federal, state, or local applicable laws.
We will consider for employment qualified applicants with arrest and conviction records in accordance with fair chance laws.