Data Engineer

Costa Rica

Komodo Health

Get the most accurate, patient-centric view of the U.S. healthcare system. Scale insight generation with NLP-based AI and an enterprise platform.



We Breathe Life Into Data

At Komodo Health, our mission is to reduce the global burden of disease. And we believe that smarter use of data is essential to this mission. That’s why we built the Healthcare Map, the industry’s largest, most complete, and most precise view of the U.S. healthcare system, by combining de-identified, real-world patient data with innovative algorithms and decades of clinical experience. The Healthcare Map serves as our foundation for a powerful suite of software applications, helping us answer healthcare’s most complex questions for our partners. Across the healthcare ecosystem, we’re helping our clients unlock critical insights to track detailed patient behaviors and treatment patterns, identify gaps in care, address unmet patient needs, and reduce the global burden of disease.

As we pursue these goals, it remains essential to us that we stay grounded in our values: be awesome, seek growth, deliver “wow,” and enjoy the ride. At Komodo, you will be joining a team of ambitious, supportive Dragons with diverse backgrounds but a shared passion to deliver on our mission to reduce the burden of disease — and enjoy the journey along the way.

The Opportunity at Komodo Health

Komodo Health leverages the latest data engineering technologies, such as Spark, Airflow, and Snowflake, to tackle some of healthcare’s biggest challenges by transforming extraordinary amounts of data into rich and meaningful insights.

As a Data Engineer on Komodo Health’s team, you will be solving complex data challenges while helping build and scale our data platform that powers state-of-the-art interactive product experiences. You will enable smarter, more innovative uses of healthcare datasets by designing robust data pipelines and implementing data best practices.

This role will collaborate with Product Managers, Customer Success, and Sales to understand data requirements and develop the data processing steps needed to create data products and insights.

This role will be a key contributor to the scalability of the data systems across our suite of product offerings, including the design and implementation of data pipelines. This person will be part of a team delivering data products and insights to our customers via our suite of product applications or other delivery mechanisms.

Looking back on your first 12 months at Komodo Health, you will have…

  • Gained an understanding of the broader Komodo Health data landscape, including the Sentinel and MapLab platforms.
  • Built and improved the foundational pieces of Sentinel data infrastructure, pipelines, and data services.
  • Led the implementation of new data product offerings on the Sentinel platform, working through complex business requirements across stakeholders including engineering, product, and customers.
  • Designed and implemented data and system migrations to move existing customers to the new platform.
  • Ensured non-functional requirements are met, such as cost, developer experience, reliability, maintainability, and operations/support.

You will accomplish these outcomes through the following responsibilities…

  • Partnering with Engineering team members, Product Managers, and customer-facing teams to understand complex health data use cases and business logic
  • Being curious about our data
  • Building foundational pieces of our data platform architecture, pipelines, analytics, and underlying services
  • Designing and developing reliable data pipelines that transform data at scale, orchestrating jobs via Airflow/Temporal and using SQL and Python in Snowflake (illustrated in the sketch after this list)
  • Contributing to Python packages and APIs in GitHub, using current best practices
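
As a hedged illustration of the pipeline work described above, the sketch below shows a minimal Airflow DAG that runs one SQL transformation in Snowflake from a Python task (assuming Airflow 2.x's TaskFlow API and the snowflake-connector-python package). The DAG name, table names, and connection settings are hypothetical and are not drawn from Komodo Health's actual systems.

    # Minimal sketch: an Airflow DAG that runs one Snowflake SQL transformation.
    # All identifiers (DAG id, tables, warehouse/database names) are illustrative.
    import os
    from datetime import datetime

    import snowflake.connector          # Snowflake's Python connector
    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def example_claims_rollup():
        @task
        def transform_claims():
            # Credentials are read from the environment for brevity; a secrets
            # backend or a managed Airflow connection is more typical in practice.
            conn = snowflake.connector.connect(
                account=os.environ["SNOWFLAKE_ACCOUNT"],
                user=os.environ["SNOWFLAKE_USER"],
                password=os.environ["SNOWFLAKE_PASSWORD"],
                warehouse="TRANSFORM_WH",
                database="ANALYTICS",
                schema="PUBLIC",
            )
            try:
                # Rebuild a small daily rollup table from a raw claims table.
                conn.cursor().execute(
                    """
                    CREATE OR REPLACE TABLE daily_claim_counts AS
                    SELECT service_date, COUNT(*) AS claim_count
                    FROM raw_claims
                    GROUP BY service_date
                    """
                )
            finally:
                conn.close()

        transform_claims()


    example_claims_rollup()

In practice, connection handling would usually go through a managed connection or secrets manager rather than raw environment variables; the sketch keeps that detail minimal to focus on the orchestration pattern.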

What you bring to Komodo Health:

  • Demonstrated proficiency in designing and developing with distributed data processing platforms like Spark and pipeline orchestration tools like Airflow and/or Temporal
  • Experience with modern data warehouses such as Snowflake, and with SQL and query design on large, complex datasets
  • Solid computer science skills and proficiency in programming languages like Python, with the ability to apply industry-standard engineering best practices such as design patterns and testing
  • The ability to quickly build expertise in a new tech stack on an as-needed basis
  • Experience with product-focused software development in an agile environment
  • A demonstrated track record of delivering products and features of varying complexity through several iterations of product development
  • The ability to understand and design for non-functional concerns such as performance, cost optimization, maintainability, and developer experience
  • A thirst for knowledge, willingness to learn, and a growth-oriented mindset
  • Excellent cross-team communication and collaboration skills

Additional skills and experience we’d prioritize (nice to have)…

  • Experience building containerized API services to serve both internal and external clients (a brief illustrative sketch follows this list)
  • Experience enhancing CI/CD build tooling in a containerized environment, including deployment pipelines (Jenkins, etc.), infrastructure as code (Terraform, CloudFormation), and configuration management via Docker and Kubernetes
  • U.S. healthcare data experience is not required, but it is a strong plus
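
Purely as a hedged illustration of the containerized API services mentioned above, the sketch below uses FastAPI, which is an assumed framework choice rather than one named in the posting; the service name and endpoints are hypothetical.

    # Hypothetical sketch of a small internal API service; FastAPI is an assumed
    # framework choice, and the endpoints/names are illustrative only.
    from fastapi import FastAPI

    app = FastAPI(title="example-data-service")


    @app.get("/health")
    def health() -> dict:
        # Liveness endpoint of the kind typically wired to Kubernetes probes.
        return {"status": "ok"}


    @app.get("/cohorts/{cohort_id}/size")
    def cohort_size(cohort_id: str) -> dict:
        # Placeholder response; a real service would query the data platform.
        return {"cohort_id": cohort_id, "size": 0}

In a containerized setup, a module like this would typically be served by an ASGI server such as uvicorn inside a Docker image and deployed through the kind of CI/CD tooling listed above.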

#LI-Remote

Compensation at Komodo Health

The pay range for each job posting reflects a minimum and maximum range of pay that we reasonably expect to pay across all U.S. locations and may span more than one career level. We carefully consider multiple business-related factors when determining compensation, including job-related skills, work experience, geographic work location, relevant training and certifications, business needs and market demands.

The starting annual base pay range for this role is listed below. This position may be eligible for performance-based bonuses as determined in the Company’s sole discretion and in accordance with a written agreement or plan.

₡25.600.000 – ₡38.400.000 CRC

Where You’ll Work

Komodo Health has a hybrid work model; we recognize the power of choice and the importance of flexibility for the well-being of both our company and our individual Dragons. Roles may be completely remote (based anywhere in the country listed), remote but based in a specific region, or local (commuting distance) to one of our hubs in San Francisco, New York City, or Chicago with remote work options.

What We Offer

This position will be eligible for company benefits in accordance with Company policy. We offer a competitive total rewards package including medical, dental and vision coverage along with a broad range of supplemental benefits including 401k Retirement Plan, prepaid legal assistance, and more. We also offer paid time off for vacation, sickness, holiday, and bereavement. We are pleased to be able to provide 100% company-paid life insurance and long-term disability insurance. This information is intended to be a general overview and may be modified by the Company due to business-related factors.

Equal Opportunity Statement

Komodo Health provides equal employment opportunities to all applicants and employees. We prohibit discrimination and harassment of any type with regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. 

