Data Engineer

Remote

nimble solutions

nimble solutions is a leading provider of billing services and revenue cycle management solutions for ASCs, surgical centers, and anesthesia groups.


Job Type: Full-time

Description

Join a leading Revenue Cycle Management (RCM) company dedicated to transforming healthcare data into actionable insights. We leverage cutting-edge technology to streamline financial and operational processes, improving efficiency and patient outcomes. We are looking for a Data Engineer to help optimize data pipelines and build a next-generation data infrastructure incorporating technologies such as Microsoft Fabric, Azure Synapse, Databricks, and Snowflake.

  

Job Summary:


As a Data Engineer, you will play a crucial role in designing, building, and maintaining robust data pipelines and architectures. You will optimize data workflows, ensure scalability, and contribute to the development of a new data infrastructure that integrates with Microsoft Fabric, Azure Synapse, Databricks, Snowflake, and other cloud-based technologies. This role requires expertise in cloud-based data solutions and big data processing, along with the ability to collaborate with cross-functional teams to enhance healthcare data analytics and operational efficiency.


Key Responsibilities: 

  • Design, develop, and optimize scalable ETL/ELT data pipelines for healthcare RCM processes
  • Build and maintain a modern data infrastructure incorporating Microsoft Fabric, Azure Synapse, Databricks, Snowflake, and other cloud technologies
  • Collaborate with data architects, analysts, and engineering teams to improve data accessibility and performance
  • Ensure data quality, security, and compliance with healthcare regulations (HIPAA, HITRUST)
  • Optimize database performance and implement best practices for data governance and metadata management
  • Work with structured and unstructured data, integrating diverse data sources such as EHR/EMR systems, claims data, and financial records
  • Implement real-time and batch data processing solutions using various cloud data platforms and tools
  • Support data integration with BI and analytics tools such as Power BI
  • Write and optimize complex SQL queries to transform and analyze large healthcare datasets
  • Mentor junior engineers and contribute to technical best practices

Requirements

  

Required Skills & Experience:

  • 3+ years of experience in data engineering or a related field
  • Expertise in SQL for data processing, transformation, and performance optimization
  • Proficiency in Python or Scala for data engineering workflows
  • Strong knowledge of Azure data services, including Microsoft Fabric, Azure Synapse, and Azure Data Factory, as well as Databricks and Snowflake
  • Experience working with large-scale data architectures in cloud environments
  • Proficiency in ETL/ELT workflows and data pipeline optimization
  • Hands-on experience with healthcare data (e.g., claims, EMR/EHR, HL7, FHIR)
  • Familiarity with data security, compliance, and governance best practices in healthcare
  • Ability to work in an agile, collaborative, and remote environment

Preferred Skills:

  • Experience with Microsoft Fabric, Azure Synapse, Databricks, and Snowflake in a production environment
  • Knowledge of other cloud-based data platforms and integration tools
  • Hands-on experience with Power BI, DAX, and data modeling
  • Experience with machine learning pipelines or predictive analytics in healthcare
  • Previous experience in RCM, insurance, or healthcare analytics