Lead Developer - ETL - NJ

United States

Photon

Photon, a global leader in digital transformation services and IT consulting, works with 40% of the Fortune 100 companies as their digital agency of choice.



Job Summary:

We are looking for skilled ETL Developers to support a HEDIS refactoring project. This role requires expertise in ETL, big data processing, and cloud-based data platforms to support healthcare analytics. These developers will work closely with Data Analysts to ensure high-quality, efficient, and scalable data pipelines.

 

Key Responsibilities:

1. Data Pipeline Development & Automation:

  • Develop and maintain ETL pipelines to load healthcare data into the HEDIS data marts.
  • Automate data ingestion and transformation workflows using Azure Data Factory (ADF), PySpark, and Databricks.
  • Optimize SQL queries and data processing logic for performance and scalability.

2. Data Modeling & Storage:

  • Design and implement data models to support HEDIS analytics and reporting.
  • Ensure efficient storage and retrieval of Claims, Clinical, Labs, Immunization, and Encounter data.
  • Implement data partitioning, indexing, and caching strategies for performance tuning.

3. Data Quality & Governance:

  • Implement data validation, anomaly detection, and reconciliation processes.
  • Work with the Lead Data Analyst to ensure compliance with HEDIS data quality standards.
  • Develop and maintain data lineage and metadata management frameworks.

4. Collaboration & Support:

  • Work closely with Data Analysts to understand reporting and analytics needs.
  • Provide support for troubleshooting data issues and optimizing query performance.
  • Document ETL workflows, data models, and system configurations.

 

Required Qualifications & Skills:

Technical Expertise:

  • Databases: SQL Server, Databricks
  • ETL & Workflow Automation: SSIS, ADF (Azure Data Factory)
  • Programming: Python, SQL, PySpark
  • Cloud & Storage: Azure, AWS (Azure preferred)

Healthcare Data Knowledge:

  • Exposure to Claims, Clinical, Lab, Immunization, and Encounter data.
  • Understanding of HEDIS quality measures and data submission processes.

Soft Skills:

  • Strong problem-solving and troubleshooting skills.
  • Ability to collaborate with cross-functional teams and work independently.
  • Excellent written and verbal communication skills.

 

Preferred Qualifications:

  • Experience with data warehousing, ETL/ELT, and data modeling.
  • Exposure to FHIR, HL7, and healthcare interoperability standards.

Compensation, Benefits and Duration

Minimum Compensation: USD 46,000
Maximum Compensation: USD 162,000
Compensation is based on the candidate's actual experience and qualifications. The above is a reasonable, good-faith estimate for the role.
Medical, vision, and dental benefits, a 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available to full-time employees.
This position is also open to independent contractors.
No applications will be considered if received more than 120 days after the date of this post.

 

