Senior Data Engineer

Chicago, Illinois, United States

PatientPoint

PatientPoint is a leading digital health company that connects patients, healthcare providers and life sciences with health information at key moments of care.


It is an exciting time to be part of the PatientPoint team! As the clear leader in the point-of-care industry, we offer an ideal, people-focused place to innovate, positively impact patient education and doctor-patient connections, and be inspired to build a great career. 

Location: Chicago

Hybrid Schedule: 1-3 days in Office Weekly

Travel Requirements: 4-5 weeks per year 

Job Summary  
We are a small team in a rapidly developing space, so our culture is hugely important to us and is an area where you can make a great impact. Our success is driven by collaboration: we solve problems together, and our openness helps each individual grow, too. We're looking for people who bring innovative approaches to their work, excitement for looking at things from new angles, a spirit of continuous improvement, fresh insights, the tenacity required to deliver value, and, of course, a love of fun!

You will report to the Director of Data and Analytics Engineering. As a Senior Data Engineer, you will be hands-on, using your in-depth knowledge of data pipelines, DAGs, monitoring, and testing to design, develop, and support our data pipelines. You will be responsible for ensuring that designs and implementations adhere to DnA engineering standards and are secure, resilient, and reliable, enabling our DnA team to deliver rapid, impactful benefits to our business partners through our data products.

What You’ll Do  

  • As a Senior Data Engineer, you will be responsible for the design, orchestration, monitoring, quality, accuracy, and security of data throughout its lifecycle, including data ingestion, collection, storage, processing, and analysis, primarily using Snowflake, Fivetran, and Astronomer (a minimal DAG sketch follows this list). 
  • Work closely with senior management, product owners, DnA Architects and Engineers, experienced data scientists, software developers and analysts to understand their data requirements and help build robust and efficient data pipelines to support their work. 
  • Lead the design, development, prototyping, operation, and implementation of data pipelines. 
  • Educate, mentor and support DnA team members new to integrating the modern stack. 
  • Build PatientPoint data expertise and own data quality for the pipelines you create. 
  • Own the testing and release process for data pipelines, using best practices to support frequent releases. 
  • Participate in code reviews. 
  • Partner to deliver a modern data engineering model that follows Dev/Ops principles and standards for continuous integration/ continuous delivery (CI/CD) processes. 
  • Develop and maintain data documentation, including data dictionaries, data lineage, and data flow diagrams, to provide clear visibility into the data ecosystem. 
  • Analyze the impact of changes to downstream systems/products and recommend alternatives to minimize the impact. 
  • Drive the migration to a modern data stack in which Analysts and Engineers can self-service changes in an automated, tested, and high-quality manner. 
  • Stay up-to-date with emerging trends and technologies in data engineering and recommend improvements to existing systems. 
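
To make the day-to-day concrete, here is a minimal, illustrative sketch of the kind of DAG construction this role involves, assuming Airflow 2.x (as packaged by Astronomer) and its TaskFlow API; the DAG name, tasks, and data are hypothetical and are not an actual PatientPoint pipeline.

```python
# Minimal illustrative DAG, assuming Airflow 2.x with the TaskFlow API.
# All names and data here are hypothetical.
import pendulum
from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    tags=["illustrative"],
)
def example_ingest_pipeline():
    @task
    def extract() -> list[dict]:
        # Pull raw records from a source system (stubbed here).
        return [{"id": 1, "value": "42"}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Apply light cleaning and typing before loading.
        return [{**r, "value": float(r["value"])} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # A real pipeline might write to Snowflake via a provider hook;
        # here we only log the row count.
        print(f"Loading {len(records)} records")

    load(transform(extract()))


example_ingest_pipeline()
```

In a stack like the one described above, extraction and loading would often be handled by Fivetran and Snowflake, with Airflow on Astronomer orchestrating, monitoring, and testing the runs.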

What We Need  

  • 5+ years of experience working on cloud data warehouses and data pipelines with a focus on data engineering, building scalable and secure data platforms and systems powering intelligent applications. 
  • Bachelor's Degree in Computer Science or a related field  
  • Competency with Python, Airflow, GitHub and DAG construction 
  • Advanced SQL query and procedure writing experience  
  • Experience with unstructured and semi-structured datasets and the ability to handle Avro, Parquet, JSON, and XML file formats (see the brief sketch after this list) 
  • Strong understanding of CI/CD principles, DevOps practices, software testing and quality 
  • Experience working with cloud-based data engineering and storage technologies such as AWS, and with orchestration tools such as Apache Airflow (Astronomer). 
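
As a rough illustration of the file-format requirement above, the sketch below reads Parquet and newline-delimited JSON; the file names are placeholders, and the use of pandas and pyarrow is an assumption rather than a stated part of the stack.

```python
# Illustrative only: reading two of the formats named above.
# File paths are placeholders; pandas and pyarrow are assumed to be installed.
import json

import pandas as pd
import pyarrow.parquet as pq

# Columnar data: read a Parquet file into a pandas DataFrame.
events = pq.read_table("events.parquet").to_pandas()

# Nested data: load newline-delimited JSON and flatten one level of nesting.
with open("payloads.jsonl") as fh:
    payloads = pd.json_normalize([json.loads(line) for line in fh])

print(events.dtypes)
print(payloads.head())
```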

Desired Qualifications  

  • Experience working with large data sets and streaming data 
  • Experience with Snowflake 
  • Excellent problem-solving skills and attention to detail
  • Experience protecting PII or PHI data during the ELT process, including data security, data access controls, and design 
  • Experience with streaming data and high-volume IoT data ingestion 
  • Experience with a variety of data ingestion, processing, and curation patterns, along with streaming data concepts such as Kafka.
  • Exposure to industry standard BI tools like Power BI and Tableau  
  • Experience implementing data quality initiatives such as test harnesses, monitoring, and auditing features on data (a small illustrative check follows this list) 
  • Healthcare / Medical Devices domain experience 
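
For the data quality bullet above, a lightweight check might look like the sketch below; the table shape, column names, and thresholds are hypothetical, and production checks would typically run inside the pipeline and feed monitoring and alerting.

```python
# Hypothetical, lightweight data quality checks on a batch of rows.
# Column names and thresholds are illustrative only.
import pandas as pd


def check_orders(df: pd.DataFrame) -> list[str]:
    """Return human-readable failures for a batch of order rows."""
    failures = []
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values found")
    if df["amount"].lt(0).any():
        failures.append("negative amounts found")
    if df["created_at"].isna().mean() > 0.01:
        failures.append("more than 1% of rows are missing created_at")
    return failures


sample = pd.DataFrame(
    {
        "order_id": [1, 2, 2],
        "amount": [10.0, -5.0, 7.5],
        "created_at": pd.to_datetime(["2024-01-01", None, "2024-01-02"]),
    }
)
print(check_orders(sample))  # this sample trips all three checks
```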

What You'll Need to Succeed  

  • Effective communication skills, both written and verbal, across all levels of the organization
  • Solid work ethic with the ability to work productively with a distributed team.  
  • Self-driven execution: self-motivated to independently research the latest technologies, needing minimal guidance to deliver on commitments. 
  • Creative, innovative, and solution-oriented mindset, along with a positive attitude 
  • Comfort presenting complex, technical topics in an actionable manner to technical and non-technical team members within our organization 
  • Ability to learn quickly and pick up new skills/concepts. 
  • Comfortable with delivering in an ever-changing environment. 
  • Strategic thinking and passion for business strategy and business processes 
  • Embraces our core values of Integrity, Customer Focus, Innovation, Teamwork, and Coachability. 
  • Collaborative, communicative, and respectful: you help others meet their goals, cultivate trust, lead with respect, and practice self-awareness.
  • Proven experience delivering within a distributed team.
  • Willingness to participate in daily scrum meetings; collaborate with other agile squads and work in a matrix environment.
  • Drive Results: pushes self and others to exceed goals and achieve breakthrough results. Demonstrates persistence in removing barriers to achieving results and encourages others to do the same.

#LI-Hybrid

What We Offer
We know you bring your whole self to work every day. That is why we are committed to providing modernized benefits and cultural perks to our teammates. We offer competitive compensation, comprehensive and affordable benefits, flex time off to rest and recharge, a hybrid work model where applicable, mental & emotional wellness resources and coaching, a 401K, and more.

About PatientPoint
PatientPoint® is the patient engagement platform for every point of care. Our innovative, tech-enabled solutions create more effective doctor-patient interactions and deliver high value for patients, providers and healthcare sponsors. Through our nearly 140k unique healthcare provider relationships, PatientPoint’s solutions impact roughly 750 million patient visits each year, further advancing our mission of making every doctor-patient engagement better®. Learn more at patientpoint.com

PatientPoint recognizes that privacy is important to you. Please read the PatientPoint privacy policy; we want you to be familiar with how we may collect, use, and disclose your information. Employer is EOE/M/F/D/V.
