Cloud Data Engineer - Healthcare

Remote

Arkatechture

Arkatechture is a data & analytics consulting firm in Portland, ME that provides the expertise, tools, and training needed for true business intelligence.



Department: Professional Services

Employment Type: Full Time

Location: Remote


Description

Why Work Here?

At Arkatechture, we have a simple shared mission: to build a sustainable organization upon three pillars: Do something meaningful, With a great team, Earning what you deserve.
We started in 2012 with a passion for data, business, and getting things done. We are a team of data lovers and technical experts who use our skills to help businesses big and small harness, utilize, and optimize their data. As New England’s Data Resource, we are a small company constantly evolving to keep up with changing landscapes in the data world. 
We are proud of the community and culture that we’ve created at Arkatechture, and we have no intention of slowing down. We offer a competitive benefits package that includes: 
  • A flexible work-from-home policy (work 100% remotely!)
  • Open-concept offices in Portland, ME with an easy-going dress code, and fresh pots and pops all day (that’s coffee and popcorn!)
  • Training & certificate reimbursement
  • Medical, disability, and life insurance, plus optional dental and vision coverage
  • 401K Retirement planning with company matching
  • Generous paid time off and eleven paid holidays
  • Employee recognition through milestone awards, including annual PTO increases and a 4-day work week at 3 years of service!
All employees share our core values: put the team first, practice humility, take pride in everything we do, stay curious, care for our community & environment, take work seriously; ourselves not so much.
The Position

As a Cloud Data Engineer specializing in Healthcare, you will be a key member of our engineering team, responsible for architecting, developing, and optimizing publish/subscribe messaging systems with a specific focus on Amazon Web Services (AWS). You will collaborate with cross-functional teams to design and implement robust publish/subscribe solutions built on a bi-directional data exchange architecture, using AWS messaging services, Apache Kafka, and other FedRAMP-authorized Cloud Service Offerings alongside industry-standard protocols, to enable efficient, real-time data exchange within a hub-and-spoke framework across our software ecosystem.
How to Apply
Please send a cover letter and resume with your application. You must have 3+ years of experience working for a Medicaid agency and you must submit all requested documents to be considered for the position.

Key Responsibilities

  • Experience designing and implementing highly secure solutions on FedRAMP-authorized Cloud Service Offerings (CSOs) such as AWS GovCloud, Amazon MSK, Okta, and Snowflake
  • Experience designing and building data pipelines using AWS services 
  • Experience with developing bidirectional data exchange systems
  • SQL/Python development experience, especially for serverless computing and event-based triggers (see the sketch after this list)
  • Developing and testing code
  • Working with senior data engineers on the team on full end-to-end delivery of projects and solutions
  • Communicating with both technical and non-technical collaborators
  • Following engineering best practices
  • Reporting status to the team lead on a regular cadence
  • Estimating work and collaborating with the project manager on task allocation
  • Additional responsibilities as assigned
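
As a sketch of the serverless, event-based trigger work mentioned above, assuming an AWS Lambda function wired to S3 object-created notifications; the bucket, object layout, and downstream load step are hypothetical placeholders.

    import json
    import urllib.parse

    import boto3

    s3 = boto3.client("s3")


    def lambda_handler(event, context):
        """Runs whenever a new file lands in the (hypothetical) raw-data bucket."""
        records = event.get("Records", [])
        for record in records:
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
            rows = body.splitlines()
            # In a real pipeline these rows would be validated and staged into a
            # warehouse such as Snowflake or Redshift rather than just counted.
            print(f"loaded {len(rows)} rows from s3://{bucket}/{key}")
        return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}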

Skills, Knowledge and Expertise

Minimum Qualifications
  • 3+ years of experience with Medicaid systems and data, as a Medicaid agency employee or as a consultant
  • 3+ years of experience in a similar individual contributor role
  • Bachelor's degree in a related field or comparable work experience
  • Excellent SQL skills and an understanding of how to implement conformed data models
  • Experience with Snowflake plus at least two additional databases such as SQL Server, Oracle, Aurora, PostgreSQL, Redshift, or MySQL
  • Experience developing in Python; JavaScript is a nice-to-have
  • Experience working on Data Management projects for Data Lakes/Data Warehousing
  • Experience working with APIs, specifically REST APIs, SDKs and CLI tools as part of ETL/ELT provisioning
  • Experience working with multiple file formats such as JSON, XML, CSV, and flat files
  • Experience extracting data from databases using ODBC/JDBC
  • Strong understanding of Microservices architecture
  • A strong understanding of the Agile software development life cycle and methodology

Preferred Experience
  • One or more of the following certifications:
    • AWS Solutions Architect Associate/Professional
    • AWS Developer Associate/Professional
    • Snowflake SnowPro Core/Advanced
  • Domain expertise in Healthcare and one or more of the following verticals: Financial Services, Retail, Telco, Digital Marketing, Supply Chain, or Transportation

