Data Engineer II

Baltimore, Maryland, United States; Lehi, Utah, United States

Medifast

Medifast, Inc. is your go-to source for quality weight loss products, programs, and support. Start your journey to a healthier you today.


About the Opportunity

At Medifast, our team members are relentless in our mission of driving Lifelong Transformation, Making a Healthy Lifestyle Second Nature™. When you join Medifast, you become part of a dynamic, fast-growing community of highly motivated, like-hearted people who share a passion for promoting health and wellness. Just as OPTAVIA Coaches inspire Clients to reach their personal wellness goals, at Medifast, we inspire each other to bring our best to work each day to further our shared mission. If you want to build a rewarding career that makes lives better on a daily basis, Medifast may be the perfect place for you.

Overview of Position

The Data Integration Engineer will work closely with a cross-functional team of product managers, solution architects, engineers, and integrators to enable the data organization. Key responsibilities include directing code design and delivery tasks associated with the integration, cleaning, transformation, and control of data in operational and analytical data systems, and working with the Project Management team to define outcomes and inform work structures. The Data Engineer will be responsible for developing and implementing a range of analytics and automation solutions, including data extraction, analysis, reporting, and dashboard design, and may also take on more advanced analytics work, including statistical analysis, text mining and NLP, and modeling/machine learning/AI, as required.

II. Job Responsibilities

Analysis & Planning:

  • Analyzes various data sources, defining and validating data objects and identifying the relationships among them

  • Assembles large, complex data sets that meet functional and non-functional requirements, ensuring that the design and engineering approach is consistent across multiple systems

  • Leads the identification of gaps in adherence to data management standards and works with appropriate partners to develop plans to close them, leading concept testing and conducting research to prototype toolsets and improve existing processes

Design & Documentation

  • Utilizes appropriate architectural components in the design and development of client requirements, and collaborates with development teams to understand data requirements and ensure the data architecture is feasible to implement
  • Proficient with design and ticketing tools such as Lucidchart, Visio, PowerPoint, Jira, Confluence, etc.

Tools & Platforms

  • Defines and builds data pipelines to enable data-informed decision making, ensuring adherence to release processes and risk management routines.

  • Maintains, improves, cleans, and manipulates large data sets for operational and analytics data systems; builds complex processes supporting data transformation, data structures, metadata, data quality controls, dependencies, and workload management; and communicates required information for the deployment, maintenance, and support of business functionality

  • Experience with data governance, including data catalogs, data quality, and data accessibility, is highly desirable

  • Responsible for multiple projects simultaneously, ensuring each one is completed on time, efficiently, and to a high standard of quality. Collaborates with the agile team in the planning, design, and development of new integrations to support the data analytics and data science teams.

  • Self-driven and responsible for the successful delivery of projects from initiation to execution

Leadership

  • Mentors an offshore team of data engineers in the delivery and release of continuous integration and continuous delivery events, and defines key performance indicators and internal controls

  • Coordinates, schedules, and participates in cross-functional analytics and automation activities

III. Scope

Data engineering needs across business functions. Managing geographically dispersed internal and external team members across remote locations, Baltimore (HQ), and Utah.

IV. Knowledge, Education, Skills & Abilities

  • Bachelor’s degree in Computer Science, Information Systems, or Engineering.
  • Proficient using AWS services for data integration and governance, such as building data lakes with Amazon S3 and AWS Glue; integrating data using AWS Lambda, EventBridge, Kinesis streams, etc.; and using data analytics services like Athena, Redshift, and EMR, or equivalent services on other cloud platforms
  • 4+ years in a data engineering role, with demonstrable experience in data integration and data warehouse projects using leading platforms such as Databricks, Snowflake, etc.
  • Experienced in understanding and transforming different data formats: CSV, XML, JSON, SOAP, JMS
  • Excellent analytical, problem solving and conceptual thinking skills
  • Solid understanding of performance tuning concepts for relational and distributed database systems
  • Strong understanding of the full lifecycle of data engineering, through maintenance and monitoring
  • 3+ years of experience programming in Python
  • 3+ years of experience programming in SQL
  • 5+ years in data engineering focused on data enrichment, data integration and data warehouse projects 
  • 3+ years building data analytic pipelines to enable artificial intelligence or machine learning efforts
  • 3+ years writing artificial intelligence software
  • 3+ years writing software to build data pipelines between relational databases, NoSQL databases, APIs, flat files, and external sources
  • 3+ years of AWS experience with services such as Redshift, Amazon Redshift Spectrum, AWS Glue, AWS DMS, Kafka, etc.
  • 1-3 years of Linux experience
  • 3+ years of relational and NoSQL database experience (PostgreSQL and Amazon Redshift or similar), with specific implementation on the AWS cloud
  • 3+ years of experience with different DBMSs such as Oracle, SQL Server, MySQL, etc.
  • Exposure to open-source and proprietary cloud data pipeline tools such as Airflow, Glue, and Dataflow
  • Experience with data serialization languages such as JSON, XML, YAML
  • Hands-on experience with transactional and dimensional data modeling methodologies (e.g., Normalization, Star Schema, Snowflake Schema)
  • 3+ years of experience with data mapping, data transformation, and handling different data formats: CSV, XML, JSON, JMS

At Medifast, Relationships Are At The Center Of What We Do! 

We thrive by elevating our connections with one another as well as with our Coaches & Clients. We believe that everyone has the potential to be OUTSTANDING. The Medifast culture is built on seven core values: integrity, courage, teaming, accountability, empowerment, partnership and diversity. These values aren’t just words on a page – they are celebrated as a core part of the company’s philosophy.

We Lead By…

Mastering Relationships: We build trust, promote collaboration and we are reliable.

Being Innovative: We strive to improve things in our areas of influence; test, refine and expand within the business strategy; and reach beyond real and perceived boundaries.

Simplifying: We are committed to making things measurable, repeatable and scalable; focusing on outcomes not activities; and eliminating complexity to increase focus.

Anticipating: We predict long-term business and organizational needs; challenge assumptions; and expect and prepare for the unexpected.

More About Medifast

Medifast (NYSE: MED) is the health and wellness company known for its habit-based and coach-guided lifestyle solution OPTAVIA®, which provides people with a simple yet comprehensive approach to address obesity and support a healthy lifestyle. OPTAVIA's holistic solution includes lifestyle plans with clinically proven health benefits, scientifically developed products, and a framework for habit creation – all reinforced by independent coach support for customers on their weight loss journeys. Through its collaboration with national virtual primary care provider LifeMD® (Nasdaq: LFMD) and its affiliated medical group, the holistic solution now includes access to GLP-1 medications where clinically appropriate. Visit the OPTAVIA and Medifast websites for more information and follow Medifast on X and LinkedIn.

Thank you for taking the time to learn more about Medifast.

