Data Engineer

Solon, OH, United States

āš ļø We'll shut down after Aug 1st - try foošŸ¦ for all jobs in tech āš ļø


West Star is the fastest-growing maintenance repair organization in the industry, and we recognize this is the result of our talented team of trusted employees working together to deliver customer service excellence. We are committed to providing our employees with personal and professional growth opportunities while fostering a culture of respect and well-being with a small-company feel.

When you join our team, we don’t think you should have to wait for your benefits to kick in. That’s why when you start, they start with you! This includes medical, dental, 401(k) match, time-off accruals, weekly paydays and much more. We don’t want you to live to work; we want you to work and live.

What you can expect as a Data Engineer at West Star:

We are seeking a highly skilled and experienced Microsoft Fabric Data Engineer to lead the design and development of our next-generation data platform built on Microsoft Fabric. You’ll work closely with data architects, analysts, and business stakeholders to modernize our data pipelines, implement scalable lakehouse patterns, and support AI/analytics initiatives across the organization.

This is a hands-on engineering role for someone deeply familiar with Microsoft’s modern data stack — including OneLake, Lakehouses, Data Pipelines, Notebooks, and integration with Power BI. This role is open to candidates in Cleveland, OH, or St. Louis, MO.

You will be essential to many functions, including:

  • Design and implement data architectures using Microsoft Fabric components (Lakehouse, Warehouse, Pipelines, Notebooks).
  • Build and manage scalable data pipelines for ingesting and transforming structured and unstructured data from internal and external sources.
  • Optimize storage and performance of datasets in OneLake and Delta Lake.
  • Work closely with the Data Analyst team to build and refine data models in Fabric Warehouse and support semantic modeling for reporting.
  • Assist in implementing governance, monitoring, and security best practices across Fabric environments.
  • Partner with AI/ML teams to deliver clean, structured, and high-performance datasets.
  • Contribute to the ongoing evolution of the Fabric platform, including architecture reviews and strategic planning.
  • Work closely with IT to optimize data architecture for application reporting.
  • Effectively and clearly communicate (i.e., speak, write, and read) in English.

Any other job-related duties as assigned by supervisor or management.

Qualifications

What you’ll need to bring with you:

Your Education:

A high school diploma or equivalent.

Microsoft Certified: Fabric Analytics Engineer Associate or equivalent certification.

A valid driver’s license approved for airline travel and/or a valid passport is ideal but not mandatory.

Your Experience:

5+ years of experience in data engineering or analytics engineering roles.

Deep hands-on experience with Microsoft Fabric components, including Lakehouse, Data Factory (Pipelines), OneLake, Notebooks, Databricks, Azure Synapse, and Warehouse.

Proficiency in T-SQL and working knowledge of PySpark or KQL.

Solid understanding of data lakehouse architecture, including Delta Lake, Parquet, and structured streaming concepts.

Experience integrating with Power BI, including support for semantic models, Direct Lake, and dataset optimization.

Working knowledge of Azure Data Services such as Azure Data Lake Storage Gen2, Azure SQL, and Azure Synapse.

Familiarity with CI/CD practices and DevOps for data pipelines.

Experience building semantic models in Power BI, including DAX and tabular model optimization.

Experience with data governance tools such as Microsoft Purview or Unity Catalog.

Experience with legacy-to-Fabric modernization, including migration from dataflows Gen1.

Proficiency working with Delta Lake, Parquet, or similar modern analytics-optimized formats.

Your Initiative:

We’re looking for team players who are self-motivated and able to perform in a fast-paced environment where working under specific deadlines and time constraints will be common. This person will need strong analytical and communication skills, with the ability to translate business requirements into scalable data solutions.

Your Sense of Responsibility:

Attend work every day as scheduled.

Notify your supervisor in advance of the shift starting if unable to work.

Must have reliable transportation to get to work every day.

Follow all company and safety rules during the performance of duties.

Maintain customer-oriented work habits.

Other particulars:

Physical Requirements

Lift and carry up to 10 lbs.

Routine walking, bending, stooping and sitting.

Sit at a desk and/or computer for extended intervals.

Routine or repetitive physical motion with arms and hands.

Mental Requirements

Work with others in a professional manner.

Understanding and implementation of regulations and guidelines.

Prioritize workload and work under pressure.

Coordinate multiple projects and duties.

Supervision

Work under minimal supervision.

Work with other managers, coordinating projects in a cooperative manner.
