Data Engineer

Chicago, IL

Crafty

Centralize your corporate food delivery services for office snacks, coffee, drinks, and equipment while maximizing every dollar inside the Crafty Platform.



Note to Candidates:

We meticulously craft our job descriptions to clearly define the skills necessary for success and the expected performance level of the role, resulting in detailed content. Our goal is to ensure the description accurately reflects the job you will perform. We value your input and look forward to hearing what you think about the format in the interview process.

This is a hybrid role (three days per week at our HQ in Chicago, IL) reporting directly to the Principal Data Analyst.

Who We Are

Crafty elevates workplace food and beverage programs with enhanced services managed in one innovative, centralized platform. Founded in 2015, we are on a mission to help companies craft better workplaces. From DraftKings to Robinhood to Zillow, we work with the world’s biggest brands to foster a culture of employee connectivity and productivity. Headquartered in Chicago, with offices in New York and the Bay Area, Crafty manages food and beverage programs across 300+ offices, serving more than 300,000 employees per month.

Our commitment to crafting better workplaces starts from within. We are a team of passionate, resourceful, and hard-working trailblazers who love what we do. Our expertise spans technology, food and beverage operations, client success, fulfillment and more. At Crafty, our people are our greatest asset, because it's our people who foster a culture that makes our company a place worth being part of. And of course, the snacks are the cherry on top!

The Role

As Crafty’s first Data Engineer, you will play a key role in building and maintaining reliable, efficient, and secure data pipelines that power the organization’s analytical and operational needs. This role is ideal for someone who thrives at the intersection of engineering excellence, problem-solving, and strategic thinking. You’ll work to deliver scalable data infrastructure that supports both day-to-day decision-making and long-term business goals.

You’ll be proactive in identifying gaps in tooling, data sources, documentation, and testing, as well as opportunities to improve existing processes. A strong understanding of the business logic behind data transformations will be essential.

While the primary focus will be infrastructure development, you’ll also step in occasionally to support data requests or report development, helping ensure the business has what it needs to make informed decisions.

Required Attributes

  • Strong Data Engineering Fundamentals – You have hands-on experience designing and maintaining data pipelines, with the ability to troubleshoot data issues and discrepancies. You’re comfortable working with cloud-based data tools (e.g., Snowflake, dbt, Stitch) and understand best practices around schema design, pipeline structure, testing, and data flow monitoring. You write clean, maintainable code and think about the long-term sustainability and security of the data infrastructure.
  • SQL Expertise & Performance Optimization – You are highly proficient in SQL and understand how to write performant queries that scale across large datasets. You can analyze query plans, reduce compute costs, and structure transformations to minimize unnecessary complexity. You enjoy working with analysts and stakeholders to refine or optimize queries, and have a strong intuition for trade-offs between speed, cost, and clarity. 
  • Business-Oriented Mindset – You are not just technically skilled - you care about what the data means. You make an effort to understand the business logic behind the raw data and transformations, ensuring that data models reflect the business accurately. 
  • Initiative & Cross-Functional Collaboration – You take ownership of problems without being asked. You can identify gaps in tooling, missing data sources, or brittle processes and work towards fixing them. You’re comfortable jumping into collaborative conversations and can work together on solutions. 
  • Curious & Impact-Driven – You approach problems with curiosity and a mindset of continuous improvement. You’re energized by solving real business problems and enjoy finding opportunities where data can drive value. You're open to exploring new tools or methodologies that can improve impact or efficiency.
  • Adaptable & Solution-Oriented – Comfortable working in a fast-paced, evolving environment. When faced with ambiguity or change, you focus on what you can control and proactively seek out solutions. You stay focused on outcomes and collaborate well across teams to help keep work moving forward.

Ideal Experience

  • 2-3 years of experience in data engineering or related roles
  • Strong technical background with PostgreSQL, Snowflake, ETL/ELT pipelines, dbt, SQL, and Python
  • Understanding of system infrastructure and design

Don’t meet all of the qualifications? We want you to consider all of your skills and experiences, both professional and personal, that would make you successful in this role. Although some qualifications are essential, others can be attained with time. We believe diverse perspectives, upbringings, and knowledge contribute to our strong company culture, and we encourage you to apply.

Role Goals

  • Goal #1: By the end of Q3, automate 50% of our manual data pull processes by leveraging current tools or implementing new ones.

What we offer:

Our people mean everything to us. When you join Crafty, you’re joining a team of passionate, smart, and supportive people who work incredibly hard and have a good time along the way.

We are proud to offer a compensation package that includes our Crafty healthcare plan covering primary health, dental, and vision, an automatic 4% 401(k) contribution, paid time off, equipment certification courses, and parental leave. And, of course, it also includes Crafty-grade snacks, beverages, and fun events!

Lastly, this role offers a special opportunity: to have a major hand in shaping the future of a young, flourishing company. Your creativity, ambition, and work will steer the direction of our successes.

Compensation for this role is targeted at $100,000 - $125,000 per year in Chicago. Final offer amounts are determined by multiple factors, including location-based cost of living and candidate experience and expertise, and may vary from the range listed above.

Crafty provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, or genetics.

Category: Engineering Jobs

Tags: Data pipelines dbt Engineering ETL Pipelines PostgreSQL Python Security Snowflake SQL Testing

Perks/benefits: 401(k) matching Career development Health care Parental leave Snacks / Drinks Startup environment Team events

Region: North America
Country: United States
