Data Engineer

Calgary, Alberta, Canada; Toronto, Ontario, Canada; Vancouver, British Columbia, Canada

Benevity

Benevity's corporate purpose software offers the only integrated suite of community investment, employee, customer and nonprofit engagement solutions.

Meet Benevity

Benevity is the way the world does good, providing companies (and their employees) with technology to take social action on the issues they care about. Through giving, volunteering, grantmaking, employee resource groups and micro-actions, we help most of the Fortune 100 brands build better cultures and use their power for good. We’re also one of the first B Corporations in Canada, meaning we’re as committed to purpose as we are to profits. We have people working all over the world, including Canada, Spain, Switzerland, the United Kingdom, the United States and more!

As a Data Engineer, you will be a key member of our data team, responsible for designing, building, and maintaining scalable and efficient data pipelines. You will work closely with data scientists, analysts, and other teams to ensure the organization’s data infrastructure is optimized to meet the needs of business analytics, machine learning, and reporting. Your expertise in managing data workflows, ETL (Extract, Transform, Load) processes, and integrating various data sources will be vital in supporting data-driven decision-making across the organization.

If you’re eager to make a difference and thrive in a collaborative setting, we invite you to join our team!

What you’ll do:

  • Data Pipeline Development: Design, build, and maintain robust and scalable data pipelines that automate data extraction, transformation, and loading from various sources (databases, APIs, flat files, etc.).
  • Data Integration: Integrate disparate data sources into a unified, accessible format for analytics and reporting purposes.
  • Data Modeling: Develop data models, data warehousing solutions, and implement best practices for structuring and storing large volumes of data.
  • ETL Process Management: Develop and optimize ETL processes to handle high-throughput, real-time data streams and batch processing (a minimal sketch of such a batch job follows this list).
  • Performance Optimization: Monitor, optimize, and troubleshoot data processing workflows for improved speed, efficiency, and scalability.
  • Collaboration: Work with cross-functional teams, including Data Scientists, Analysts, and Business Intelligence teams, to deliver high-quality data solutions.
  • Data Quality Assurance: Ensure data quality, integrity, and compliance with industry standards and best practices.
  • Cloud and Big Data Technologies: Manage cloud-based data storage solutions and big data processing frameworks to enable real-time and batch analytics at scale.
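
As a rough illustration of the pipeline and ETL work described above, here is a minimal batch ETL sketch in Python. The API endpoint, field names and SQLite target are hypothetical placeholders rather than Benevity's actual stack; a production pipeline would point at the real source systems and a warehouse such as Redshift, BigQuery or Snowflake, and run under an orchestrator.

```python
"""Minimal batch ETL sketch in Python. The API endpoint, field names and
SQLite target below are hypothetical placeholders; a real pipeline would
point at the actual source systems and warehouse."""

import sqlite3

import requests  # widely used HTTP client; any equivalent works

API_URL = "https://api.example.com/v1/orders"  # hypothetical source API
DB_PATH = "warehouse.db"                       # stand-in for a real warehouse


def extract() -> list[dict]:
    """Pull raw order records from the source API."""
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()["orders"]


def transform(records: list[dict]) -> list[tuple]:
    """Apply a basic data-quality gate and coerce types."""
    rows = []
    for rec in records:
        if not rec.get("order_id") or rec.get("amount") is None:
            continue  # drop incomplete records instead of loading bad data
        rows.append((rec["order_id"], float(rec["amount"]), rec.get("created_at")))
    return rows


def load(rows: list[tuple]) -> None:
    """Upsert the cleaned rows so reruns of the job stay idempotent."""
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders "
            "(order_id TEXT PRIMARY KEY, amount REAL, created_at TEXT)"
        )
        conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)


if __name__ == "__main__":
    load(transform(extract()))
```

Keeping extract, transform and load as separate functions makes each stage testable on its own, and the upsert keeps reruns of the job idempotent.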

What you’ll bring:

  • Bachelor’s or Master’s degree in Computer Science, or equivalent professional experience.
  • 4+ years of professional experience in data engineering or a related field.
  • Solid understanding of data modeling, ETL processes, and data warehousing.
  • Experience working with large-scale data infrastructure, including batch and stream processing systems.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and teamwork abilities.
  • Ability to manage multiple projects and priorities in a fast-paced environment.
  • Passion for continuous learning and keeping up with the latest data engineering trends.

Preferred skills and tools:

  • Programming Languages: Proficient in Python, Java, or Scala for data processing and automation.
  • Big Data Technologies: Familiarity with Hadoop, Spark, Kafka, or similar distributed data processing frameworks.
  • ETL Tools: Experience with ETL tools like Apache NiFi, Talend, or Informatica.
  • Databases: Knowledge of SQL and NoSQL databases such as PostgreSQL, MySQL, MongoDB, or Cassandra.
  • Cloud Platforms: Proficiency with cloud services such as AWS (Redshift, S3, Lambda), Google Cloud (BigQuery, Dataflow), or Azure (Data Lake, SQL Data Warehouse).
  • Data Warehousing Solutions: Experience with data warehousing platforms like Snowflake, Amazon Redshift, or Google BigQuery.
  • Data Orchestration Tools: Familiarity with Apache Airflow, Prefect, or similar tools for workflow automation and scheduling (see the DAG sketch after this list).
  • Version Control: Experience with Git for source code management and collaboration.
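
To show how the orchestration tools above fit around a pipeline like the earlier sketch, here is a minimal Airflow DAG, assuming Airflow 2.4+ with its TaskFlow API; the DAG id, schedule and task bodies are illustrative placeholders only.

```python
"""Minimal Airflow DAG sketch (assumes Airflow 2.4+ with the TaskFlow API).
DAG id, schedule and task bodies are illustrative placeholders only."""

from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="orders_daily_etl",       # hypothetical pipeline name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
def orders_daily_etl():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull raw records from the source system.
        return [{"order_id": "o-1", "amount": "19.99"}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Placeholder: validate records and coerce types.
        return [{**r, "amount": float(r["amount"])} for r in records]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: write cleaned rows to the warehouse.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))


orders_daily_etl()
```

Splitting extract, transform and load into separate tasks lets the scheduler retry each stage independently and makes lineage visible in the Airflow UI.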

Great-to-haves:

  • AWS Certified Data Analytics - Specialty
  • Google Professional Data Engineer
  • Microsoft Certified: Azure Data Engineer Associate
  • Cloudera Certified Associate (CCA) Data Analyst
  • Certified Data Management Professional (CDMP)

Discover your purpose at work

We’re not employees, we’re Benevity-ites: people from all locations, backgrounds and walks of life who deserve more …

Innovative work. Growth opportunities. Caring co-workers. And a chance to do work that fills us with a sense of purpose.

If the idea of working on tech that helps people do good in the world lights you up … If you want a career where you’re valued for who you are and challenged to see who you can become …

It’s time to join Benevity. We’re so excited to meet you.

Where we work

At Benevity, we embrace a flexible hybrid approach to where we work, one that empowers our people to do great work, build strong relationships and look after their personal well-being. For those located near one of our offices, there’s no set requirement for in-office time, but we value the moments when coming together in person builds connection and collaboration. Whether it’s for onboarding, project work or a chance to align and bond as a team, we trust our people to make thoughtful decisions about when showing up in person matters most.

Join a company where DEIB isn’t a buzzword

Diversity, equity, inclusion and belonging are part of Benevity’s DNA. You’ll see the impact of our massive investment in DEIB daily, from our well-supported employee resource groups to the exceptional diversity on our leadership and tech teams.

We know that diverse backgrounds, experiences, skills and passions are what move our business and our people forward, so we're committed to creating a culture of belonging with equal opportunities for everyone to shine. 

That starts with a fair and accessible hiring process. If you want to feel seen, heard and celebrated, you belong at Benevity.

Candidates with disabilities who may require accommodations throughout the hiring or assessment process are encouraged to reach out to accommodations@benevity.com.
