Senior Data Modeler - Contract

Toronto, Ontario, Canada

Benevity

Benevity's corporate purpose software offers the only integrated suite of community investment, employee, customer and nonprofit engagement solutions.


Meet Benevity

The world’s coolest companies (and their employees) use Benevity’s technology to take social action on the issues they care about. Through giving, volunteering, grantmaking, employee resource groups and micro-actions, we help most of the Fortune 100 brands build better cultures and use their power for good. We’re also one of the first B Corporations in Canada, meaning we’re as committed to purpose as we are to profits. We have people working all over the world, including Canada, Spain, Switzerland, the United Kingdom, the United States and more!

Benevity’s software architecture has evolved to include a diverse technology stack. The front-end application, built mainly with VueJS, is designed for both desktop and mobile web rendering. Our back-end systems (some Java Spring Boot, some PHP) manage data processing, interface with external providers, and ensure robust security. We run and operate our systems in the AWS cloud, leveraging cloud-native technology where possible. We emphasize clean, maintainable code and use Git for version control and collaboration. Additionally, our platform integrates with various external services for functionalities like email communication, content storage, and server-to-server interactions.

Our culture is driven by our core value of “we-are-we,” and as a Senior Data Modeler you will work in an outcome-driven environment where collaboration with your product, design and engineering counterparts is paramount.

If you’re eager to make a difference and thrive in a collaborative setting, we invite you to join our team!

What you’ll do:

The Senior Data Modeler will be responsible for the design, implementation, and maintenance of complex data models that support our business operations and analytics. This role requires a deep understanding of data modeling techniques, database design, and data management practices. The ideal candidate will be a strategic thinker with extensive experience in translating business requirements into robust data models.

  • Design and develop logical and physical data models to meet the needs of various business applications
  • Ensure data models are aligned with business requirements and best practices
  • Create and maintain data dictionaries and metadata repositories
  • Design and optimize database structures to support high-performance and scalability
  • Collaborate with database administrators to ensure optimal performance of data models
  • Implement indexing, partitioning, and other database optimization techniques
  • Work closely with data architects, data engineers, and business analysts to integrate data from various sources
  • Define data integration standards and practices, ensuring data consistency, quality, and integrity across different systems
  • Collaborate with cross-functional teams to understand business requirements and translate them into effective data models
  • Document data models, data flows, and business rules
  • Ensure compliance with data governance policies and industry regulations
  • Conduct regular audits and reviews of data models and databases to ensure compliance and optimal performance
  • Identify opportunities for process improvements and implement solutions to enhance data modeling practices

What you’ll bring:

  • Degree in Computer Science or equivalent professional experience 
  • 10 years of experience in data modeling, database design, and data management
  • 10 years of experience in data engineering, with a focus on data architecture, ETL processes, and big data technologies
  • 10 years of hands-on experience designing and deploying enterprise data warehouse models
  • 5+ years of proficiency in programming languages such as Python, Java, and SQL
  • Hands-on experience working with data warehousing technologies (BigQuery, Redshift, Snowflake)
  • Strong understanding of database management systems (SQL and NoSQL)
  • Expertise in big data technologies such as Hadoop, Spark, dbt, Airflow, Apache Beam, and Kafka
  • Experience with cloud-based platforms and building data engineering solutions (AWS, Azure or GCP)
  • Ability to provide architectural guidance and big data engineering expertise for use cases requiring federated queries, data ingestion, and distributed computing
  • Excellent problem-solving skills and attention to detail
  • Hands-on experience writing and optimizing SQL-based code
  • Experience with database performance optimization and tuning
  • Familiarity with Continuous Integration/Continuous Deployment (CI/CD) Pipelines (e.g., Jenkins, GitLab CI/CD, AWS CodePipeline, GCP Cloud Build)
  • A strength in pragmatically designing, building, and deploying scalable, highly available systems
  • An ability to think abstractly and work comfortably with ambiguous or undefined problems
  • Excellent communication skills: you understand user needs and can translate them into actionable pieces of work

Great-to-haves:

  • GCP Professional Data Engineering certification
  • AWS Data Engineer certification
  • Azure Data Engineer certification
  • Experience with dbt, Databricks, Snowflake, and GCP/Azure/AWS

Discover your purpose at work

We are not employees, we are Benevity-ites. From all locations, backgrounds and walks of life, who deserve more …

Innovative work. Growth opportunities. Caring co-workers. And a chance to do work that fills us with a sense of purpose.

If the idea of working on tech that helps people do good in the world lights you up ... If you want a career where you’re valued for who you are and challenged to see who you can become …

It’s time to join Benevity. We’re so excited to meet you.

Where we work

At Benevity, we have developed a Community First approach: we design our people’s experience around the goals of building a strong community and culture, achieving stellar execution of our business goals and social mandate, and ensuring Benevity-ites thrive. For those who live within a reasonable commuting distance of an office, we split our time between working in the office and working from home to get the best of both, with the requirement that we spend at least 50% of our time in the office.

Join a company where DEIB isn’t a buzzword

The diverse backgrounds, experiences, skills and passions of our people make it possible for us to keep innovating as the market leader in our space.

Diversity, equity, inclusion and belonging are part of Benevity’s DNA. You’ll see the impact of our massive investment in DEIB daily — from our Black Employee Network making space for us to have difficult conversations to our Pride events and the exceptional diversity on our leadership and tech teams.

We strive to build a strong culture of belonging so that every Benevity-ite feels included and can thrive as their authentic selves — in a place where everyone has an equitable opportunity to shine!

