Senior Manager, Data Enablement & Business Intelligence
Toronto
Equitable Bank
At Equitable Bank, we specialize in providing branchless financial services that meet the unique needs of all Canadians. Our range of mortgages, savings accounts, and investment options is designed to offer the right solutions to match any...

As a key member of our Insights & Analytics team, you’ll be the architect behind the dashboards, data pipelines, and tooling that power real-time decision-making across EQ Bank. You’ll work as an individual contributor while partnering with senior leaders, data scientists, and engineers to unlock the full potential of our data ecosystem, driving smarter strategies, faster execution, and measurable impact.
This is more than a BI role. It’s a chance to shape the future of data at one of Canada’s most innovative digital banks.
Key Responsibilities:
- 1. Product & Executive Dashboards + Self-Serve Analytics (60%) · Define and execute a bold dashboarding strategy that empowers teams across Product, Strategy, and Marketing.· Build dynamic, scalable dashboards that deliver real-time insights and drive autonomous decision-making.· Lead deep-dive analyses using SQL and Python to uncover trends, diagnose issues, and surface opportunities.· Champion self-serve analytics by automating workflows and reducing reliance on centralized data teams.· Design and deploy temporary ETL/ELT pipelines to support agile product experimentation and rapid insight delivery.
- 2. Data Strategy & Enablement (30%) · Partner with the Head of Insights & Analytics to shape EQ Bank’s data acquisition roadmap.· Collaborate with Tech & Engineering to ensure seamless data integration, quality, and compliance.· Drive the migration to Azure Fabric, ensuring scalable, reliable access to analytics-ready data.· Proactively identify and resolve data availability gaps to ensure 24/7 access for analytics teams
- 3. Data Tooling & Infrastructure (10%) · Lead the development of a modern data tooling strategy in collaboration with Enterprise Data and Cloud Ops.· Build business cases for new tools that enhance product development, data access, and operational efficiency.· Own and optimize real-time data pipelines for marketing and communications—ensuring they’re fast, resilient, and future-proof
Knowledge/Skill Requirements:
- Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, Business Analytics, or a related field.
- 8+ years of hands-on experience in business intelligence, analytics, or data engineering, with a focus on data systems, dashboarding, and product analytics.
- Proven track record of building and scaling data pipelines, APIs, and real-time analytics solutions in cloud environments, particularly Azure.
- Experience in leading cross-functional teams and collaborating with Product, Strategy, Marketing, and Engineering teams to deliver data-driven strategies and solutions.
- Expertise in SQL and experience with advanced data modeling and data manipulation techniques.
- Technical Expertise (an illustrative workflow sketch follows this list):
  - Strong proficiency in Python or Scala, with hands-on experience building distributed systems and complex data pipelines to support high-volume, real-time data workflows.
  - Proven experience architecting and implementing robust, scalable ETL/ELT pipelines, plus the ability to design, develop, and deploy APIs to integrate and automate data flows across systems.
  - In-depth knowledge of building scalable systems capable of handling large datasets and high transaction volumes; strong experience in system architecture, ensuring high availability, performance, and reliability of data infrastructure.
  - Hands-on experience with Azure-based tools such as Azure Data Factory, Power BI, and Azure ML for designing and implementing cloud-based data solutions; familiarity with Azure Databricks or similar platforms (e.g., Snowflake) is highly desirable.
  - Deep understanding and hands-on experience with modern data warehousing solutions (e.g., Snowflake, BigQuery, Redshift), including data modeling, data lake integration, and performance optimization.
  - Strong experience with big data technologies such as Spark, Hadoop, and other distributed processing tools, with a focus on optimizing data processing and query performance.
  - Expertise in advanced SQL for complex data manipulation, optimization, and data modeling: writing efficient queries over large datasets, ensuring data quality, and applying techniques such as window functions, subqueries, and indexing for performance.
  - Experience with tools like dbt, Airflow, Prefect, and Fivetran to manage data workflows and automate tasks.
  - Familiarity with RESTful API design for integrating various data sources and ensuring seamless flow across data pipelines.
  - Strong experience building interactive, insightful data visualizations using tools like Power BI, Tableau, or custom web-based solutions.
  - Proficiency in JavaScript for implementing custom visualizations, building dynamic dashboards, and interacting with APIs in the front-end environment.
  - Knowledge and hands-on experience with messaging queues (e.g., Kafka, RabbitMQ, SQS) for reliable communication and data processing in distributed systems, including real-time data ingestion and event-driven architectures.
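As a purely illustrative sketch of how the workflow tooling and advanced SQL above fit together, the following combines a windowed query with a minimal Airflow DAG. It assumes a recent Airflow release (2.4+ for the `schedule` argument); the DAG, task, table, and column names are hypothetical.

```python
"""Illustrative only: a minimal Airflow DAG sketch combining a window-function
query with a scheduled refresh task. All names here are hypothetical."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# A windowed SQL query of the sort referenced above (hypothetical table/columns):
# a 7-day rolling average balance per account.
DAILY_BALANCE_TREND = """
SELECT
    account_id,
    balance_date,
    AVG(balance) OVER (
        PARTITION BY account_id
        ORDER BY balance_date
        ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
    ) AS balance_7d_avg
FROM analytics.daily_balances
"""


def refresh_balance_trend() -> None:
    # In practice this would run DAILY_BALANCE_TREND against the warehouse
    # and land the result in an analytics-ready table; here it only prints it.
    print(DAILY_BALANCE_TREND)


with DAG(
    dag_id="balance_trend_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="refresh_balance_trend",
        python_callable=refresh_balance_trend,
    )
```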