Databricks Data Architect, Deloitte Global Technology
Toronto, Ontario, Canada
Full Time · Senior-level / Expert · USD 85K - 156K
Deloitte
Job Type: Permanent
Work Model: Remote
Reference code: 129248
Primary Location: Toronto, ON
All Available Locations: Toronto, ON
Our Purpose
At Deloitte, our Purpose is to make an impact that matters. We exist to inspire and help our people, organizations, communities, and countries to thrive by building a better future. Our work underpins a prosperous society where people can find meaning and opportunity. It builds consumer and business confidence, empowers organizations to find imaginative ways of deploying capital, enables fair, trusted, and functioning social and economic institutions, and allows our friends, families, and communities to enjoy the quality of life that comes with a sustainable future. And as the largest 100% Canadian-owned and operated professional services firm in our country, we are proud to work alongside our clients to make a positive impact for all Canadians.
By living our Purpose, we will make an impact that matters.
- Have many careers in one Firm.
- Enjoy flexible, proactive, and practical benefits that foster a culture of well-being and connectedness.
- Learn from deep subject matter experts through mentoring and on-the-job coaching.
Deloitte Global is the engine of the Deloitte network. Our professionals reach across disciplines and borders to develop and lead global initiatives. We deliver strategic programs and services that unite our organization.
What will your typical day look like?
Job Summary
The Databricks Data Architect is a senior technical leader responsible for building and optimizing a robust data platform in a financial services environment. In this full-time role, you will lead a team of 10+ data engineers and own the end-to-end architecture and implementation of the Databricks Lakehouse platform. You will collaborate closely with application development and analytics teams to design scalable data solutions that drive business insights. This position demands deep expertise in Databricks (Azure), hands-on experience with PySpark and Delta Lake, and strong leadership to ensure best practices in data engineering, performance tuning, and governance.
Key Responsibilities
- Lead, mentor, and manage a team of 10+ data engineers, providing technical guidance, code reviews, and career development to foster a high-performing team.
- Own the Databricks platform architecture and implementation, ensuring the environment is secure, scalable, and optimized for the organization’s data processing needs. Design and oversee the Lakehouse architecture leveraging Delta Lake and Apache Spark.
- Implement and manage Databricks Unity Catalog for unified data governance. Ensure fine-grained access controls and data lineage tracking are in place to secure sensitive financial data and comply with industry regulations.
- Provision and administer Databricks clusters (in Azure), including configuring cluster sizes, auto-scaling, and auto-termination settings. Set up and enforce cluster policies to standardize configurations, optimize resource usage, and control costs across different teams and projects.
- Collaborate with analytics teams to develop and optimize Databricks SQL queries and dashboards. Tune SQL workloads and caching strategies for faster performance and ensure efficient use of the query engine.
- Lead performance tuning initiatives for Spark jobs and ETL pipelines. Profile data processing code (PySpark/Scala) to identify bottlenecks and refactor for improved throughput and lower latency. Implement best practices for incremental data processing with Delta Lake, and ensure compute cost efficiency (e.g., by optimizing cluster utilization and job scheduling).
- Work closely with application developers, data analysts, and data scientists to understand requirements and translate them into robust data pipelines and solutions. Ensure that data architectures support analytics, reporting, and machine learning use cases effectively.
- Integrate Databricks workflows into the CI/CD pipeline using Azure DevOps and Git. Develop automated deployment processes for notebooks, jobs, and clusters (infrastructure-as-code) to promote consistent releases. Manage source control for Databricks code (using Git integration) and collaborate with DevOps engineers to implement continuous integration and delivery for data projects.
- Collaborate with security and compliance teams to uphold data governance standards. Implement data masking, encryption, and audit logging as needed, leveraging Unity Catalog and Azure security features to protect sensitive financial data.
- Stay up-to-date with the latest Databricks features and industry best practices. Proactively recommend and implement improvements (such as new performance optimization techniques or cost-saving configurations) to continuously enhance the platform’s reliability and efficiency.
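To illustrate the cluster-policy work described above, a policy definition in Databricks is a JSON document that constrains what users can configure when creating clusters. The sketch below is a minimal, hypothetical example (the node types, tag names, and limits are illustrative assumptions, not prescribed values for this role):

```json
{
  "node_type_id": {
    "type": "allowlist",
    "values": ["Standard_DS3_v2", "Standard_DS4_v2"],
    "defaultValue": "Standard_DS3_v2"
  },
  "autoscale.min_workers": {
    "type": "fixed",
    "value": 1
  },
  "autoscale.max_workers": {
    "type": "range",
    "maxValue": 8,
    "defaultValue": 4
  },
  "autotermination_minutes": {
    "type": "range",
    "minValue": 10,
    "maxValue": 60,
    "defaultValue": 30
  },
  "custom_tags.CostCenter": {
    "type": "fixed",
    "value": "data-engineering"
  }
}
```

A policy like this caps autoscaling, forces auto-termination, and pins a cost-center tag, which is one common way teams standardize configurations and control compute spend across projects.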
About the team
Enough about us, let’s talk about you
Qualifications
- Bachelor’s degree in Computer Science, Information Systems, or a related field
- 7+ years of experience in data engineering, data architecture, or related roles, with a track record of designing and deploying data pipelines and platforms at scale.
- Significant hands-on experience with Databricks (preferably Azure Databricks) and the Apache Spark ecosystem. Proficient in building data pipelines using PySpark/Scala and managing data in Delta Lake format.
- Strong experience working with cloud data platforms (Azure preferred, or AWS/GCP). Familiarity with Azure data services (such as Azure Data Lake Storage, Azure Blob Storage, etc.) and managing resources in an Azure environment.
- Advanced SQL skills with the ability to write and optimize complex queries. Solid understanding of data warehousing concepts and performance tuning for SQL engines.
- Proven ability to optimize ETL jobs and Spark processes for performance and cost efficiency. Experience tuning cluster configurations, parallelism, and caching to improve job runtimes and resource utilization.
- Demonstrated experience implementing data security and governance measures. Comfortable configuring Unity Catalog or similar data catalog tools to manage schemas, tables, and fine-grained access controls. Able to ensure compliance with data security standards and manage user/group access to data assets.
- Experience leading and mentoring engineering teams. Excellent project leadership abilities to coordinate multiple projects and priorities. Strong communication skills to effectively collaborate with cross-functional teams and present architectural plans or results to stakeholders.
Preferred
- Databricks Certified Data Engineer Professional or Databricks Certified Data Engineer Associate, or equivalent certifications in cloud data engineering or architecture (e.g., Azure Data Engineer, Azure Solutions Architect).
- Prior experience in the financial services industry or other highly regulated industries. Familiarity with financial data types, privacy regulations, and compliance requirements (e.g., handling PII or PCI data) is beneficial.
- Exposure to related big data and streaming tools such as Apache Kafka/Event Hubs, Apache Airflow or Azure Data Factory for orchestration, and BI/analytics tools (e.g., Power BI) is advantageous.
- Experience implementing CI/CD pipelines for data projects. Familiarity with Databricks Repos, Jenkins, or other CI tools for automated testing and deployment of data pipelines.
Tools & Technologies
- Databricks Lakehouse Platform: Databricks Workspace, Apache Spark, Delta Lake, Databricks SQL, MLflow (for model tracking).
- Data Governance: Databricks Unity Catalog for data cataloging and access control; Azure Active Directory integration for identity management.
- Programming & Data Processing: PySpark and Python for building data pipelines and Spark jobs; SQL for querying and analytics.
- Cloud Services (Azure-focused): Azure Databricks, Azure Data Lake Storage (ADLS Gen2), Azure Blob Storage, Azure Synapse Analytics or Azure SQL Database, Azure Key Vault (for secrets).
- DevOps & CI/CD: Azure DevOps (Azure Pipelines) for build/release pipelines, Git for version control (GitHub or Azure Repos); experience with Terraform or ARM templates for infrastructure-as-code is a plus.
- Other Tools: Project and workflow management tools (JIRA or Azure Boards), monitoring tools (Azure Log Analytics, Spark UI, or Databricks performance monitoring), and collaboration tools for documentation and design (Figma, Visio, Lucidchart, etc.).
Total Rewards
The salary range for this position is $85,000 - $156,000, and individuals may be eligible to participate in our bonus program. Deloitte is fair and competitive when it comes to the salaries of our people. We regularly benchmark across a variety of positions, industries, sectors, targets, and levels. Our approach is grounded in recognizing people's unique strengths and contributions and rewarding the value that they deliver.
Our Total Rewards Package extends well beyond traditional compensation and benefit programs and is designed to recognize employee contributions, encourage personal wellness, and support firm growth. Along with a competitive base salary and variable pay opportunities, we offer a wide array of initiatives that differentiate us as a people-first organization. On top of our regular paid vacation days, some examples include: $4,000 per year for mental health support benefits, a $1,300 flexible benefit spending account, firm-wide closures known as "Deloitte Days", dedicated days off for learning (known as Development and Innovation Days), flexible work arrangements, and a hybrid work structure.
Our promise to our people: Deloitte is where potential comes to life.
Be yourself, and more.
We are a group of talented people who want to learn, gain experience, and develop skills. Wherever you are in your career, we want you to advance.
You shape how we make impact.
Diverse perspectives and life experiences make us better. Whoever you are and wherever you’re from, we want you to feel like you belong here. We provide flexible working options to support you and how you can contribute.
Be the leader you want to be.
Some guide teams, some change culture, some build essential expertise. We offer opportunities and experiences that support your continuing growth as a leader.
Have as many careers as you want.
We are uniquely able to offer you new challenges and roles – and prepare you for them. We bring together people with unique experiences and talents, and we are the place to develop a lasting network of friends, peers, and mentors.
The next step is yours
At Deloitte, we are all about doing business inclusively – that starts with having diverse colleagues of all abilities. Deloitte encourages applications from all qualified candidates who represent the full diversity of communities across Canada. This includes, but is not limited to, people with disabilities, candidates from Indigenous communities, and candidates from the Black community in support of living our values, creating a culture of Diversity, Equity and Inclusion, and our commitment to our AccessAbility Action Plan, Reconciliation Action Plan and the BlackNorth Initiative.
We encourage you to connect with us at accessiblecareers@deloitte.ca if you require an accommodation for the recruitment process (including alternate formats of materials, accessible meeting rooms or other accommodations) or indigenouscareers@deloitte.ca for any questions relating to careers for Indigenous peoples at Deloitte (First Nations, Inuit, Métis).
By applying to this job you will be assessed against the Deloitte Global Talent Standards. We’ve designed these standards to provide our clients with a consistent and exceptional Deloitte experience globally.
Deloitte Canada has 20 offices with representation across most of the country. We acknowledge that Deloitte offices stand on traditional, treaty, and unceded territories in what is now known as Canada. We recognize that Indigenous Peoples have been the caretakers of this land since time immemorial, nurturing its resources and preserving its natural beauty. We acknowledge this land is still home to many First Nations, Inuit, and Métis Peoples, who continue to maintain their deep connection to the land and its sacred teachings. We humbly acknowledge that we are all Treaty people, and we commit to fostering a relationship of respect, collaboration, and stewardship with Indigenous communities in our shared goal of reconciliation and environmental sustainability.
Tags: Airflow Architecture AWS Azure Big Data CI/CD Computer Science Databricks Data governance Data pipelines Data Warehousing DevOps Engineering ETL GCP Git GitHub Jenkins Jira Kafka Machine Learning MLFlow Pipelines Power BI Privacy PySpark Python Scala Security Spark SQL Streaming Terraform Testing
Perks/benefits: Career development Competitive pay Equity / stock options Flex hours Flexible spending account Flex vacation Health care Salary bonus Wellness