Senior Data Engineer, Risk Technology

Toronto - 16 York St

The Investment Management Corporation of Ontario

IMCO is an independent, long-term investor for Ontario's public-sector institutions.



At IMCO, our talent is among the best!  IMCO offers a uniquely stimulating and rewarding environment where you can help build and drive organizational transformation, all while seeking to challenge yourself, learn, and grow your career. 

We offer a culture of collaboration and passion, creating unwavering value for the clients we serve. Our vision is to be the partner of choice for Ontario’s public sector funds and to build a high-performing, value-based asset management firm in the heart of downtown Toronto.

If you are ready to drive best-in-class service, and join a collaborative, motivated, and fun team of professionals, we’re ready to offer you a great place to work with exciting opportunities for growth and development.

If you want to use your expertise to drive strategic business outcomes, then we want you at IMCO!
 

The Risk Technology team is part of the Data Technology group at IMCO. The team works predominantly with IMCO's Investment Risk group to meet its technology and reporting needs through small and medium-sized initiatives and multi-year projects, and by providing complete support for production processes.

As a key member of the Risk Technology team, the Senior Data Engineer will work closely with the Lead Data Engineer, Business System Analysts, and stakeholders from Investment Risk to design, build, and support the cloud-based infrastructure for the data pipeline, including maintenance, improvement, cleansing, and manipulation of data and analytics across IMCO’s data platforms and external Risk platforms. The Senior Data Engineer will also design and develop reports and analytical dashboards in Power BI.

The incumbent will report to the Senior Manager, Data Technology.

Responsibilities:

  • Contribute to the design and implementation of data architecture and technology infrastructure for the new data and analytics platform, built on a Databricks and Snowflake data stack.

  • Build custom services and workflows to complement existing tools where required, e.g., converting data to ingestible formats and calling external web services and third-party APIs.

  • Create database objects with a strong understanding of data modeling.

  • Write complex SQL for ETL and data extraction, producing high-performance, easy-to-maintain code.

  • Develop and maintain automated build and deployment processes for all solutions using cloud tools and CI/CD practices.

  • Develop functional specifications and systems configuration documentation to support solution rollout.

  • Design, develop, and maintain the tabular data model, reports, and analytical dashboards in the Power BI workspace.

  • Partner closely with business users to understand requirements and rapidly prototype reports to maximize end-user involvement and solicit feedback.

  • Work closely with business systems analysts and SMEs on projects and operations to deliver effective support, investigate root causes, recommend changes, and maintain accurate documentation.

  • Reverse-engineer, investigate, and document data flow for operational procedures and support.

  • Evaluate existing data flow and operations, develop processes for effective maintenance, monitoring, and performance tuning, and recommend improvements for operational efficiency.

  • Enhance environment stability by working with others to set up and maintain production configurations, and continuously improve service levels with all business and technical stakeholders.

  • Automate manual processes, optimize data delivery, and redesign infrastructure for greater efficiency and scalability.

  • Report on the health of platform operations, including dedicated support for database and ETL processes.

  • Collaboratively share knowledge of design, coding, and analysis techniques across the team.

  • Facilitate testing of solutions and production implementation planning while performing as a trusted liaison between IT, Investment Risk stakeholders, and external solution providers.

  • Explore innovative solutions to leverage Databricks data & analytics services to solve complex technical and business problems with a forward-thinking approach.

  • Incorporate operational risk and systems performance management in all initiatives.

  • Support BI solutions backed by data warehouses.

What you need to succeed:


Know-How

  • Minimum 7 years of experience in data warehousing, designing complex SQL queries, tables, views, and procedures, including performance tuning and query optimization.

  • Undergraduate degree in Computer Science, Engineering, or a related discipline, with experience building and maintaining IT data operations, ideally in asset management or investments.

  • Experience in ETL processing using Azure Data Factory and Airflow.

  • Intermediate to senior-level experience operationalizing data platforms with Azure data solutions, with hands-on experience in Azure Databricks and Snowflake. Experience with Synapse is required for migrating existing code to Databricks.

  • Solid understanding of dimensional data models.

  • Hands-on experience with source control management systems and continuous integration/deployment.

  • Experience designing and developing BI reports and analytical dashboards.

  • Experience designing and maintaining tabular data models in Power BI.

  • Experience with Power BI administration through the Power BI portal.

  • Deep knowledge of DAX syntax for tabular data models and Power BI calculations.

  • Experience with Azure DevOps for code repository and building pipelines.

  • Proficient in Python development in the PySpark environment.

  • Track record of working in an exploratory capacity to innovate, benchmark, and make recommendations that improve the efficiency and effectiveness of design and data operations.

  • Ability to self-direct, manage priorities, and meet deadlines.

  • Superior tactical, analytical, evaluative, and problem-solving abilities to translate business requirements into technical specifications.

  • Strong domain knowledge in IT service management, including incident management, change management, configuration management, and operations management.

  • Good understanding of agile practices, with demonstrated experience in operating in Kanban or Scrum for delivery.

Desirable:

  • Development experience with .NET C#.

  • Development experience in Azure Function Apps.

  • Exposure to data science, machine learning, and LLMs is an asset.

  • Development experience with Databricks and Snowflake is a plus.

  • Prior experience in the investment management or capital markets space.

We thank all applicants; however, only those selected for an interview will be contacted.

Our hybrid work model offers flexibility and provides our employees with the opportunity to lead a well-balanced life. Our Corporation’s offices located at 16 York Street, Suite 2400, Toronto, ON M5J 0E6 provide a welcoming space for employees to gather, work collaboratively, and grow together.

IMCO is committed to providing accommodation for people with disabilities in its recruitment process. Please advise IMCO if you require an accommodation and we will work with you to meet your needs. Candidates being considered for this position will be required to submit to a background screening.

