Data Engineer (term)
Toronto, ON, CA
University of Toronto
The University of Toronto is a globally top-ranked public research university in Toronto, Ontario, Canada.
Date Posted: 10/01/2024
Req ID: 39886
Faculty/Division: Operations and Real Estate Partnerships
Department: Inst Research and Data Governance
Campus: St. George (Downtown Toronto)
Position Number: 00045541
Description:
About us:
The University’s Institutional Research and Data Governance (IRDG) office provides leadership and administrative support for the institution’s data strategy. The IRDG office’s mandate is to:
- Provide reporting and analytics services within our distributed institutional research model: lead institution-wide analysis projects, provide training opportunities for divisional analysts, implement new analytics tools and platforms, and improve access to curated institutional datasets
- Support external reporting of institutional data (SMA metrics, key performance indicators for governance, international rankings, data exchanges with peer institutions)
- Implement an institution-wide data governance program to support the University in strategically harnessing data to achieve institutional goals
Your opportunity:
Under the direct supervision of the Manager, Data Engineering and Business Intelligence, the incumbent will be responsible for designing and developing data pipelines, data marts, and reporting solutions using cloud services such as Microsoft Azure Data Factory. Ongoing oversight and change management of pipelines, and the integration of appropriate governance and best practices, are required. This role involves working on data migration projects, integrating various data sources, and developing solutions that align with data models. Strong expertise in SQL is essential for building and optimizing queries. The incumbent will also be expected to ensure efficient data integration and contribute to the overall success of data-driven initiatives.
Your responsibilities will include:
- Writing complex technical code
- Designing, testing, and modifying programming code
- Analyzing and writing programming code structures based on user requirements
- Evaluating programming code to ensure validity, compatibility, and adherence to appropriate standards
- Developing technical application implementation plans
- Commenting on programming code for the purposes of standardization and consistency
- Creating complex and technical documentation and user support guides
Essential Qualifications:
- Bachelor's Degree in Computer Science, Applied Mathematics, or Engineering, or an acceptable combination of equivalent experience.
- Minimum of four years' working experience as a database/ETL developer.
- Experience with data warehouse projects.
- Demonstrated experience working with large, complex data sets and analysing high volumes of data.
- Experience in the creation and debugging of databases.
- Strong conceptual knowledge of, and hands-on experience with, building and maintaining physical and logical data models.
- Experience and familiarity with Microsoft Azure services such as Azure Data Lake and Azure SQL.
- Experience in automating workflows and monitoring data processes.
- Experience in building and managing data pipelines using Azure Data Factory.
- Experience with data transformation tools and techniques (e.g., mapping data flows).
- Exceptional analytical skills, with fluency in tools such as MySQL and strong programming skills in SQL, Python, Shell, Java, PHP, and Informatica ETL.
- Ability to design, build, and maintain the business’s ETL pipelines for the data warehouse.
- Demonstrated expertise in data modelling and query performance tuning on SQL Server, MySQL, and Azure platforms.
- Ability to resolve performance and data flow issues efficiently.
Assets (Nonessential):
- Understanding of machine learning capabilities and how to implement them.
- Experience with Tableau, Domo, or other business intelligence tools is an asset.
- Ability to learn quickly and a proven track record of continuous applied learning.
- Knowledge of Informatica is an asset.
To be successful in this role you will be:
- Cooperative
- Insightful
- Intuitive
- Motivated self-learner
- Persuasive
Please note:
- This is a 1-year term role.
Closing Date: 10/15/2024, 11:59PM ET
Employee Group: USW
Appointment Type: Ancillary Operations
Schedule: Full-Time, 1 year term position
Pay Scale Group & Hiring Zone:
USW Pay Band 12 -- $79,874, with an annual step progression to a maximum of $102,147. Pay scale and job class assignment are subject to determination pursuant to the Job Evaluation/Pay Equity Maintenance Protocol.
Job Category: Engineering / Technical
Lived Experience Statement
Candidates who are members of Indigenous, Black, racialized and 2SLGBTQ+ communities, persons with disabilities, and other equity deserving groups are encouraged to apply, and their lived experience shall be taken into consideration as applicable to the posted position.