Analytics Engineer (dbt)
Canada
Tucows
Tucows offers Domain Name Services, Fiber Internet Services and SaaS through our businesses Tucows Domains, Ting and Wavelo. We're a tech company headquartered in Toronto, Canada, making the internet better since 1993. Tucows (NASDAQ:TCX, TSX:TC) is possibly the biggest Internet company you’ve never heard of: we started as a simple shareware site in 1993 and have since grown into this stable of businesses.
We embrace a people-first philosophy that is rooted in respect, trust, and flexibility. We believe that whatever works for our employees is what works best for us. It’s also why the majority of our roles are remote-first, meaning you can work from anywhere you can connect to the Internet!
Today, over one thousand people work in over 20 countries to help us make the Internet better. If this sounds exciting to you, join the herd!
About the opportunity
We are seeking an experienced and versatile Analytics Engineer to join our dynamic team. In this role, you will apply your advanced analytics expertise to extract actionable insights from raw data. The ideal candidate will have a strong background in data engineering, analytics, and machine learning, with the ability to drive data-driven decision-making across the organization.
What You Will Be Doing
- Data Modeling & Pipelines: Design, develop, and maintain complex data models in our Snowflake data warehouse. Utilize dbt (Data Build Tool) to create efficient data pipelines and transformations for our data platform.
- Snowflake Intelligence Integration: Leverage Snowflake Intelligence features (e.g., Cortex Analyst, Cortex Agents, Cortex Search, AISQL) to implement conversational data queries and AI-driven insights directly within our data environment. Develop AI solutions that harness these capabilities to extract valuable business insights.
- Advanced SQL & Analysis: Design and build advanced SQL queries to retrieve and manipulate complex data sets. Dive deep into large datasets to uncover patterns, trends, and opportunities that inform strategic decision-making.
- Business Intelligence (BI): Develop, maintain, and optimize Looker dashboards and LookML to effectively communicate data insights. Leverage Looker’s conversational analytics and data agent features to enable stakeholders to interact with data using natural language queries.
- Cross-Functional Collaboration: Communicate effectively with stakeholders to understand business requirements and deliver data-driven solutions. Identify opportunities for implementing AI/ML/NLP technologies in collaboration with product, engineering, and business teams.
- Programming & Automation: Write efficient Python code for data analysis, data processing, and automation of recurring tasks, and use shell scripting and command-line tools to support data workflows and system tasks. Ensure code is well-tested and integrated into automated workflows (e.g., via Airflow job scheduling); a minimal illustrative sketch of this kind of work follows this list.
- Visualization & Presentation: Create compelling visualizations and presentations to deliver analytical insights and actionable recommendations to senior management and cross-functional teams. Tailor communication of complex analyses to diverse audiences.
- Innovation & Best Practices: Stay up-to-date with industry trends, emerging tools, and best practices in data engineering and analytics (with a focus on dbt features, Snowflake’s latest offerings and BI innovations). Develop and implement innovative ideas to continuously improve our analytics stack and practices.
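To give a concrete, purely illustrative flavour of the work described above, the short Python/pandas sketch below rolls raw events up into a daily metric of the kind a dbt model or an Airflow-scheduled task might feed into a Looker dashboard. The dataset, column names, and the "active domains" metric are hypothetical examples, not an actual Tucows pipeline.

```python
# Illustrative only: a small pandas transformation of the kind this role automates.
# The events, column names, and "active domains" metric are hypothetical.
import pandas as pd

# Hypothetical raw events: one row per domain status observation.
raw = pd.DataFrame(
    {
        "event_date": ["2024-06-01", "2024-06-01", "2024-06-02", "2024-06-02"],
        "domain": ["example.com", "example.org", "example.com", "example.net"],
        "status": ["active", "active", "active", "expired"],
    }
)

# Rollup: count distinct active domains per day, the sort of summary a
# dbt model or an Airflow-scheduled task might produce for a dashboard.
daily_active = (
    raw[raw["status"] == "active"]
    .groupby("event_date", as_index=False)
    .agg(active_domains=("domain", "nunique"))
)

print(daily_active)
```

In practice a transformation like this would more often live as a dbt SQL model in Snowflake, with Python reserved for orchestration, ad-hoc analysis, and automation.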
What You Bring
- Education: Bachelor’s degree in Computer Science, Statistics, or a related field; Master’s degree preferred.
- Experience: 2+ years of experience in data analytics or a related field, with significant exposure to AI and Machine Learning applications in analytics.
- SQL Expertise: Advanced SQL skills with experience in writing and optimizing complex queries on large-scale datasets.
- dbt Proficiency: Hands-on experience with dbt (Data Build Tool) and its features for building, testing, and documenting data models.
- Data Modeling: Expert-level knowledge of data modeling and data warehouse concepts (e.g., star schema, normalization, slowly changing dimensions).
- Snowflake & AI Capabilities: Experience with Snowflake’s Data Cloud platform and familiarity with its advanced AI capabilities (Snowflake Intelligence – Cortex Analyst, Cortex Agents, Cortex Search, AISQL, etc.) is highly preferred.
- Business Intelligence Tools: Strong skills in Looker data visualization and LookML (including familiarity with Looker’s conversational AI and data agent capabilities) or similar BI tools.
- AI Agents & Automation: Experience with AI agents or generative AI tools to optimize workflows and service delivery (such as creating chatbots or automated analytic assistants) is a plus.
- Real-Time & Streaming Data: Experience with real-time data processing and streaming technologies (e.g., Kafka, Kinesis, Spark Streaming) for handling continuous data flows.
- Programming: Proficient in Python for data analysis and manipulation (pandas, NumPy, etc.), with the ability to write clean, efficient code. Experienced with shell scripting and command-line tools for automating workflows and data processing tasks.
- ETL/Orchestration: Familiarity with ETL processes and workflow orchestration tools like Apache Airflow (or similar scheduling tools) for automating data pipelines, along with Docker for local development and testing.
- Cloud Platforms: Experience with cloud platforms and services (especially AWS or GCP) for data storage, compute, and deployment.
- Version Control & CI/CD: Solid understanding of code versioning (Git) and continuous integration/continuous deployment (CI/CD) processes in a data engineering context.
- Agile Methodology: Familiarity with agile development methodologies and ability to work in a fast-paced, iterative environment.
- Soft Skills: Excellent communication and presentation skills, with critical thinking and problem-solving abilities. Proven track record of working effectively on cross-functional teams and translating business needs into technical solutions.
- Data Governance & Ethics: Experience implementing data governance best practices, ensuring data quality and consistency. Knowledge of data ethics, bias mitigation strategies, and data privacy regulations (e.g., GDPR, CCPA) with a commitment to compliance.
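As a purely illustrative aside on the data-quality point above: in a dbt project these checks would normally be declared as not_null and unique tests, but the standalone Python sketch below shows the same idea. The table, key columns, and values are hypothetical.

```python
# Illustrative only: a standalone data-quality check of the kind described above.
# The table, key columns, and values are made up for the sketch.
import pandas as pd

# Hypothetical model output keyed by (customer_id, snapshot_date).
model = pd.DataFrame(
    {
        "customer_id": [1, 2, 2, 3],
        "snapshot_date": ["2024-06-01", "2024-06-01", "2024-06-01", None],
        "plan": ["fiber", "fiber", "fiber", "domains"],
    }
)

key_cols = ["customer_id", "snapshot_date"]

# Check 1: key columns must not contain nulls.
null_rows = model[model[key_cols].isna().any(axis=1)]

# Check 2: the key must be unique across rows.
duplicate_rows = model[model.duplicated(subset=key_cols, keep=False)]

if not null_rows.empty or not duplicate_rows.empty:
    print(
        f"data-quality issues: {len(null_rows)} null-key rows, "
        f"{len(duplicate_rows)} duplicate-key rows"
    )
```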
Nice to Have
- Community & Open Source: Contributions to open-source projects or active participation in data community initiatives.
- AI/ML Skills: Experience with applying Artificial Intelligence/Machine Learning techniques in analytics (e.g., building predictive models for forecasting, churn prediction, fraud detection, etc.). Practical experience deploying models and using MLOps/DataOps practices for lifecycle management.
- Statistical Background: Solid foundation in statistics and probability, with the ability to apply various modeling techniques and design A/B tests or experiments (a minimal illustrative sketch follows this list).
- Additional Programming: Knowledge of additional programming or query languages (e.g., R, Scala, Julia, Spark SQL) that can be applied in analytics workflows.
- Certifications: Certifications in relevant data technologies or cloud platforms (such as Snowflake, AWS, GCP, or Looker) demonstrating your expertise.
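As referenced in the Statistical Background bullet above, here is a minimal, purely illustrative two-proportion z-test in standard-library Python; the conversion counts are made-up numbers, not real experiment data.

```python
# Illustrative only: a minimal two-proportion z-test of the kind used for A/B tests.
# Conversion counts below are hypothetical.
from math import sqrt
from statistics import NormalDist

# Hypothetical results: conversions / visitors for control (A) and variant (B).
conv_a, n_a = 120, 2400
conv_b, n_b = 150, 2380

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error under H0
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))             # two-sided p-value

print(f"A: {p_a:.3%}  B: {p_b:.3%}  z={z:.2f}  p={p_value:.3f}")
```

Libraries such as scipy.stats or statsmodels offer equivalent tests; the standard-library version is used here only to keep the sketch dependency-free.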
The ideal candidate will be a self-starter with a passion for data and analytics, comfortable working in a fast-paced environment, adapting to new technologies, and driving innovation in our data practices. They should be able to navigate complex data landscapes, uncover meaningful insights, and communicate those findings effectively to both technical and non-technical audiences. The ability to stay current with industry trends and continuously learn new technologies is essential in this role.
This role offers the opportunity to make a significant impact on our organization's data strategy and contribute to critical business decisions through advanced analytics. The successful candidate will play a key role in shaping our data culture and driving the adoption of cutting-edge data technologies and methodologies.
If you are a data enthusiast with a track record of delivering impactful analytics solutions and a desire to push the boundaries of what's possible with data, we want to hear from you!
The base salary range for this position is $73,440 to $86,400 CAD; the range shown applies to Canadian residents, and ranges for other countries will differ. The range may vary based on a number of factors including, but not limited to, location, experience, and qualifications. Tucows believes in a total rewards offering that includes fair compensation and generous benefits.
Want to know more about what we stand for? At Tucows we care about protecting the open Internet, narrowing the digital divide, and supporting fairness and equality.
We also know that diversity drives innovation. We are committed to inclusion across race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status or disability status. We celebrate multiple approaches and diverse points of view.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
Tucows and its subsidiaries participate in the E-verify program for all US employees.
Learn more about Tucows, our businesses, culture and employee benefits on our site here. #LI-Remote #LI-NA1