Data Engineering Team Lead
Remote, Canada
Geoforce
Geoforce provides visibility across your field operations with real-time GPS asset tracking technology. Since 2007, Geoforce has been an industry leader in GPS tracking, providing reliable solutions for over 250,000 assets across 90+ countries. Our comprehensive platform combines rugged GPS devices with advanced software, supported by global satellite and cellular networks. From vehicles to heavy equipment, we deliver the visibility and control businesses need to optimize operations. Trusted by industry leaders like Southwest Airlines, DHL, and SLB (Schlumberger), Geoforce is the go-to partner for smarter asset management. Learn more at www.geoforce.com.
As a rapidly growing company committed to technology innovation and delivering high-value services to its clients, Geoforce is constantly looking for high-integrity, well-rounded professionals who thrive on challenges, are fascinated by technology, exhibit passion and pride, and don't mind rolling up their sleeves to get a job done.
What We Need
We are looking for a Data Engineering Team Lead who can serve as a player-coach, balancing hands-on technical work with team leadership. In this role, you’ll guide a team in designing and implementing next-generation data infrastructure while directly contributing to critical projects. You’ll provide strategic direction, ensure alignment with broader business goals, and foster a growth-oriented environment. The ideal candidate will be both technically proficient and an empathetic leader, able to inspire and develop a high-performing data team.
The Team Lead will play five critical roles:
Senior Engineer: You will work as part of the development team, taking on design and coding tasks to deliver features.
People Manager: You will promote continuous learning, support career development, and foster a high-performing team environment that thrives on collaboration and skill growth.
Product Trio Partner: Collaborating closely with Product Management, you will play a central role in discovery and innovation, ensuring clarity and alignment in delivering customer-focused solutions.
Technical Lead: You will guide and enforce best practices in architecture, engineering, and agile processes while managing and reducing technical debt.
Delivery Lead: You will own the analysis, design, planning, and execution of delivery and discovery initiatives for your team.
Job Duties
Hands-On Leadership: Act as a player-coach by balancing technical responsibilities with team leadership. Provide mentorship and support for engineers while also taking direct development responsibility for key data engineering initiatives.
Data Architecture and Design: Lead the design, development, and deployment of highly scalable and reliable data infrastructure using modern tools like Snowflake, Databricks, and cloud-based services. Continuously refine the data architecture to meet evolving business and technical needs.
ETL and Data Pipelines: Design, build, maintain, and optimize data pipelines and ETL processes using tools like Airbyte and Fivetran for efficient data ingestion, transformation, and storage. Ensure these pipelines are robust, scalable, and production-ready.
Collaboration: Work closely with cross-functional teams, including data analysts, product managers, software engineers, and business stakeholders, to understand data requirements and deliver high-value data solutions. Utilize Looker and Power BI to provide actionable insights and drive decision-making.
Infrastructure Management: Oversee the management of data infrastructure using modern cloud-based environments (AWS preferred). Implement best practices for infrastructure as code, configuration management, and monitoring (e.g., Terraform, Ansible, Datadog).
Technical Debt Management: Proactively manage and mitigate technical debt within the data platform, balancing the need for innovation with system stability and scalability.
Innovation and Growth: Drive innovation in data engineering by experimenting with new technologies, techniques, and tools. Cultivate a culture of learning and continuous improvement within the team.
Data Governance and Quality: Establish and maintain data governance standards to ensure the integrity, consistency, and accuracy of data. Implement techniques for monitoring data infrastructure reliability and correctness.
Production Operations: Take ownership of the operational reliability of data systems, including on-call support for critical data infrastructure components.
Critical
Hands-on experience with building and operating data pipelines, ETL frameworks, and data infrastructure in production environments.
Expertise in Airbyte, Fivetran, Snowflake, and Databricks for data warehousing, data integration, and analytics.
Strong proficiency in SQL and working with large datasets, with experience building data models for Looker and Power BI.
Expertise in Apache Kafka for real-time data streaming and event-driven architectures.
Experience with Infrastructure as Code and configuration management tools like Terraform and Ansible, and with monitoring tools like Datadog.
Proficient in Python, with experience in other languages like Scala.
Experience with containers and orchestration tools such as Docker and Kubernetes.
Strong understanding of data modeling, including data lakes, relational databases, NoSQL, and OLAP cubes.
Nice to Have
Familiarity with AI/ML pipelines and AWS services like SageMaker or Lambda for scalable model deployment.
A strong interest in driving data governance initiatives and ensuring data quality.
8+ years of experience as a hands-on data engineer, demonstrating deep technical expertise in building complex, scalable data solutions.
3+ years of experience leading data engineering teams or initiatives, demonstrating strategic thinking, mentoring, and advising stakeholders.
Bachelor’s degree in Computer Science, Data Science, or a related field.
Perks/benefits: Career development, startup environment