Senior Data Engineer
Remote, Canada
Who We Are
Founded in 2007, Geoforce is a rapidly growing technology firm (#43 on the 2020 Dallas 100 list of the fastest-growing DFW private companies) that provides GPS-based tracking and monitoring of field equipment, vehicles, and other assets to over 1,300 companies in 90 countries. Our turnkey solution is delivered via a combination of our award-winning web-based software platform, rugged GPS tracking devices, and a global satellite and cellular network. Prominent customers include American Airlines, DHL, and Schlumberger. For more information, please see www.geoforce.com.
As a rapidly growing company committed to technology innovation and delivering high value services to its clients, Geoforce is constantly looking for high integrity, well-rounded professionals who thrive on challenges, are fascinated by technology, exhibit passion and pride, and don't mind rolling up their sleeves to get a job done.
What We Need
Our software team is looking to add a Senior Data Engineer who will join our work to add desirable features for our customers, expand system capacity, and improve our data analysis platform. As part of our team, you will deploy to cloud-based infrastructure on AWS and build data infrastructure for use internally and by our business partners.
Job Duties
Design, build, deploy, and operate next generation data infrastructure.
Develop techniques for monitoring the correctness and reliability of our data infrastructure.
Provide technical leadership via knowledge and understanding of software design and architecture.
Leverage agile practices and encourage collaboration, prioritization, and urgency to develop at a rapid pace.
Contribute to the data team vision to build and evolve the team’s data infrastructure and tools.
Design, build, maintain, launch, and optimize data models and data pipelines.
Ensure production-quality methods to retrieve, condition, validate, and manipulate data.
Build full-stack custom integrations between cloud-based systems using APIs.
Continuously refine and improve data architecture and delivery as new requirements surface.
Build cross-functional partnerships with data analysts, product managers, software engineers, and business partners to understand the data and tools needed and deliver on those needs.
Seek varied perspectives to drive innovation and build consensus across team members.
Knowledge and Skills
You've worked extensively with data ETL and warehousing in production.
You have excellent SQL skills and have worked with change data capture systems in production.
You have experience working with Infrastructure as Code, configuration management, and monitoring tools. Our team uses Terraform, Ansible, and Datadog.
You've worked with Apache Kafka in production.
You have some experience working with languages like Python or Ruby.
You care about work-life balance and want your company to care about it too; you'll put in the extra hour when needed but won't let it become a habit.
You want to work with a high degree of autonomy, while at the same time working on initiatives of high importance to the company.
Education and Work Experience
Experience with data warehouse technology is desirable, particularly with Snowflake.
Experience with data integration platforms such as Fivetran, Stitch, or Airbyte.
Experience with visualization tools such as Looker, Tableau, or Power BI.
Experience with containers and container orchestration tools. Docker and Kubernetes experience is desirable.
Experience architecting and building environments to experiment with data.
Experience in data store design (data lakes; relational, columnar, and NoSQL databases; analytics/OLAP cubes).
AWS and DevOps experience with AI/ML pipelines.
Strong expertise in one or more languages such as Java, Python, or SQL.
Bachelor's degree in Computer Science or another quantitative field.
Experience working across all levels of the development stack.
#LI-Remote