Senior Data Engineer (Remote)

San Francisco

AllTrails


About AllTrails
AllTrails is the most trusted and used outdoors platform in the world. We help people explore the outdoors with hand-curated trail maps along with photos, reviews, and user recordings crowdsourced from our community of millions of registered hikers, mountain bikers, and trail runners in 150 countries. AllTrails is frequently ranked as a top-5 Health and Fitness app and has been downloaded by over 75 million people worldwide.
Every day, we solve incredibly hard problems so that we can get more people outside having healthy, authentic experiences and a deeper appreciation of the outdoors. Join us!  
This is a U.S.-based remote position. San Francisco Bay Area employees are highly encouraged to come into the office one day a week.

Requirements:

  • Proficient in working with other stakeholders and converting requirements into detailed technical specifications; owning projects from inception to completion
  • Expertise in using both SQL and Python for data cleansing, transformation, modeling, pipelining, etc.
  • Proficiency in working with high volume datasets in SQL-based warehouses such as BigQuery, Redshift, Snowflake, or others, preferably using ELT tools like Dataform or dbt
  • Experience with parallelized data processing frameworks such as Google Dataflow, Apache Spark, etc.
  • Deep understanding of data modeling, access, storage, caching, replication, and optimization techniques
  • Ability to orchestrate data pipelines through tools such as Apache Airflow
  • Experienced in container and pod orchestration (e.g. Docker, Kubernetes)
  • Understanding of the software development lifecycle and CI/CD
  • Monitoring and metrics-gathering (e.g. Datadog, New Relic, CloudWatch, etc.)
  • Willingness to participate in an on-call support rotation (currently monthly)
  • Proficiency with git and working collaboratively in a shared codebase
  • Excellent documentation skills
  • Self-motivation and a deep sense of pride in your work
  • Passion for the outdoors
  • Comfort with ambiguity, and an instinct for moving quickly
  • Humility, empathy and open-mindedness - no egos
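
To give a flavor of the SQL-and-Python data-cleansing work described above, here is a minimal illustrative sketch using Python's built-in sqlite3 module. The table and column names are hypothetical, chosen only to suggest the domain; production work at the scale described would run in a warehouse like BigQuery or Snowflake, typically via an ELT tool such as dbt or Dataform.

```python
import sqlite3

# Hypothetical example: cleanse raw trail-review records with SQL from Python.
# Table and column names are illustrative only, not an actual AllTrails schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_reviews (trail_id INTEGER, rating REAL, reviewed_at TEXT);
    INSERT INTO raw_reviews VALUES
        (1, 4.5, '2024-01-02'),
        (1, 4.5, '2024-01-02'),  -- exact duplicate, should be dropped
        (2, NULL, '2024-01-03'), -- missing rating, should be filtered out
        (2, 3.0, '2024-01-04');
""")

# Deduplicate, drop rows with missing ratings, then aggregate per trail.
rows = conn.execute("""
    WITH cleaned AS (
        SELECT DISTINCT trail_id, rating, reviewed_at
        FROM raw_reviews
        WHERE rating IS NOT NULL
    )
    SELECT trail_id, AVG(rating) AS avg_rating, COUNT(*) AS n_reviews
    FROM cleaned
    GROUP BY trail_id
    ORDER BY trail_id
""").fetchall()

print(rows)  # [(1, 4.5, 1), (2, 3.0, 1)]
```

The same dedupe-filter-aggregate pattern, expressed as warehouse SQL models, is what ELT tools like dbt orchestrate at scale.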

Bonus Points:

  • Experience working in a multi-cloud environment
  • Experience working with a data stack in Google Cloud Platform
  • Experience with Amplitude
  • Experience with infrastructure-as-code, such as Terraform
  • Experience with machine learning frameworks and platforms such as Vertex AI, SageMaker, or MLflow

What We Offer:

  • A competitive and equitable compensation plan. This is a full-time, salaried position that includes equity
  • Physical & mental well-being including health, dental and vision benefits
  • Trail Days: No meetings first Friday of each month to go test the app and explore new trails!
  • Unlimited PTO
  • Flexible parental leave 
  • Remote employee equipment stipend to create a great remote work environment
  • Annual continuing education stipend
  • Discounts on subscription and merchandise for you and your friends & family
  • An authentic investment in you as a human being and your career as a professional
Nature celebrates you just the way you are and so do we! At AllTrails we’re passionate about nurturing an inclusive workplace that values diversity. It’s no secret that companies that are diverse in background, age, gender identity, race, sexual orientation, physical or mental ability, ethnicity, and perspective are proven to be more successful. We’re focused on creating an environment where everyone can do their best work and thrive.
AllTrails participates in the E-Verify program for all remote locations. By submitting my application, I acknowledge and agree to AllTrails' Job Applicant Privacy Notice.
