Senior Software Engineer, Platform

Remote, USA

People Data Labs

People Data Labs builds people data. Use our dataset of 1.5 billion unique person profiles to build products, enrich person profiles, power predictive modeling/AI, run analyses, and more.

Note for all engineering roles: With the rise of fake applicants and AI-enabled candidate fraud, we have added measures throughout the process to identify and remove such candidates.

About Us

People Data Labs (PDL) is the provider of people and company data. We do the heavy lifting of data collection and standardization so our customers can focus on building and scaling innovative, compliant data solutions. Our sole focus is on building the best data available by integrating thousands of compliantly sourced datasets into a single, developer-friendly source of truth. Leading companies across the world use PDL’s workforce data to enrich recruiting platforms, power AI models, create custom audiences, and more.

We are looking for individuals who can balance extreme ownership with a “one-team, one-dream” mindset. Our customers are trying to solve complex problems, and we can only help them achieve their goals by working as a team. Our Platform Engineering Team oversees the foundational work that the rest of our engineering teams build their success upon.

You will be crucial in accelerating our efforts to build standalone data products that enable data teams and independent developers to create innovative solutions at massive scale. In this role, you will work with a team to define the tools and infrastructure that facilitate big data processing, primarily within AWS.

If you are looking to be part of a team discovering the next frontier of data-as-a-service (DaaS) with a high level of autonomy and opportunity for direct contributions, this might be the role for you. We like our engineers to be thoughtful, quirky, and willing to fearlessly try new things. Failure is embraced at PDL as long as we continue to learn and grow from it.

What You Get To Do

  • Manage and improve our growing AWS and data center infrastructures
  • Design, implement, and maintain a CI/CD pipeline to improve developer workflows
  • Utilize centralized monitoring and logging to improve visibility across the team
  • Assist development teams in solving issues around scaling and bottlenecks
  • Work with teammates to develop high-quality software, balancing security, reliability, and operational concerns

The Technical Chops You'll Need

  • 5-7+ years of software development experience with a background in platform or cloud infrastructure engineering and clear examples of strategic technical problem-solving and implementation
    • 3+ years of experience with Python in a production environment
    • Strong software development fundamentals and system design experience
  • Strong experience with our core technologies (AWS, Elasticsearch/OpenSearch, Python, Docker, scaled data processing technologies)
    • AWS, including EC2, Lambda, OpenSearch, API Gateway, ALB, others
    • Data stores, including Postgres/MySQL, Dynamo, Redis, S3
  • Experience with Infrastructure-as-code (IaC) frameworks (e.g. Pulumi, Terraform, CloudFormation, or similar)
  • Experience with network design, including public/private availability, routing, firewalls/security groups, and VPN
  • Experience with Identity and Access Management
  • Experience with configuration management tools (e.g., Chef, Puppet, Ansible)
  • Experience with observability tools such as Datadog for metrics, logging, etc.
  • Experience with build and deploy systems, architecting and developing CI/CD infrastructure, repo management, and integrating with tools like GitHub Actions (or similar)

People Thrive Here Who Can

  • Balance high ownership and autonomy with a strong ability to collaborate
  • Work effectively remotely: proactively manage blockers, reach out and ask questions, and participate in team activities
  • Communicate clearly in writing, whether on Slack/chat or in documents
  • Write data design docs (pipeline design, dataflow, schema design)
  • Scope and break down projects, and communicate progress and blockers effectively to your manager, team, and stakeholders

Some Nice To Haves

  • Degree in a quantitative discipline such as computer science, mathematics, statistics, or engineering
  • Expertise with Apache Spark (Java, Scala, and/or Python-based)
  • Experience with SQL data pipeline development
  • Experience supporting developer-oriented data pipeline and workflow orchestration tools (e.g., Airflow (preferred), dbt, Dagster, or similar)
  • Experience with managing, deploying, and ensuring the reliability of streaming platforms (e.g., Kafka)
  • Experience evaluating data quality and maintaining consistently high standards across new feature releases (e.g., consistency, accuracy, validity, completeness)
  • Experience using Databricks or similar data-development platforms
  • Experience managing hybrid environments split between local datacenters and AWS; experience managing bare metal/co-location infrastructure

Our Benefits

Great people make great teams. We believe in building highly functional, energetic, and engaging teams to serve our customers. Putting People, Customers, and Shareholders in that order sets us up for success and for delivering on our promises.

  • Stock
  • Competitive Salaries
  • Unlimited paid time off
  • Medical, dental, & vision insurance
  • Health, fitness, and office stipends
  • The permanent ability to work wherever and however you want

Comp: $160K - $180K

No C2C, 1099, or Contract-to-Hire. Recruiters need not apply.

People Data Labs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits.

Qualified Applicants with arrest or conviction records will be considered for Employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act.

Personal Privacy Policy For California Residents

https://privacy.peopledatalabs.com/policies?name=personnel-privacy-policy

 
