Junior Data Engineer (Remote US or Canada)
United States
Sayari
Get instant access to public records, financial intelligence, and structured business information on over 455 million companies worldwide. Our company culture is defined by a dedication to our mission of using open data to enhance visibility into global commercial and financial networks, a passion for finding novel approaches to complex problems, and an understanding that diverse perspectives create optimal outcomes. We embrace cross-team collaboration, encourage training and learning opportunities, and reward initiative and innovation. If you like working with supportive, high-performing, and curious teams, Sayari is the place for you.
Job Description: Sayari’s flagship product, Sayari Graph, provides instant access to structured business information from billions of corporate, legal, and trade records. As a member of Sayari's data team you will work with the Product and Software Engineering teams to collect data from around the globe, maintain existing data pipelines, and develop new pipelines that power Sayari Graph.
Job Responsibilities:
- Write and deploy crawling scripts to collect source data from the web
- Write and run data transformers in Scala Spark to standardize bulk data sets
- Write and run modules in Python to parse entity references and relationships from source data
- Diagnose and fix bugs reported by internal and external users
- Analyze and report on internal datasets to answer questions and inform feature work
- Work collaboratively within and across teams of engineers using agile principles
- Give and receive feedback through code reviews
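To give a flavor of the parsing work described above, here is a minimal illustrative Python sketch of a module that extracts entity references and relationships from a source record. All field names, entity types, and the record shape are hypothetical, for illustration only; they are not Sayari's actual schema or code.

```python
# Illustrative sketch only: the record fields ("company_id", "officers", etc.)
# and output shapes are hypothetical, not Sayari's actual schema.

def parse_record(raw: dict) -> tuple[list[dict], list[dict]]:
    """Extract entity references and relationships from one source record."""
    # The source company itself is one entity.
    entities = [
        {"id": raw["company_id"], "type": "company", "name": raw["name"].strip()}
    ]
    relationships = []
    # Each listed officer becomes a person entity plus an officer_of relationship.
    for officer in raw.get("officers", []):
        entities.append(
            {"id": officer["id"], "type": "person", "name": officer["name"].strip()}
        )
        relationships.append(
            {"source": officer["id"], "target": raw["company_id"], "type": "officer_of"}
        )
    return entities, relationships
```

In practice, modules like this would run downstream of the crawling scripts and feed standardized records into the Spark transformation layer.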
Skills & Experience:
- Professional experience with Python and a JVM language (e.g., Scala, Java, Kotlin)
- 2+ years of experience designing and maintaining data pipelines
- Experience using Apache Spark and Apache Airflow
- Experience with SQL and NoSQL databases (e.g., column stores, graph databases)
- Experience working on a cloud platform like GCP, AWS, or Azure
- Experience working collaboratively with Git
- Understanding of Docker/Kubernetes
- Interest in learning from and mentoring team members
- Experience supporting and working with cross-functional teams in a dynamic environment
- Passion for open source development and innovative technology
- Experience working with data warehousing and BI tools like BigQuery and Superset is a plus
- Understanding of knowledge graphs is a plus
Tags: Agile Airflow AWS Azure BigQuery Data pipelines Docker Engineering GCP Git Java Kubernetes NoSQL Open Source Pipelines Python Scala Spark SQL Superset Transformers
Perks/benefits: 401(k) matching, career development, competitive pay, equity / stock options, flex vacation, health care, insurance, medical leave, parental leave, startup environment, transparency