Senior Data Engineer - Kotlin

Remote (United States)


Who We Are

Tensure is a modern tech consultancy that tackles Cloud, Applications, and Data with open-handed expertise. We believe great work starts with teamwork, so we’ve reimagined collaborative tech consulting. With a people-first approach and tangible solutions, our clients can do what makes them great while we optimize their tech stack. We’re clever and pragmatic, but we also embrace the magic that happens when personality collides with technical excellence. Because ultimately, innovation isn’t boring (and neither are we).

Where You'll Come In

We are looking for a talented Senior or Lead Data Engineer with deep experience in data engineering and data warehousing best practices, plus a solid software engineering background. Collaboration with fellow engineers and developers across various infrastructure layers is a key aspect of this role, so it calls for collaborative problem-solving, thoughtful design, and a commitment to building high-quality products.

What you'll do

Responsibilities:
  • Design, develop, and maintain scalable data pipelines and ETL processes to support our data warehouse infrastructure.
  • Work with Kotlin, Python, SQL, OpenSearch, GraphQL, and jOOQ (an illustrative sketch follows this list).
  • Collaborate with cross-functional teams to understand data requirements and implement solutions that meet business needs.
  • Perform data modeling to design and optimize database structures for performance and efficiency.
  • Write complex SQL queries for data extraction, transformation, and analysis.
  • Work with cloud platforms (GCP, AWS, or Azure).
  • Monitor and optimize data processes for reliability, scalability, and performance.
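
To give a flavor of the day-to-day work described above, here is a minimal, purely illustrative Kotlin sketch of a single extract/aggregate step built with jOOQ's plain-SQL DSL. It is not taken from Tensure's or any client's codebase; the `orders` table, its columns, and the `dataSource` parameter are hypothetical stand-ins.

```kotlin
import java.math.BigDecimal
import java.time.LocalDate
import javax.sql.DataSource
import org.jooq.SQLDialect
import org.jooq.impl.DSL
import org.jooq.impl.DSL.field
import org.jooq.impl.DSL.table

// Hypothetical result type for one extract step of a pipeline.
data class DailyRevenue(val day: LocalDate, val revenue: BigDecimal)

// Aggregates revenue per day from a hypothetical `orders` table using
// jOOQ's plain-SQL DSL (no generated schema classes required).
fun extractDailyRevenue(dataSource: DataSource, since: LocalDate): List<DailyRevenue> {
    val ctx = DSL.using(dataSource, SQLDialect.POSTGRES)
    return ctx
        .select(
            field("order_date", LocalDate::class.java),
            DSL.sum(field("amount", BigDecimal::class.java)).`as`("revenue")
        )
        .from(table("orders"))
        .where(field("order_date", LocalDate::class.java).ge(since))
        .groupBy(field("order_date"))
        .orderBy(field("order_date"))
        .fetch { r -> DailyRevenue(r.value1(), r.value2()) }
}
```

In a real pipeline, a step like this would typically load its results into a warehouse table (for example in BigQuery or Redshift) and run as part of a scheduled, monitored workflow.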

Requirements

  • 5-7 years of professional experience in data engineering or a related role; additional software engineering experience is a plus.
  • Experience with ETL processes and tools.
  • Proficiency in data modeling concepts and technologies.
  • Proficiency in SQL for data manipulation and analysis.
  • Hands-on experience with cloud platforms, preferably Google Cloud Platform (GCP) with BigQuery and AWS with Redshift.
  • Strong skills in Kotlin or similar programming languages, and experience with Python.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills.
Some of our Perks
  • Medical, dental, vision & prescription benefits starting day 1!
  • Company paid short-term/long-term disability, AD&D and life insurance
  • We contribute 3% of your base salary to a 401k (regardless of your contribution)
  • 5 weeks of Paid Time Off + 11 Company Holidays
  • A transparent pay structure with a clear path to promotion
Understanding the interview journey
  • An initial screening interview
  • A technical interview (pairing - code challenge, case studies and/or hypothetical questions about how you would solve certain challenges)
  • An interview with the team pertinent to your role
  • A culture and values interview
  • Offer Letter sent via email (or a decline with feedback)
Salary: Senior - $150,000; Lead - $165,000