Data Engineer - AWS, Redshift, Python & SQL

Stockholm, Stockholm County, Sweden

Sinch

Unlock meaningful conversations across the customer journey with programmable SMS, voice, email, video, & verification APIs!

About Us 

Sinch is a global leader in cloud communications, helping businesses connect with their customers on mobile phones. We facilitate over 147 billion conversations annually, reaching every phone on Earth. Our technology powers the world's leading communication platforms. 

We are a dynamic and innovative team seeking a passionate Data Engineer to work on our data pipelines and data warehousing solutions. 

The working model is hybrid, and you can work from our Stockholm or Malmö office. 

Responsibilities 

  • Develop and maintain scalable, resilient data pipelines using AWS services such as Lambda, Step Functions, Redshift, and S3. 
  • Improve and maintain the Redshift data warehouse solution. 
  • Manage the data lifecycle from ingestion and processing to storage and retrieval. 
  • Work with a range of different technologies and programming languages, with a primary focus on Python and SQL. 
  • Actively contribute to enhancing our data processes and pipelines. 
  • Take a hands-on approach to monitoring production environments, identifying issues, and being on-call when necessary. 
  • Engage closely with data scientists, software engineers, architects, DevOps specialists, and product managers to deliver exceptional data solutions. 
  • Build and maintain visualizations and dashboards in Tableau. 

Requirements 

  • Capable of developing, maintaining, and optimizing data pipelines for enhanced performance and reliability. 
  • Experienced in team-based software development and deployment. 
  • Skilled in using SQL for data querying and ETL/ELT development. 
  • Experienced in data warehousing (Redshift) and data modelling.  
  • In-depth knowledge of AWS services including Lambda, Step Functions, Redshift, and S3. 
  • Ability to work effectively within a cross-functional self-organizing team, fostering collaboration and mutual respect among team members. 
  • A passion for writing clean and efficient code. 
  • Strong written and verbal communication skills in English. 
  • Bachelor’s degree in computer science, data engineering, or a related technical discipline. 

Good to have 

  • Knowledgeable about various data architectures such as Data Lake, Data Mesh, and Data Fabric. 
  • Experience in data visualization tools. 
  • Knowledge of, or willingness to learn, infrastructure as code using Terraform. 
  • Experience in a similar or related industry. 
  • Proven ability to work with others to solve tough technical problems. 
  • Ability to thrive in a dynamic and fast-paced work environment characteristic of agile development methodologies. 
  • Experience guiding and mentoring fellow engineers, empowering them to contribute effectively to team goals. 

Embrace the challenge and join us! 
