Data Engineer

Vilnius

Oxylabs

The best proxy service platform, with 100M+ residential and 2M datacenter IP proxies. Extract public data from any website with ease!



We’re a team of 500+ professionals who develop cutting-edge proxy and web data scraping solutions for thousands of the world’s best known businesses, including Fortune 500 companies.
What’s in store for you:

As our new Data Engineer, you will tackle a diverse and challenging range of problems to help us make better business decisions at Oxylabs.io.
You'll build and maintain data ingestion pipelines and transformation models, and help design and maintain the data models our team relies on to access and analyze data. When something looks off in the data, you’ll find ways to fix or mitigate the issue so the team is always working with quality data. You will also make sure our data processes are clear, documented, and easy to maintain and debug. We are a small team with a focus on self-ownership, so you will have the freedom and opportunity to come up with and implement the best solution for any given situation.
Does this role sound well suited to you? Send us an application - we'd love to hear from you.

Your day-to-day:

  • Develop data ingestion pipelines for new data sources
  • Own our data ingestion and transformation tools and suggest ways to improve them
  • Monitor our data pipeline costs and look for ways to optimize them
  • Design and implement data transformation jobs using dbt to ensure data quality and format meets business requirements
  • Review, optimize, and refactor existing data models to improve data quality and accessibility
  • Occasionally create dashboards or alerting solutions to monitor data quality or process efficiency

What we expect:

  • 3-5 years of experience in a data engineer / analytics engineer role
  • Good knowledge of SQL
  • Good knowledge of Python
  • Experience with data modelling and ELT processes
  • Ability to troubleshoot data processes and identify issues or areas for improvement
  • Ability to work independently, research new tools, and find new ways to optimize existing processes

Nice to have:

  • Experience working with dbt / Airflow / Google Cloud
  • Experience with data analysis and visualization tools (Looker, Tableau, Qlik, Power BI, Grafana)

Salary:

  • Gross salary: 4500 - 6000 EUR/month. Keep in mind that we are open to discussing a different salary based on your skills and experience.


Category: Engineering Jobs

Tags: Airflow Data analysis Data quality dbt ELT GCP Google Cloud Grafana Looker Pipelines Power BI Python Qlik Research SQL Tableau

Region: Europe
Country: Lithuania
