Data Engineer | Mid/Senior | Apps | NordVPN

Vilnius

Nord Security

Nord Security is a leader in the field of online privacy and security. Our security tools have earned praise from respected experts and tech outlets.



Nord Security was born as a passion project, and our drive is reflected in our work, which has earned high praise from major tech outlets and cybersec experts. We want one thing only — to give true online privacy and security to as many people as we can. And for that purpose we create top-notch cybersecurity products and services that grant a safer cyber future to millions of users.
NordVPN is the fastest VPN and the most trusted online security solution on the planet. NordVPN protects your internet traffic with next-generation encryption and is the preferred tool of activists and privacy-conscious individuals around the globe.
The NordVPN Apps department believes in constant improvement and innovation, so it takes the initiative to refine all products at every stage. We're actively involved in all phases of development with other teams to obtain the best outcomes – from the simplest UI elements to innovative features. Our apps team is all about hard work, a modern technology stack, speed, a constant desire to learn, and above all, vigilance in keeping every last asset safe and sound. That's how we build top-notch cybersecurity solutions that people can trust.
What will you do?
- Acquire data from various data sources (APIs, relational and non-relational databases, queues, …) by developing scripts, workflows, and ETL pipelines using our stack of both “small” and big data;
- Participate in modeling business processes with data models;
- Maintain the integrity and structure of existing data models in the data warehouse;
- Identify, design, and implement internal process improvements, such as automating manual processes and optimizing data delivery;
- Assess the effectiveness and accuracy of data-gathering techniques;
- Develop and deploy processes and tools to monitor and analyze pipeline performance and data accuracy;
- Discover opportunities for data acquisition, diagnostics, mapping, and correction;
- Employ a variety of development languages and tools to blend data systems together;
- Recommend and validate ways to improve data reliability, efficiency, and quality;
- Troubleshoot the data pipeline;
- Create ad-hoc datasets;
- Work with other teams to understand their individual needs and objectives, and enable them through data availability.
Core Requirements
- 3+ years of experience performing data acquisition tasks;
- Deep, hands-on experience with Python;
- Strong knowledge of Apache Spark;
- Knowledge of Git;
- Knowledge of Bash;
- Experience with Airflow is a plus.
Why should you pick this team?
- Big data (~6 TB of compressed data inflow per day);
- Broad scope of work with an opportunity to contribute to different projects (developing ETL pipelines, internal tools, and internally shared libraries; developing the data model; working on monitoring and stability; optimizing and automating processes; etc.).
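To give a flavor of the ETL work described above, here is a minimal extract-transform-load sketch. All names and data are hypothetical, and an in-memory SQLite table stands in for the data warehouse; in practice the extract step would pull from an API or queue, the transform would run as a Spark job, and the whole thing would be orchestrated by Airflow:

```python
# Toy ETL pipeline sketch (stdlib only; names and data are illustrative).
import json
import sqlite3

# Stand-in for a raw API payload.
RAW_EVENTS = json.dumps([
    {"user_id": 1, "event": "connect", "bytes": 1200},
    {"user_id": 2, "event": "connect", "bytes": None},  # bad record
    {"user_id": 1, "event": "disconnect", "bytes": 800},
])

def extract(payload: str) -> list:
    """Parse the raw payload into records."""
    return json.loads(payload)

def transform(records: list) -> list:
    """Drop records with missing metrics and shape rows for loading."""
    return [
        (r["user_id"], r["event"], r["bytes"])
        for r in records
        if r["bytes"] is not None
    ]

def load(rows: list) -> int:
    """Load rows into an in-memory table standing in for the warehouse."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE events (user_id INT, event TEXT, bytes INT)")
    con.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    (count,) = con.execute("SELECT COUNT(*) FROM events").fetchone()
    con.close()
    return count

loaded = load(transform(extract(RAW_EVENTS)))
```

The same extract/transform/load split maps directly onto Airflow tasks, with each stage monitored for row counts and data accuracy as the posting describes.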
Salary Range
Gross salary: 3400-7100 EUR/month.
Category: Engineering Jobs

Tags: Airflow APIs Big Data Data warehouse ETL Git Pipelines Privacy Python RDBMS Security Spark

Region: Europe
Country: Lithuania
