BI Data Engineer
Philippines
Netskope
Netskope, a global SASE leader, helps organizations apply zero trust principles and AI/ML innovations to protect data and defend against cyber threats. Today, more users and data sit outside the enterprise than inside, causing the network perimeter as we know it to dissolve. We realized a new perimeter was needed, one that is built in the cloud and follows and protects data wherever it goes, so we started Netskope to redefine Cloud, Network and Data Security.
Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.
About the position:
As a Data Engineer on the Data & Analytics team at Netskope, you'll be responsible for maintaining our reporting data platform. This includes building extraction pipelines in Airflow, transforming that data in Snowflake, and monitoring our costs and permissions across our data tools. The role involves working with multiple organizations across Netskope to ensure that we are pulling the correct data in a dependable manner, as well as working with our analysts and analytics engineers to understand the datasets and how they can best be utilized.
Responsibilities:
- Build data pipelines in Airflow to extract data from various source systems into our Snowflake data warehouse (see the sketch after this list)
- Find ways to improve Snowflake query performance and reduce costs
- Own access control and permissions for all of our data tools, from GCP to Looker
- Work with the BI team on requirements to solve business data needs
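To make the first responsibility above concrete, here is a minimal sketch of an Airflow extraction DAG that pulls records from a REST API and loads them into Snowflake. This is not Netskope's actual pipeline code: the endpoint https://api.example.com/v1/events, the Airflow connection ID snowflake_default, and the landing table RAW.EXAMPLE_EVENTS are hypothetical placeholders, and it assumes a recent Airflow 2.x with the Snowflake provider installed.

```python
# Minimal illustration only; endpoint, connection ID, and table names are hypothetical.
from datetime import datetime

import requests
from airflow.decorators import dag, task
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_api_to_snowflake():
    @task
    def extract() -> list[dict]:
        # Pull one page of records from a hypothetical source-system API.
        resp = requests.get("https://api.example.com/v1/events", timeout=30)
        resp.raise_for_status()
        return resp.json()["records"]  # assumed response shape

    @task
    def load(records: list[dict]) -> None:
        # Insert the extracted rows into an existing Snowflake landing table.
        hook = SnowflakeHook(snowflake_conn_id="snowflake_default")
        hook.insert_rows(
            table="RAW.EXAMPLE_EVENTS",
            rows=[(r["id"], r["payload"]) for r in records],
            target_fields=["ID", "PAYLOAD"],
        )

    load(extract())


example_api_to_snowflake()
```

For larger volumes, a load task like this would more typically stage files to cloud storage and COPY INTO Snowflake rather than issue row-by-row inserts; direct inserts are used here only to keep the sketch short.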
Requirements:
- 3+ years experience with Python
- 2+ years extracting data with Airflow
- 3+ years experience with SQL
- Comfortable working with APIs
- Experience with ETL processes in a reporting environment
- Experience with dbt a plus
- Experience with data visualization (Looker) a plus
- Experience working with ML libraries
- Strong business intuition and ability to understand complex business systems
Education:
- Bachelor's degree (BSc) in a relevant field of studies preferred.
#LI-BL1
Netskope is committed to implementing equal employment opportunities for all employees and applicants for employment. Netskope does not discriminate in employment opportunities or practices based on religion, race, color, sex, marital or veteran status, age, national origin, ancestry, physical or mental disability, medical condition, sexual orientation, gender identity/expression, genetic information, pregnancy (including childbirth, lactation and related medical conditions), or any other characteristic protected by the laws or regulations of any jurisdiction in which we operate.
Netskope respects your privacy and is committed to protecting the personal information you share with us. Please refer to Netskope's Privacy Policy for more details.