Data Engineer
Philippines
Netskope
Netskope, a global SASE leader, helps organizations apply zero trust principles and AI/ML innovations to protect data and defend against cyber threats. Today, there are more data and users outside the enterprise than inside, causing the network perimeter as we know it to dissolve. We realized a new perimeter was needed, one that is built in the cloud and follows and protects data wherever it goes, so we started Netskope to redefine Cloud, Network and Data Security.
Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.
Position Summary
We are seeking a talented and motivated Data Engineer to join our team. In this role, you will design, build, and maintain scalable and efficient data pipelines, enabling our organization to make data-driven decisions. You will work closely with product owners, data analysts, BI teams, and other stakeholders to ensure data availability, reliability, and security across the organization.
Key Responsibilities
- Design, build, test, and maintain robust, scalable, and high-performance data pipelines for ingesting, transforming, and loading data from various sources (e.g., databases, APIs, cloud storage).
- Develop and optimize ETL/ELT workflows to integrate data from diverse sources into centralized repositories like data warehouses and data lakes.
- Build and optimize data models, schemas, and repositories to support operational, analytical, and reporting use cases.
- Implement and manage automated data quality checks and monitoring systems to ensure accuracy, consistency, and integrity.
- Work closely with BI teams, data scientists, analysts, and other stakeholders to understand business needs, translate requirements, and deliver scalable data solutions.
- Optimize data systems for performance, scalability, cost-efficiency, and reliability in batch and streaming environments.
- Create and maintain cloud-based infrastructure, leveraging CI/CD frameworks and automation best practices for deployment and operation.
- Embed end-to-end data security and privacy measures while adhering to regulatory standards and best practices.
- Proactively troubleshoot and resolve data-related issues, and refactor legacy systems to align with modern frameworks and infrastructure.
- Maintain comprehensive documentation for workflows and systems, actively contribute to team ceremonies, and stay updated on emerging data engineering technologies and best practices.
Qualifications
Required:
- Bachelor's or Master's degree in Computer Science, Engineering, or related field.
- 3+ years as a Data Engineer or in a similar role.
- Proficiency in Python and SQL; working knowledge of Java or Scala is a plus.
- Extensive experience with building, maintaining, and troubleshooting pipelines in Spark.
- Strong experience with cloud platforms (AWS, Azure, GCP) and their data tools.
- Expertise in building and managing ETL/ELT processes.
- Strong understanding of data modeling, warehousing, and architecture.
- Experience with tools such as Airflow, dbt, and Fivetran.
- Strong analytical, problem-solving, communication, and teamwork abilities.
- Experience with data warehousing and data lake technologies (e.g., Snowflake, Redshift, Databricks, AWS S3).
- Fluent in English.
Preferred:
- Master’s degree in Computer Science, Data Engineering, or a related field.
- Experience with data versioning, orchestration, and CI/CD for data pipelines.
- Knowledge of machine learning workflows and integrating data pipelines with ML models.
- Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker).
- Certifications in relevant technologies, such as AWS Certified Data Analytics or Google Cloud Professional Data Engineer.
About You
You are a detail-oriented, innovative, and self-driven professional with a passion for building scalable data solutions. You thrive in a collaborative environment, leveraging your expertise to solve complex data challenges and drive impactful outcomes. With a deep understanding of modern data technologies, you are committed to delivering high-quality, reliable, and efficient data infrastructure that supports business goals.
Netskope is committed to providing equal employment opportunities for all employees and applicants for employment. Netskope does not discriminate in employment opportunities or practices based on religion, race, color, sex, marital or veteran status, age, national origin, ancestry, physical or mental disability, medical condition, sexual orientation, gender identity/expression, genetic information, pregnancy (including childbirth, lactation and related medical conditions), or any other characteristic protected by the laws or regulations of any jurisdiction in which we operate.
Netskope respects your privacy and is committed to protecting the personal information you share with us. Please refer to Netskope's Privacy Policy for more details.