Senior Data DevOps Engineer
Ra'anana, Israel
ZoomInfo
It’s our business to grow yours! Own your market with leading B2B contact data combined with sales intelligence, engagement software, and workflow tools. At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability, and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo.
We are looking for a DevOps Data Engineer who will play a critical role in designing, automating, and optimizing high-scale data infrastructure that supports real-time and batch data processing. This role is a hybrid between DevOps and Data Engineering, focusing on containerized data applications and cloud automation in a high-performance distributed environment.
You will work on cloud-native, high-scale data platforms by integrating DevOps principles with data engineering best practices. Technologies you will use include Kubernetes, Helm, Terraform, Kafka, Solr, Elasticsearch, and JVM-based applications (Spring Framework). If you enjoy working on high-throughput data systems, infrastructure automation, and scalable cloud solutions, this is the perfect opportunity!
Key Responsibilities
Data Infrastructure & Cloud Deployment
- Design, deploy, and manage high-scale, cloud-native data platforms on GCP and AWS.
- Automate provisioning, scaling, and maintenance of data services using Terraform, Helm, and Kubernetes.
- Optimize and manage distributed data platforms (Solr).
Data Pipeline Reliability & Observability
- Maintain and optimize real-time streaming and batch data pipelines using Kafka, Dataflow, or Pub/Sub.
- Implement monitoring, logging, and alerting using Datadog, Prometheus, or OpenTelemetry.
JVM-Based Application Deployment
- Deploy and manage JVM-based applications (Spring Framework, Java microservices) in containerized environments.
- Ensure reliable and scalable deployment of data processing services in Kubernetes clusters.
CI/CD & Infrastructure as Code (IaC)
- Automate deployments using CI/CD pipelines (Jenkins, GitHub Actions, ArgoCD).
- Implement GitOps workflows for managing data infrastructure and applications.
Security, Compliance & Data Governance
- Implement RBAC, encryption, and compliance policies for GDPR, CCPA, and SOC2.
- Automate data retention, backup, and disaster recovery strategies.
- Ensure auditability and traceability of data workflows.
Desired Skills and Experience
- 3+ years of experience in Infrastructure, DevOps, or Site Reliability Engineering (SRE) in high-scale production environments handling large-scale data systems.
- 3+ years of experience managing/owning a Kubernetes-based platform and deploying containerized applications at scale.
- Experience deploying and managing JVM-based applications (Spring Framework, Java microservices) in Docker and Kubernetes environments.
- Strong background in cloud-native data platforms, including experience with GCP or AWS.
- Expertise in Infrastructure as Code (IaC) tools such as Terraform and Helm for automating cloud infrastructure.
- Experience optimizing search and NoSQL databases such as HBase and Solr.
- Advanced knowledge of Linux OS and networking, including troubleshooting low-latency distributed systems.
- Deep experience in CI/CD automation using Jenkins, GitHub Actions or ArgoCD to manage data infrastructure and application deployments.
- Expertise in monitoring, observability, and performance tuning with tools like Datadog, OpenTelemetry, Prometheus, or New Relic.
- Experience in public, private, and hybrid cloud architectures, preferably with AWS or GCP.
Preferred Qualifications (Nice to Have)
- Experience designing self-service tools for data infrastructure automation.
- Prior experience in Machine Learning infrastructure (MLOps) for AI-driven data pipelines.
- Knowledge of service mesh technologies such as Istio.
- Experience implementing cost-optimization strategies for large-scale data platforms.
- Experience with Python, Spark Streaming, Flink, and Airflow.
Why Join Us?
Work on cutting-edge cloud and data infrastructure at scale.
Collaborate with top engineers in DevOps & Data Engineering.
Competitive salary, benefits, and career growth opportunities.
About us:
ZoomInfo (NASDAQ: ZI) is the Go-To-Market Intelligence Platform that empowers businesses to grow faster with AI-ready insights, trusted data, and advanced automation. Its solutions provide more than 35,000 companies worldwide with a complete view of their customers, making every seller their best seller.
ZoomInfo may use a software-based assessment as part of the recruitment process. More information about this tool, including the results of the most recent bias audit, is available here.
ZoomInfo is proud to be an equal opportunity employer, hiring based on qualifications, merit, and business needs, and does not discriminate based on protected status. We welcome all applicants and are committed to providing equal employment opportunities regardless of sex, race, age, color, national origin, sexual orientation, gender identity, marital status, disability status, religion, protected military or veteran status, medical condition, or any other characteristic protected by applicable law. We also consider qualified candidates with criminal histories in accordance with legal requirements.