Big Data Expert

Petah Tikva, IL

OSR Enterprises AG

OSR Enterprises is a new-age Tier-1 supplier to the automotive industry, acting as a speedboat for development teams at any car manufacturer.



Description

We are looking to welcome a Big Data Expert to our diligent Big Data team, someone who will take overall responsibility for developing, executing, and maintaining strategy and workflow while anticipating possible consequences, changes, and trends.

We believe that the ideal candidate possesses the excellent leadership, communication, and social skills needed to build an effective and motivated team, one that utilizes the full potential of each member's skills and expertise.

 

Responsibilities and tasks include:

  • Implement, tune, and administer Big Data infrastructure on Hadoop clusters and other NoSQL platforms
  • Manage end-to-end availability, monitor performance, and plan capacity using a variety of open-source and internally developed toolsets
  • Research and refine best practices to support the implementation of new features and solutions in the Big Data and NoSQL space
  • Perform all levels of DBA support (e.g., plan and coordinate patching and upgrades)
  • Manage database backup and recovery
  • Troubleshoot and resolve database/application issues in a timely manner, tuning performance at the DB and SQL levels
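The capacity-planning responsibility above often comes down to small ad-hoc tooling around standard cluster commands. As a minimal sketch (the field labels match the summary section of `hdfs dfsadmin -report`, but the sample input is fabricated for illustration):

```python
import re

def capacity_used_pct(report: str) -> float:
    """Return percent of configured HDFS capacity in use, parsed from
    an `hdfs dfsadmin -report`-style summary (illustrative sketch)."""
    def grab(label: str) -> int:
        # Each summary line looks like "Label: <bytes> (<human-readable>)"
        m = re.search(rf"{label}:\s+(\d+)", report)
        if not m:
            raise ValueError(f"missing field: {label}")
        return int(m.group(1))
    return 100.0 * grab("DFS Used") / grab("Configured Capacity")

# Fabricated sample of the report's summary block
sample = """\
Configured Capacity: 1000000000000 (1 TB)
Present Capacity: 950000000000 (950 GB)
DFS Remaining: 700000000000 (700 GB)
DFS Used: 250000000000 (250 GB)
"""
print(round(capacity_used_pct(sample), 1))  # 25.0
```

In practice a script like this would feed a threshold alert or a capacity trend dashboard rather than print to stdout.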

Requirements

Essential:

  • Minimum 5 years’ experience as a database administrator (DBA) for large, enterprise-level clusters
  • Minimum 3 years’ experience as a DBA supporting Big Data technologies (e.g., Hadoop, Spark, HBase, HDFS, Kafka, ZooKeeper, MirrorMaker, Impala, YARN), different data file formats, and NoSQL engines
  • Experience in DBA production support on at least one of the following DBMS platforms: MongoDB, ArangoDB, or MarkLogic
  • Expert communication, facilitation, and collaboration skills
  • Ability to present, explain, and provide advice to partners, as a subject matter expert within the Hadoop and NoSQL space
  • Experience in security configuration for Hadoop clusters using Kerberos or Sophia
  • Competency in conceptualization, foresight, enterprise perspective, consensus building, and technical problem solving
  • The ability to understand and adhere to firm incident, change, and problem management processes
  • Strong skills in project management methodologies like Agile, Kanban, and DevOps
  • Experience with ETL processes
  • Experience in infrastructure architecture and engineering, including functional and technical requirements gathering and solution development
  • Bachelor's degree in computer science, computer engineering, or a related field, or the equivalent combination of education and related experience

An advantage for candidates who also have:

  • Scripting skills in Python, Perl, Java, KornShell, and AWK
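The scripting skills listed above typically mean quick, AWK-style log triage. A minimal Python sketch of that kind of task, over a fabricated log excerpt:

```python
from collections import Counter

# Fabricated log lines in a "date time LEVEL component message" layout
log_lines = [
    "2024-01-01 10:00:01 ERROR datanode Disk failure on /data/3",
    "2024-01-01 10:00:05 INFO namenode Block report processed",
    "2024-01-01 10:00:09 ERROR datanode Disk failure on /data/7",
    "2024-01-01 10:00:12 ERROR kafka Under-replicated partition",
]

# Count ERROR lines per component (field 4 of each line)
errors = Counter(
    line.split()[3] for line in log_lines if line.split()[2] == "ERROR"
)
print(errors.most_common())  # [('datanode', 2), ('kafka', 1)]
```

The same one-liner spirit carries over to Perl or AWK; the point is fast, disposable analysis of operational data.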


