DevOps Engineer - Big Data

Remote

Look4IT

We provide recruitment and outsourcing services in the IT industry, tailored to your company's needs. Discover what we can do.


Apply now Apply later

This is a remote position.

For a client specializing in digital, custom, tailor-made products, we're looking for a DevOps Engineer - Big Data. The company provides cloud-based software, offering comprehensive solutions from business analysis through front-end/back-end architecture design and implementation, ensuring the highest product quality. They operate with a startup mentality, transparency, and efficiency, employing a simple and automated approach. As a technology leader, they explore, transform, and optimize digital product design. They offer numerous opportunities to gain experience and share knowledge within distributed full-stack teams, working across industries such as gaming, proptech, ad tech, fintech, legal tech, and ML/AI, and serving startups, scale-ups, and Fortune 500 companies worldwide.
  You will work on a project for one of the largest game studios, known for popular MOBA and FPS series. As part of the Data as a Service team, you will gather and utilize data to enhance player experiences. Your primary responsibilities will include building data solutions capable of processing petabytes of information, protecting player privacy, using Big Data and AWS tools to organize and optimize data warehouses, developing a platform for real-time data ingestion and analysis, and supporting product teams in improving service efficiency. Your experience with large datasets and global systems will be crucial in developing effective solutions.
  RESPONSIBILITIES:
 
  • Be accessible for meetings in the afternoon/evening to align with the team located on the US West Coast.
  • Participate in the on-call rotation to provide 3rd line support for live systems.
  • Design and implement production infrastructure using AWS and Terraform.
  • Develop and manage deployment pipelines and CI/CD processes.
  • Create design documentation, implementation plans, and select suitable tools.
  • Collaborate with your pod (up to 5 engineers) and other cross-functional teams.
  • Establish and maintain cloud environments and services on AWS.
  • Engage in hands-on work with live production systems.
  • Monitor production infrastructure using DataDog.
  • Communicate with project stakeholders.
  • Participate in code reviews.
  • Automate various processes.


Requirements

  • Minimum of 5 years of commercial work experience.
  • Bachelor’s or higher degree in Computer Science, Software Engineering, or a related field.
  • Proficiency with Infrastructure as Code tools such as Terraform and Ansible.
  • Deep knowledge and experience with AWS and networking (e.g., EC2, ECS, EKS, Security Groups, VPC, Auto Scaling Groups, Route 53, RDS).
  • Experience with CI/CD tools like Jenkins and Docker.
  • Strong communication and teamwork skills. 
  • Advanced English proficiency (C1/C2).
  NICE TO HAVE:
 
  • Familiarity with Spark.
  • Experience with Data Engineering and Data Pipelines.


Benefits

  • Multisport Card.
  • Life insurance and private medical insurance.
  • AWS certification path.





Tags: Ansible Architecture AWS Big Data CI/CD Computer Science Data pipelines DevOps Docker EC2 ECS Engineering FinTech Jenkins Machine Learning Pipelines Privacy Security Spark Terraform

Perks/benefits: Health care Startup environment

Region: Remote/Anywhere
