Data Ops Engineer

Noida, Uttar Pradesh


ShyftLabs

Transform your business with ShyftLabs' cutting-edge AI solutions, digital transformation services, and expert technology consulting. Drive innovation and growth with our proven expertise.



Position Overview

We are looking for a DevOps & Data Engineer to join our Data Engineering department. In this role, you will help the data engineering development, operations, and testing teams communicate effectively and stay informed of each other's progress. You will also work to understand customers' wants and needs, building software and tools that reduce user errors and improve the overall customer experience.

ShyftLabs is a growing data product company, founded in early 2020, that works primarily with Fortune 500 companies. We deliver digital solutions built to help accelerate the growth of businesses across industries by focusing on creating value through innovation.

Job Responsibilities

  • Work with stakeholders to develop end-to-end cloud-based solutions with a focus on applications and data.
  • Collaborate with BI/BA analysts, data scientists, data engineers, DevOps cloud engineers, product managers, and other stakeholders across the organization.
  • Ensure delivery of reliable software and data pipelines using data engineering best practices, including secure automation, version control, CI/CD, and proper testing.
  • Own the product and influence strategy by helping define the next wave of data insights and system architecture.
  • Guide teams in designing, building, testing, and deploying changes to existing software.
  • Ensure system security while updating code and improving performance.
  • Manage simultaneous updates to sections of data engineering code and other program components.
  • Use data to discover opportunities for automation.
  • Align with the latest data trends and adopt ways to simplify data insights.
  • Be an essential part of the analytics and data insights team, contributing to the technological and architectural vision.

Basic Qualifications

  • 3+ years of hands-on experience in data integration, DevOps, engineering, and analytics.
  • Degree in Science, Technology, Engineering, or Mathematics (STEM) or a related discipline.
  • Proficiency in SQL, Python, and distributed source control systems such as Git.
  • Experience working in an Agile-Scrum environment.
  • Strong scripting skills in languages such as Ruby, PHP, Perl, and Python.
  • Excellent interpersonal, management, and decision-making skills.
  • Experience with ETL pipelines and workflow management tools like Airflow.
  • Strong understanding of dimensional modeling and data warehousing methodologies.
  • Ability to identify ways to improve data quality and reliability.
  • Commitment to teamwork and strong business communication skills.

Preferred Qualifications

  • Experience within the retail industry.
  • Passion for working with large data sets and deriving insights.
  • Familiarity with identifying tasks suitable for automation using data.
  • Awareness and alignment with the latest trends in data analytics and cloud-based data solutions.

We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
