Data Engineer

Pune, India

Job Description

Data Engineer in the Operational Awareness - Warehouse team within the Digital Service Platform (DSP)

Job Location: Pune

Vanderlande provides baggage handling systems for 600 airports around the globe, capable of moving over 4 billion pieces of baggage per year. For the parcel market, our systems handle 52 million parcels per day. All these systems generate data. Do you see a challenge in building data-driven services for our customers using that data? Do you want to contribute to the fast-growing Vanderlande Technology Department on its journey to becoming more data-driven? If so, join our Digital Service Platform stream!

Your Position

As a data engineer, you will deliver data intelligence solutions to our customers around the globe, based on an innovative product that provides insight into the performance of their material handling systems. You will implement and deploy the product, and design solutions that fit it to each customer's needs. You will work with an energetic, multidisciplinary team to build end-to-end data ingestion pipelines and to implement and deploy dashboards.
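
To give a flavour of this kind of pipeline work, below is a minimal, hypothetical sketch of a single ingestion step in Python. The event fields, statuses, and KPI are illustrative assumptions for this posting, not Vanderlande's actual data model.

```python
import json
from datetime import datetime
from typing import Iterable

def parse_event(line: str) -> dict:
    """Parse one JSON event line from a handling system; raises on malformed input."""
    event = json.loads(line)
    # Assumed field: an ISO-8601 scan timestamp.
    event["scanned_at"] = datetime.fromisoformat(event["scanned_at"])
    return event

def sorted_rate(events: Iterable[dict]) -> float:
    """Fraction of events with status SORTED -- an illustrative dashboard KPI."""
    events = list(events)
    if not events:
        return 0.0
    return sum(e["status"] == "SORTED" for e in events) / len(events)

if __name__ == "__main__":
    raw = [
        '{"bag_id": "B1", "scanned_at": "2024-01-01T12:00:00", "status": "SORTED"}',
        '{"bag_id": "B2", "scanned_at": "2024-01-01T12:00:05", "status": "REJECTED"}',
    ]
    events = [parse_event(line) for line in raw]
    print(f"sorted rate: {sorted_rate(events):.0%}")  # -> sorted rate: 50%
```

In production, a step like this would typically run inside an automated, containerized pipeline (e.g. Docker) feeding a dashboarding tool such as Splunk.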

Your tasks and responsibilities

  • You will design and implement data & dashboarding solutions to maximize customer value.
  • You will be responsible for communicating directly with end customers and addressing their needs as well as possible.
  • You will deploy and automate the data pipelines and dashboards to enable further project implementation.
  • You embrace working in an international, diverse team, with an open and respectful atmosphere.
  • You leverage data by making it available to other teams within our department, enabling our platform vision.
  • You communicate and work closely with other groups within Vanderlande and the project team.
  • You enjoy working independently and self-reliantly, communicate proactively, and take ownership of delivering the best possible solution.
  • You will be part of an agile team that encourages you to speak up freely about improvements, concerns, and blockages. As part of Scrum methodology, you will independently create stories and participate in the refinement process.
  • You collect feedback and always search for opportunities to improve the existing standardized product.
  • You execute projects from conception through client handover, contributing positively to technical performance and the organization.
  • You will take the lead in the communication with different stakeholders that are involved in the projects that are being deployed.

Your profile

  • Strong communication skills in English.
  • Skilled at breaking down large problems into smaller, manageable parts.
  • Experience in guiding, motivating and training engineers.
  • At least 5 years of experience, including at least 3 years building and deploying complex data pipelines and data solutions.
  • Bachelor’s or Master’s degree in Computer Science, IT, or equivalent.
  • Experience with NoSQL and unstructured data, and with event-processing tools such as Splunk or the ELK stack.
  • Minimum 3 years' experience with visualization software, preferably Splunk (or Power BI, Tableau, or similar).
  • Hands-on experience with data modeling.
  • Hands-on experience programming in Python, and proficiency in Test-Driven Development using pytest (see the sketch after this list).
  • Experience in data engineering using DevOps principles.
  • Experience with data schemas (e.g. JSON/XML/Avro).
  • Experience in deploying services as containers (e.g. Docker, Podman).
  • Experience in working with cloud services (preferably with Azure).
  • Experience with streaming and/or batch storage (e.g. Kafka, Oracle) is a plus.
  • Experience in creating APIs is a plus.
  • Experience with Databricks is a plus.
  • Experience in data quality management and monitoring is a plus.
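
As a hedged illustration of the Test-Driven Development style asked for above, the sketch below pairs pytest tests with a small KPI helper. The function name, signature, and figures are assumptions made for this example, not part of the role description.

```python
# test_kpis.py -- illustrative TDD sketch; run with `pytest test_kpis.py`.
import pytest

def throughput_per_hour(parcels: int, window_hours: float) -> float:
    """Parcels handled per hour over a time window (hypothetical KPI helper)."""
    if window_hours <= 0:
        raise ValueError("window_hours must be positive")
    return parcels / window_hours

def test_throughput_per_hour():
    # 300 parcels in a 2-hour window -> 150 parcels/hour.
    assert throughput_per_hour(parcels=300, window_hours=2) == 150

def test_zero_window_rejected():
    with pytest.raises(ValueError):
        throughput_per_hour(parcels=300, window_hours=0)
```

In a TDD workflow, tests like these are written first and fail, and the implementation is then written to make them pass.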