Data Analytics Engineer

Pune, India


Job Description

Job Title: Analytics Engineer

Work Location: Pune, Kharadi

Relevant experience required: 5 to 7 years.

Position Summary:

Are you a data enthusiast, passionate about converting raw information into valuable and actionable insights? Do you thrive on collaborating with technical and business stakeholders to shape the future of data-driven solutions? If your answer to these questions is “yes”, then look no further than Vanderlande!

We’ll give you the opportunity to:

  • Work in a highly motivated, global team of Data & Analytics enthusiasts.
  • Play a crucial role in advancing the maturity of the Data Platform, contributing to the delivery of analytical products and services offered by our department to the Vanderlande organization.
  • Work within a cloud-based architecture built on Data Mesh principles.

Your responsibilities
As an Analytics Engineer at Vanderlande, you'll be at the forefront of driving data-driven solutions and shaping the future of our organization by working on the data platform. Your key responsibilities will involve translating business needs into functional requirements, designing and developing data products, pipelines, and reports, and analyzing data to solve use cases, resulting in optimized business processes and fact-based decision-making.

As part of a cross-functional, full-stack team, together with your colleagues you are responsible for the creation and delivery of end-to-end solutions to our business stakeholders. In this role, you are a tech-savvy, action-oriented, and collaborative colleague who can wear multiple hats: part Data Engineer, part Data Analyst. Even though you have an area of expertise, you can fulfil each of those roles up to a certain point.

On a day-to-day basis, you will focus on creating and maintaining data products, data pipelines using Python, PySpark and SQL, and dashboards using tools like Qlik or Power BI. Working in an Agile environment, you proactively contribute to scrum events, ensuring seamless coordination with the team and swift adaptability to changing priorities. You thrive in a fast-paced, iterative development environment where constant feedback and continuous improvement are key.

In this role, you:

  • Translate business needs into functional requirements, providing essential information on business use cases.
  • Translate functional requirements into thorough and feasible data products, analytics solutions and dynamic dashboards.
  • Utilize Python, SQL, PySpark, or R for data retrieval and manipulation.
  • Develop, test, and maintain data products and pipelines using the Azure stack and Databricks, ensuring data reliability and quality.
  • Design and implement architectures for efficient data extraction and transformation.
  • Work on creating and maintaining landing zones in the data platform.
  • Actively participate in and contribute to Continuous Integration and Continuous Deployment (CI/CD) practices, ensuring smooth and efficient development and deployment processes within the data platform.
  • Integrate data pipelines and reports into testing frameworks, allowing for rigorous performance testing and validation to ensure seamless performance.
  • Monitor and maintain data pipeline stability, offering support when required.
  • Analyze, interpret, and visualize data to drive business process optimization and fact-based decision-making.
  • Create, deploy, and maintain interactive dashboards and visualizations using Qlik or Power BI.
  • Perform a comprehensive analysis and proactively implement solutions to assess and enhance data quality and data reliability.
  • Actively participate in delivery execution on the Data Platform.
  • Eagerly improve yourself and strive for continuous enhancement of processes and development within the data platform.
  • Stay updated with the latest developments in the analytics field and share knowledge with the team.

Your department
You will be part of the Finance Technology department within the Finance Transformation Office, which is located in Veghel (Netherlands) and Pune (India). The FTO team is part of the finance organization and provides functional application ownership, support, and advice on process improvements in the finance function, data insights and beyond. A flexible but critical attitude, in-depth process knowledge and a feel for the business are its key success factors. Within the Finance Transformation Office you will be part of a data & analytics team, all with a focus on improvement projects, automation, and support.

Your qualifications and skills

If you’re an experienced, enthusiastic and versatile Analytics Engineer, you will bring:

  • Tech savvy
  • Curious
  • Action oriented
  • Collaborates
  • Courage
  • Demonstrates self-awareness
  • Self-development
  • Participate in continuous improvements and UAT
  • Resourcefulness
  • Values differences
  • Situational adaptability

Other Skills:

  • Excellent communication
  • Outstanding analytical skills
  • Knowledge of Agile methodologies (Scrum/SAFe)
  • Knowledge of financial administration

Perks/benefits: Flex hours, Team events

Region: Asia/Pacific
Country: India
