Senior Lead Data Engineer

Pune, India

KONE

We are a global leader in the elevator and escalator industry. At KONE, we make people's journeys safe, convenient and reliable, in taller, smarter buildings.


We are looking for a

Senior Data Engineer, Data Products (Pune)

To join the team developing Data Foundation based data products for KONE. Our Data Foundation based data products are a key enabler in our digital transformation, creating the ability to develop new scalable analytics, AI and digital use cases by leveraging data across the whole KONE organization. Data products play a vital role in generating business value, and an optimized data architecture is crucial for ensuring the reusability of data assets on the cloud-based Data Foundation.

This is a hands-on, roll-up-your-sleeves position that requires a passion for data, engineering and DevOps. In this role you will be doing hands-on data engineering tasks, from new development to resolving technical issues and maintaining and optimizing workloads. We offer a chance to work hands-on with state-of-the-art cloud technology in a global, multi-cultural work environment, located in our office in Pune. We appreciate multi-cloud data engineering experience as part of your professional background.

We are searching for an enthusiastic person to join the team who is excited about developing their professional skills further, learning new things and contributing to team success. An ideal candidate has a strong background in data engineering, software engineering and data integration with a modern multi-cloud stack, but above all the will to commit to a DevOps mindset and reach goals together.

We expect you to take a self-driven, proactive approach to your work: find and implement solutions, continuously look for improvement opportunities in your own area, solve problems, make decisions and share learnings with colleagues. We want to work with people who enjoy teamwork, are not afraid to step out of their comfort zone, want to help others and share information.

To succeed in this role, the following professional experience will play a key role:

  • Master's degree in either software engineering, data engineering, computer science, or a related field
  • Hands-on data engineering professional experience (> 3 years)
    • Previous hands-on professional experience in developing and maintaining data pipelines on AWS, Azure and/or Databricks. Working proficiency with the tech stack: AWS, GitLab, Databricks, Airflow, SQL, Python, Scala and dbt for developing ETL jobs; AWS CDK and Terraform for IaC.
    • Hands-on development experience with lakehouse architecture based on Databricks and Delta Lake: a multi-hop medallion architecture that divides the data lake into bronze, silver and gold layers based on the quality and reusability of the data stored there, and data product publishing in Unity Catalog (see the sketch after this list).
    • Strong coding proficiency in multiple languages, SQL and Python; additional languages are a bonus. Ability to write compelling code and technical documentation, and to visualize your technical design.
    • Fluency in industry-standard DevOps practices and tools.
    • Practical experience working with enterprise data landscapes and data structures: structured data, unstructured data, metadata, master data, transactional data, batch/NRT. Experience with enterprise data sources, e.g. working with SAP ERP, Salesforce, product data management (PDM) and many others.
  • Way-of-working professional experience
    • Passion to utilize agile development methodologies and tools (Jira, Confluence, draw.io).
    • Inbuilt cybersecurity awareness and an understanding of data privacy and compliance regulations.
    • Ability to work in a global, multi-cultural team and to collaborate effectively within the team.
    • Ability to self-organize, take accountability and be proactive, seek feedback, be courageous and resilient, and have excellent problem-solving skills.
    • Experience of DataOps and ITSM processes.
    • Proficiency in spoken and written English language, and strong facilitation and communication skills.
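
For illustration only, not part of the formal requirements: below is a minimal sketch of what one bronze-to-silver hop in the multi-hop medallion architecture described above might look like on Databricks with PySpark and Delta Lake. The table names, columns and cleansing rules are hypothetical examples, not KONE's actual data model.

    # Minimal, hypothetical bronze -> silver hop on Databricks (PySpark + Delta Lake)
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook/job

    # Bronze: raw records ingested as-is from a source system (illustrative table name)
    bronze_df = spark.read.table("bronze.equipment_events")

    # Silver: deduplicated, typed and quality-filtered records, ready for reuse
    silver_df = (
        bronze_df
        .dropDuplicates(["event_id"])                         # drop duplicate ingestions
        .withColumn("event_ts", F.to_timestamp("event_ts"))   # enforce a proper timestamp type
        .filter(F.col("equipment_id").isNotNull())            # basic data quality rule
    )

    # Publish the silver table; gold-layer data products would build on top of it
    silver_df.write.format("delta").mode("overwrite").saveAsTable("silver.equipment_events")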

The position is based in Pune, India.

*********** KONE DATA ENGINEER ROLE-RELATED RESPONSIBILITIES ***********

Quality focus

  • Responsible for the design and implementation of data pipelines according to business/analytics needs and best practices
  • Responsible for ensuring that data pipelines are monitored and reliable
  • Responsible for fixing defects in a timely manner
  • Responsible for assembling data sets into a useful format for analysis using fit-for-purpose database technologies
  • Responsible for building services and tools to make data more accessible to all data consumers
  • Responsible for the documentation of data transformations, data models, and data flows
  • Responsible for following the KONE cybersecurity guidelines

Collaboration focus

  • Works with cross-functional analytics, business, and technology teams to deliver scalable successes
  • Handles code reviews in the team
  • Continuously looks for improvement opportunities in own area and shares them with the rest of the team; explains own work and resulting conclusions both orally and in writing

Planning focus

  • Decomposes problems into component parts and effectively solves well-scoped problems
  • Participates and actively contributes to agile ceremonies such as daily stand-ups, sprint planning & retros
  • Participates in backlog grooming and story estimations

Accountabilities and Decisions

  • Responsible for understanding the project goals, data, methods, and their limitations
  • Responsible for taking the initiative on what needs to be done without being asked
  • Responsible for seeing opportunities to solve problems within the scope of work for data engineering
  • Responsible for planning and designing the implementation and identifying the required data sources within own scope
  • Responsible for following the KONE cybersecurity guidelines
  • Accountable for adequate test coverage for backlog items in own scope
  • Accountable for documenting the code in design documents and in the code itself
  • Accountable for reviewing peer deliverables as planned
  • Accountable for following agreed best practices and guidelines
  • Accountable for defect fixing of the implementation in own scope
  • Accountable for defining the test cases
  • Accountable for providing knowledge transfer to production

At KONE, we are focused on creating an innovative and collaborative working culture where we value the contribution of each individual. Employee engagement is a key focus area for us, and we encourage participation and the sharing of information and ideas. Sustainability is an integral part of our culture and our daily practice. We follow ethical business practices and seek to develop a culture of working together where co-workers trust and respect each other and good performance is recognized. As a great place to work, we are proud to offer a range of experiences and opportunities that will help you to achieve your career and personal goals and enable you to live a healthy and balanced life.

Read more on www.kone.com/careers



