Data Engineer

India

Twinings


Data Engineer

Application Deadline: 7 November 2024

Department: BizTech

Employment Type: Permanent - Full Time

Location: India


Description

Great People Work Here.

Are you searching for a career with bags of variety, in an environment that celebrates differences and empowers collaboration, values individuals and will encourage you to make an impact? Do you want the freedom to explore, and the opportunity to find new ways to innovate? If so, Twinings Ovo delivers.

We’re looking for people who don’t just come here to get the job done, but who have a real passion for the brand and a desire to do the best job they can. In return, we offer an inspiring package of employee benefits - to show just how much we value you. This role will offer you the scope for growth and the tools to aim high.

The Data Engineer plays a vital role in building reliable, repeatable data pipelines and in modelling data correctly for end-user dashboards and reports. Acting as an expert in end-to-end data movement and modelling, they use their experience of building expandable, repeatable and forward-looking technical artifacts to propose and implement data solutions at Twinings Ovo. They will also need to liaise directly with business colleagues and users to elicit analytics requirements, propose solutions, and then architect and build alongside the rest of the Data Analytics team.

Key Responsibilities

  • The Data Engineer will be responsible for the delivery of the Data Collection and Transformation workstream within both larger, strategic projects and smaller, tactical initiatives.
  • Provide input into the Data Modelling workstream, building towards the Information Model in line with the overall Data Strategy.
  • Create and own the Source to Target document and the ETL technical definition document.
  • Work closely with the Data Architect, Data Visualization Engineer and the D&A Product Owner to maintain Source to Target mappings, ETL technical definition documents and the Report Technical Definition document.
  • Assist the D&A Product Owner in the Requirements phase to ensure proposed solutions/outputs are aligned with the overall Data Strategy.
  • Drive ideation and prove out solutions and technologies with well-thought-out Proofs of Concept with clear objectives and timelines; be conversant with Agile methodologies such as Scrum and sprint-based delivery.
  • Provide training for internal resources on concepts around data storage, modelling and data access.
  • Drive communication and dialogue between business and tech teams to agree outputs.
  • Build an understanding of future needs, direction, ambition and risks for each BU and functional area and communicate with technical teams to inform architecture.
  • Participate in sprint planning sessions with the project team, aligning dependencies and agreed timelines.
  • Build and maintain an understanding of the analytics technology landscape, researching emerging trends and how to exploit them for business benefit and value.
  • Input into creating compelling business cases:
    • Build an understanding of Twinings Ovo's value drivers to help clearly articulate project benefits, i.e., the value proposition.
    • Help business partners/product owners communicate the solution's value to key stakeholders.
  • Prototype and co-design innovative data & analytics solutions. Collaborate with the Product Owner to partner with stakeholders across the business and technology teams, while ensuring solutions are consistent with the overall business and IT strategy. This may include the evaluation of 3rd-party solutions.
  • Identify opportunities for business-led delivery. Provide consultative support on business-led technology projects. 
  • Deliver solutions that support business objectives, obtaining input from the Data & Analytics Architecture team around architecture and capabilities. Input into the future roadmap from both a delivery and a capability perspective. Support Twinings Ovo's data-driven journey.
  • Identify solution interdependencies and commonalities across the organisation, and opportunities for creating reusable Data services. Guide solution options and business partner decision making towards target architecture. 
  • Support the analysis and evaluation of the possible impact of changes on the existing application landscape and organisation (maintaining the Key Users Network and the respective Business Owners' organisations).
  • Support the Project Managers in identifying project risks, impacts and mitigating actions.
  • Support and actively engage in the end-to-end delivery of Analytics projects, from design, build, test and deployment through to transition into BAU support.
  • Support test activities by ensuring acceptance criteria are well documented and have the appropriate level of coverage, taking responsibility for the design and execution of test cases where appropriate. 
  • Work with the wider project teams to enable the solutions to be embedded within the business through the development of communications and training materials. Ensure system and process documentation is effective and up to date.
  • Collaborate with the Project Managers and business stakeholders during the warranty phase by ensuring that any defects or enhancement requests are well defined and prioritised appropriately. 
  • Ensure that all working practices comply with Health and Safety legislation and Twinings policies.
  • Work independently where required.
  • Act as a role model for Twinings Ovo values: champion the BizTech culture and nurture and develop both business and technical capabilities.

Skills, Knowledge and Expertise

  • 5 to 10 years' experience in a data engineering role.
  • Experience with cloud-first ETL tools such as Azure Data Factory, Matillion, Informatica Cloud, etc.
  • Experience with Talend ETL would be beneficial.
  • Expertise working with analytics databases such as Snowflake, MS Synapse, etc.
  • Data Modelling experience, including creating star/snowflake schemas based on requirements.
  • Database skills, including SQL scripting, stored procedures, etc.
  • Able to model data for reporting and consumption by front-end tools. Power BI experience would be beneficial.
  • Experience working in a cloud-based environment; MS Azure or AWS preferred.
  • Understanding of data lakes, including partitioning, bucketing and salting of data.
  • Understanding of various data lake storage formats, including Parquet, ORC, Avro etc.
  • Intermediate understanding of Big Data technologies like Spark, Databricks etc., including Data Lake and Data Warehousing methodologies.
  • Beginner/intermediate skills with a programming language, preferably Python; a brief, purely illustrative sketch of this kind of pipeline work follows this list.
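
As a brief illustration of the kind of pipeline work described above, the minimal PySpark sketch below reads raw Parquet files from a data lake, applies a light transformation, and writes the result back partitioned by date. The storage paths, column names and aggregation are hypothetical and are shown only to indicate the shape of the work, not a specific Twinings Ovo pipeline.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical lake locations, for illustration only.
RAW_PATH = "abfss://lake@exampleaccount.dfs.core.windows.net/raw/sales"
CURATED_PATH = "abfss://lake@exampleaccount.dfs.core.windows.net/curated/sales_daily"

spark = SparkSession.builder.appName("sales-daily-load").getOrCreate()

# Read raw Parquet files from the data lake.
raw = spark.read.parquet(RAW_PATH)

# Light transformation: derive a date column and aggregate to a daily grain.
daily = (
    raw.withColumn("order_date", F.to_date("order_timestamp"))
       .groupBy("order_date", "store_id")
       .agg(F.sum("net_amount").alias("daily_net_sales"))
)

# Write back to the curated zone, partitioned by date so downstream
# analytics databases and BI tools can prune partitions efficiently.
(
    daily.write
         .mode("overwrite")
         .partitionBy("order_date")
         .parquet(CURATED_PATH)
)
```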


Benefits

  • Monthly phone bill reimbursement of up to Rs. 3,000.
  • Annual health check-up for employee and spouse, including doctor consultation; reimbursement of up to INR 15,000.
  • Medical insurance with flat coverage of INR 5 lakhs.
  • PF and Gratuity.
  • Long Service Policy.
  • Term Life Policy.
  • Monthly broadband bill reimbursement of Rs. 2,000 or actuals, whichever is lower.
