Senior Data Engineer

Pune

SiteMinder




At SiteMinder we believe the individual contributions of our employees are what drive our success. That’s why we hire and encourage diverse teams that include and respect a variety of voices, identities, backgrounds, experiences and perspectives. Our diverse and inclusive culture enables our employees to bring their unique selves to work and be proud of doing so. It’s in our differences that we will keep revolutionising the way for our customers. We are better together!

What We Do…

We’re people who love technology but know that hoteliers just want things to be simple. So since 2006 we’ve been constantly innovating our world-leading hotel commerce platform to help accommodation owners find and book more guests online - quickly and simply.


We’ve helped everyone from boutique hotels to big chains, enabling travellers to book igloos, cabins, castles, holiday parks, campsites, pubs, resorts, Airbnbs, and everything in between.


And today, we’re the world’s leading open hotel commerce platform, supporting 47,000 hotels in 150 countries - with over 125 million reservations processed by SiteMinder’s technology every year.

Senior Data Engineer

Roles and Responsibilities 

The Enterprise Data Management and Analytics (EDMA) team at SiteMinder is the central BI and analytics team, catering to analytical needs across the company.

As a Senior Data Engineer, you will be part of the EDMA team, responsible for data transformation, data pipelines, and the underlying data infrastructure that powers them.

To succeed in this role you must have:

  • 5+ years of experience as a Data Engineer, with at least 2-3 end-to-end data implementations delivered.

  • Solid experience implementing data transformations using dbt on Snowflake or Redshift. (Must have)

  • Solid experience implementing scalable data pipelines with Airflow. (Must have)

  • Experience building configuration-based, metadata-driven ingestion/transformation frameworks.

  • Strong technical ability to understand, design, build, test and debug complex code in Python, Spark and SQL. (Must have)

  • Experience with a CI/CD-based development lifecycle using tools such as Git, Buildkite, Jenkins, TeamCity or others.

  • Ability to work as an individual contributor with minimal support from peers or leads. (Must have)

  • Ability to communicate with business stakeholders, understand their requirements, and build solutions that solve those business problems.

  • We need engineers who understand the business and build solutions to business problems, rather than acting purely as technical executors.
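To illustrate the metadata-driven framework style mentioned above, here is a minimal Python sketch: each source table is declared as a config entry, and the framework generates load statements from that metadata instead of hand-writing one pipeline per table. The config keys, table names and the `render_copy_sql` helper are illustrative assumptions, not SiteMinder's actual implementation.

```python
# A minimal sketch of a configuration-based, metadata-driven ingestion
# framework. All names here (schemas, config keys, helpers) are
# hypothetical examples, not a real production design.

# Each entry declares one source table; the framework derives the load
# logic from this metadata rather than from per-table pipeline code.
INGESTION_CONFIG = [
    {
        "source": "salesforce",
        "table": "accounts",
        "load_type": "incremental",
        "cursor_column": "last_modified",
    },
    {"source": "marketo", "table": "leads", "load_type": "full"},
]


def render_copy_sql(entry: dict) -> str:
    """Render a (hypothetical) staging-load statement from one config entry."""
    target = f"staging.{entry['source']}__{entry['table']}"
    if entry["load_type"] == "incremental":
        # Incremental loads filter on a watermark supplied at run time.
        predicate = f" WHERE {entry['cursor_column']} > :watermark"
    else:
        predicate = ""
    return (
        f"INSERT INTO {target} "
        f"SELECT * FROM raw.{entry['source']}_{entry['table']}{predicate}"
    )


def build_pipeline(config: list[dict]) -> list[str]:
    """One generated statement per configured source table."""
    return [render_copy_sql(entry) for entry in config]
```

In practice, each config entry would typically map to a generated Airflow task and a downstream dbt model, so adding a new source becomes a config change rather than new pipeline code.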

Bonus points if you have:

  • Experience implementing a Data Lakehouse architecture on Snowflake or the AWS stack.

  • Experience integrating data from Salesforce, Marketo, Intercom and Heap.

  • Experience with native Snowflake platform features.

Does this job sound like you? If yes, we'd love for you to be part of our team! Please send a copy of your resume and our Talent Acquisition team will be in touch.

When you apply, please tell us the pronouns you use and any adjustments you may need during the interview process. We encourage people from underrepresented groups to apply.

