Snowflake Data Engineer | Remote-Friendly

Pune, Maharashtra, India - Remote


Velotio Technologies is a product engineering company working with innovative startups and enterprises. We are a certified Great Place to Work® and recognized as one of the best companies to work for in India. We have provided full-stack product development for 110+ startups across the globe building products in the cloud-native, data engineering, B2B SaaS, IoT & Machine Learning space. Our team of 325+ elite software engineers solves hard technical problems while transforming customer ideas into successful products.

Requirements

  • The Snowflake Data Engineer will be responsible for architecting and implementing very large-scale data intelligence solutions on the Snowflake Data Warehouse.
  • Solid experience in architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is a must.
  • Professional knowledge of AWS Redshift.
  • Experience developing ETL pipelines into and out of the data warehouse using a combination of Python and Snowflake's SnowSQL.
  • Experience writing SQL queries against Snowflake.
  • Provide production support for Data Warehouse issues such as data load and transformation/translation problems.
  • Translate BI and reporting requirements into database and report designs.
  • Understand data transformation and translation requirements and which tools to leverage to get the job done.
  • Understand data pipelines and modern approaches to automating them with cloud-based tooling.
  • Test and clearly document implementations so that others can easily understand the requirements, implementation, and test conditions.
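As a rough illustration of the Python-plus-SnowSQL ETL pattern described above, here is a minimal sketch of one load step: transform records in Python, serialize them for a stage, and build the `COPY INTO` statement that SnowSQL or the Snowflake Python connector would execute. All table, stage, and column names are hypothetical, not part of this role's actual stack.

```python
# Hypothetical sketch of a single ETL step for a Snowflake load.
# Table, stage, and column names are illustrative only.
import csv
import io

def transform(rows):
    """Normalize raw records before loading: trim/lowercase emails, uppercase country codes."""
    return [
        {"id": r["id"], "email": r["email"].strip().lower(), "country": r["country"].upper()}
        for r in rows
    ]

def to_csv(rows):
    """Serialize transformed rows to an in-memory CSV, ready to PUT to a Snowflake stage."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "email", "country"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def copy_into_sql(table, stage, file_name):
    """Build the COPY INTO statement that loads the staged file into the target table."""
    return (
        f"COPY INTO {table} FROM @{stage}/{file_name} "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )

raw = [{"id": 1, "email": "  Ada@Example.COM ", "country": "in"}]
clean = transform(raw)
staged = to_csv(clean)
print(copy_into_sql("customers", "etl_stage", "customers.csv"))
```

In a real pipeline the generated statement would be executed through SnowSQL or `snowflake.connector` after uploading the file to the stage; this sketch only covers the transform and statement-building portion so it stays self-contained.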

Desired Skills & Experience
  • Experience designing and implementing fully operational, production-grade, large-scale data solutions on the Snowflake Data Warehouse.
  • 5+ years of hands-on experience building productionized data ingestion and processing pipelines using Java, Spark, Scala, and Python.
  • Experience designing and implementing production-grade data warehousing solutions on large-scale data technologies.
  • Proficiency in data modelling is a must.
  • Expertise in and an excellent understanding of Snowflake internals, and of integrating Snowflake with other data processing and reporting technologies.
  • Excellent presentation and communication skills, both written and verbal.
  • Ability to problem-solve and architect solutions in an environment with unclear requirements.

Benefits

Our Culture:

  • We have an autonomous and empowered work culture encouraging individuals to take ownership and grow quickly
  • Flat hierarchy with fast decision making and a startup-oriented “get things done” culture
  • A strong, fun & positive environment with regular celebrations of our success. We pride ourselves on creating an inclusive, diverse & authentic workplace

Note: Currently, all interviews and onboarding processes at Velotio are being carried out remotely through virtual meetings.


Category: Engineering Jobs

Tags: AWS Data pipelines Data warehouse Data Warehousing Engineering ETL Java Machine Learning Pipelines Python Redshift Scala Snowflake Spark SQL Testing

Perks/benefits: Career development Flat hierarchy Startup environment

Regions: Remote/Anywhere Asia/Pacific
Country: India
