Associate Tech Architect - Snowflake

Trivandrum, Kerala, India


While technology is the heart of our business, a global and diverse culture is the heart of our success. We love our people and take pride in offering them a culture built on transparency, diversity, integrity, learning and growth.


If working in an environment that encourages you to innovate and excel, not just in your professional life but in your personal life as well, interests you, you will enjoy your career with Quantiphi!

Role & Responsibilities:

  • Design and implement data solutions (ETL pipelines, processing and transformation logic) using Snowflake as a key platform.

  • Design virtual warehouses optimized for performance and cost (auto-scaling, auto-suspend on idle, etc.).

  • Write SQL and stored procedures to implement business logic and other critical requirements such as encryption and data masking.

  • Implement solutions with features such as Snowpipe, tasks, streams and dynamic tables (see the sketch after this list).

  • Design solutions by leveraging key concepts of Snowflake's architecture, including its data distribution and partitioning mechanisms and its data caching.

  • Identify long-running queries, review the query history and logs, and provide solutions to optimize those queries.
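
For illustration only, here is a minimal sketch of the Snowflake features named above (a cost-aware virtual warehouse, a stream, and a scheduled task). Every object and column name (analytics_wh, raw.orders, orders_stream, curated.orders_clean, order_id, amount) is hypothetical, not part of any actual Quantiphi project:

  -- Hypothetical warehouse sized for cost: suspends after 60 s idle,
  -- resumes on demand, and scales out under concurrent load.
  CREATE WAREHOUSE IF NOT EXISTS analytics_wh
    WAREHOUSE_SIZE = 'SMALL'
    AUTO_SUSPEND = 60
    AUTO_RESUME = TRUE
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 3;

  -- Stream capturing row-level changes on a hypothetical source table.
  CREATE OR REPLACE STREAM orders_stream ON TABLE raw.orders;

  -- Task that runs every 5 minutes, but only when the stream has data.
  CREATE OR REPLACE TASK merge_orders
    WAREHOUSE = analytics_wh
    SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
  AS
    INSERT INTO curated.orders_clean (order_id, amount)
    SELECT order_id, amount
    FROM orders_stream
    WHERE METADATA$ACTION = 'INSERT';

  -- Tasks are created suspended and must be resumed explicitly.
  ALTER TASK merge_orders RESUME;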

Expected skills:

  • Must have:

    • 3+ years of hands-on experience with data structures, distributed systems, Hadoop and Spark, and SQL and NoSQL databases

    • Experience working on data warehouse/data lake projects using Snowflake (required) plus any of Redshift/BigQuery/Synapse/Databricks/Iceberg

    • Design and development of ETL pipelines (AWS/Azure/GCP)

    • Strong skills in writing SQL queries, stored procedures and functions in any SQL-supporting database

    • Strong software development skills in at least one of Python, PySpark or Scala

    • Any of Airflow, Oozie, Apache NiFi, Google Dataflow, message/event solutions or other job orchestration services

    • Experience in developing Big Data solutions (migration, storage, processing)

    • Experience in building and supporting large-scale systems in a production environment    

    • Cloud platforms: AWS/GCP/Azure

    • Big Data distributions and distributed processing frameworks: any of Apache Hadoop/CDH/HDP/AWS EMR/Google Dataproc/Databricks/HDInsight

  • Good to have: 

    • Experience working on data warehouse/data lake projects using any of Redshift/BigQuery/Synapse/Snowflake/Databricks/Iceberg

    • Tableau, Looker, Power BI

    • Any of Kafka, Kinesis or Cloud Pub/Sub; container orchestration

    • Requirements gathering and understanding of the problem statement

    • End-to-end ownership of the entire delivery of the project

    • Design and documentation of the solution

    • Team management and mentoring

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

