Data Architect (Snowflake) - 2152 (Remote)

Chennai, India

CES has 26+ years of experience delivering Software Product Development, Quality Engineering, and Digital Transformation Consulting Services to global SMEs and large enterprises. CES serves some of the leading Fortune 500 companies across Automotive, AgTech, Bio Science, EdTech, FinTech, Manufacturing, Online Retail, and Investment Banking. These are long-term relationships of more than 10 years, nurtured not only by our commitment to the timely delivery of quality services but also by our investments and innovations in our customers' technology roadmaps. As an organization, we are in an exponential growth phase, with a consistent focus on continuous improvement, a process-oriented culture, and a true partnership mindset with our customers. We are looking for qualified, committed individuals to play an exceptional role in supporting our accelerated growth. You can learn more about us at: http://www.cesltd.com/
Job Role: We are looking for a Snowflake Architect to perform mission-critical duties, including setting data engineering strategy and contributing to and leading the development of our Enterprise Data and Analytics Platforms. The ideal candidate is a passionate professional who can blend the ever-evolving cloud and advanced analytics technology landscape with the responsibility of providing input to management on implementation and administration methodologies.

Responsibilities:
  • Deep understanding of relational as well as NoSQL data stores, and of modelling methods and approaches (star and snowflake schemas, dimensional modelling).
  • Experience with data security and data access control design in Snowflake.
  • Proficiency in complex SQL, Unix shell/Python scripting, performance tuning, and database optimization techniques.
  • Must have expertise in AWS and dbt Cloud and their integration with Snowflake to load and unload data (a minimal load/unload sketch follows this list).
  • Implement cloud-based enterprise data warehouse solutions that combine Snowflake with multiple data platforms, and build the data movement strategy.
  • Lead the development of project requirements for end-to-end data integration processes using ETL for structured, semi-structured, and unstructured data.
  • Make ongoing updates to modeling principles, processes, solutions, and best practices so that the Snowflake platform stays aligned with the business needs of the environment.
  • Act as the data domain expert for Snowflake in a collaborative environment, demonstrating an understanding of data management best practices and patterns.
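The AWS/dbt Cloud load/unload integration above comes down to external stages and COPY INTO. A minimal sketch, assuming an existing storage integration and using placeholder names throughout (s3_int, the example-landing-bucket, and the analytics.raw objects are all hypothetical):

    -- Hypothetical external stage over an S3 bucket (names are placeholders).
    CREATE OR REPLACE STAGE analytics.raw.s3_landing
      URL = 's3://example-landing-bucket/sales/'
      STORAGE_INTEGRATION = s3_int   -- assumes an integration named s3_int already exists
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- Load: copy staged files into a target table.
    COPY INTO analytics.raw.sales
      FROM @analytics.raw.s3_landing
      PATTERN = '.*[.]csv';

    -- Unload: export query results back to S3.
    COPY INTO @analytics.raw.s3_landing/exports/
      FROM (SELECT * FROM analytics.raw.sales WHERE sale_date >= '2024-01-01')
      OVERWRITE = TRUE;
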
Ideal Candidate:
  • 8+ years of experience implementing data management solutions (required); data platform architecture and consulting experience is highly desired. Technologies include:
  • Snowflake concepts such as setting up resource monitors, RBAC controls, and scalable virtual warehouses; SQL performance tuning; zero-copy cloning; Time Travel; and automating all of the above (see the governance and cloning sketches after this list).
  • Handling semi-structured data (JSON, XML) and columnar Parquet using the VARIANT data type in Snowflake (see the VARIANT sketch below).
  • Re-clustering data in Snowflake, with a good understanding of micro-partitions (see the clustering sketch below).
  • Migration to Snowflake from on-premises database environments.
  • Designing and building manual or auto-ingest data pipelines using Snowpipe (see the Snowpipe sketch below).
  • Cloud technologies such as AWS S3, SQS, EC2, Lambda, Redshift, and RDS.
  • Designing and developing automated monitoring processes on Snowflake using a combination of dbt, dbt Cloud, Python, PySpark, and Bash with SnowSQL (a monitoring sketch closes the examples below).
  • Experience with SnowSQL, including developing stored procedures and writing queries to analyze and transform data.
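To make the expectations above concrete, here is a minimal governance sketch covering resource monitors, a scalable (multi-cluster) virtual warehouse, and basic RBAC; every object name is hypothetical, and multi-cluster scaling assumes an Enterprise-edition account:

    -- Suspend attached warehouses when 100% of a monthly credit quota is used.
    CREATE OR REPLACE RESOURCE MONITOR analytics_rm
      WITH CREDIT_QUOTA = 100
           FREQUENCY = MONTHLY
           START_TIMESTAMP = IMMEDIATELY
           TRIGGERS ON 80 PERCENT DO NOTIFY
                    ON 100 PERCENT DO SUSPEND;

    -- Auto-scaling warehouse attached to the monitor.
    CREATE OR REPLACE WAREHOUSE analytics_wh
      WAREHOUSE_SIZE = 'MEDIUM'
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 3      -- multi-cluster requires Enterprise edition
      AUTO_SUSPEND = 300         -- seconds idle before suspending
      AUTO_RESUME = TRUE
      RESOURCE_MONITOR = analytics_rm;

    -- Minimal RBAC: one role that can use the warehouse and read one schema.
    CREATE ROLE IF NOT EXISTS analyst_role;
    GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE analyst_role;
    GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
    GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_role;
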
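Zero-copy cloning and Time Travel in the same spirit; the database, table, and <query_id> placeholder are illustrative:

    -- Zero-copy clone: instant, storage-efficient environment copy.
    CREATE OR REPLACE DATABASE analytics_dev CLONE analytics;

    -- Time Travel: read the table as it existed 30 minutes ago.
    SELECT COUNT(*) FROM analytics.marts.orders AT(OFFSET => -60 * 30);

    -- Recover the pre-mistake state alongside the current table.
    CREATE TABLE analytics.marts.orders_restored
      CLONE analytics.marts.orders BEFORE(STATEMENT => '<query_id>');
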
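For semi-structured data, everything lands in a VARIANT column and is queried with path expressions and LATERAL FLATTEN; the raw.events table and its payload shape are assumptions for illustration:

    -- Landing table with a single VARIANT column.
    CREATE OR REPLACE TABLE raw.events (
      payload   VARIANT,
      loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
    );

    -- Path expressions read into the JSON; FLATTEN explodes nested arrays.
    SELECT
      payload:user.id::NUMBER    AS user_id,
      payload:event_type::STRING AS event_type,
      f.value:sku::STRING        AS sku
    FROM raw.events,
         LATERAL FLATTEN(INPUT => payload:items) f;

    -- Parquet arrives the same way: each row is one VARIANT ($1).
    COPY INTO raw.events (payload)
      FROM (SELECT $1 FROM @raw.parquet_stage)
      FILE_FORMAT = (TYPE = PARQUET);
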
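Re-clustering in current Snowflake is handled by the automatic clustering service once a clustering key is defined; the sketch sets a key and inspects micro-partition health (the table and key columns are hypothetical):

    -- Co-locate related rows in micro-partitions.
    ALTER TABLE analytics.marts.orders CLUSTER BY (order_date, region);

    -- Average clustering depth near 1 indicates a well-clustered table.
    SELECT SYSTEM$CLUSTERING_INFORMATION('analytics.marts.orders', '(order_date, region)');
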
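An auto-ingest Snowpipe sketch, reusing the hypothetical stage and table from the load/unload example; with AUTO_INGEST = TRUE, S3 event notifications are routed to Snowflake via SQS, configured on the bucket using the notification channel ARN that SHOW PIPES reports:

    CREATE OR REPLACE PIPE analytics.raw.sales_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO analytics.raw.sales
      FROM @analytics.raw.s3_landing
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- notification_channel in this output is the SQS ARN to wire up in S3.
    SHOW PIPES LIKE 'sales_pipe' IN SCHEMA analytics.raw;
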
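Finally, one way (among the dbt/Python/Bash options named above) to automate monitoring purely in SnowSQL: a Snowflake Scripting stored procedure driven by a scheduled task. The ops schema, ops_wh warehouse, slow_queries table, and 300-second threshold are all assumptions:

    -- Record queries from the last hour that exceeded a runtime threshold.
    CREATE OR REPLACE PROCEDURE ops.flag_slow_queries(threshold_s NUMBER)
    RETURNS NUMBER
    LANGUAGE SQL
    AS
    $$
    DECLARE
      inserted NUMBER DEFAULT 0;
    BEGIN
      INSERT INTO ops.slow_queries
        SELECT query_id, user_name, total_elapsed_time / 1000 AS elapsed_s
        FROM snowflake.account_usage.query_history   -- note: this view can lag ~45 minutes
        WHERE start_time >= DATEADD('hour', -1, CURRENT_TIMESTAMP())
          AND total_elapsed_time / 1000 > :threshold_s;
      inserted := SQLROWCOUNT;
      RETURN inserted;
    END;
    $$;

    -- Run the check hourly on a small warehouse.
    CREATE OR REPLACE TASK ops.monitor_slow_queries
      WAREHOUSE = ops_wh
      SCHEDULE = '60 MINUTE'
    AS
      CALL ops.flag_slow_queries(300);

    ALTER TASK ops.monitor_slow_queries RESUME;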
