Senior Data Engineer

Noida Office

Clearwater Analytics

Clearwater Analytics is the leading provider of investment accounting software for reporting and reconciliation services for institutional investors.


Job Summary:

As a hands-on Data Engineer at Clearwater Analytics, you will play a crucial role in designing, developing, and maintaining data systems and architectures to collect, store, process, and analyse large volumes of data. You will build data pipelines, optimize data models, and ensure data quality and security. You will collaborate with cross-functional teams to meet business objectives and stay current with emerging technologies and industry best practices.

Responsibilities and Duties:

  • Extensive experience with Snowflake, including proficiency in Snow SQL CLI, Snowpipe, creating custom functions, developing Snowflake stored procedures, schema modeling, and performance tuning.
  • In-depth expertise in Snowflake data modeling and ELT processes using Snowflake SQL, as well as implementing complex stored procedures and leveraging Snowflake Task Orchestration for advanced data workflows.
  • Strong background in DBT CLI, DBT Cloud, and GitHub version control, with the ability to design and develop complex SQL processes and ELT pipelines.
  • Take a hands-on approach in designing, developing, and supporting low-latency data pipelines, prioritizing data quality, accuracy, reliability, and efficiency.
  • Advanced SQL knowledge and hands-on experience writing complex queries using analytical functions; troubleshooting, problem-solving, and performance tuning of SQL queries against the data warehouse; and strong knowledge of stored procedures.
  • Collaborate closely with cross-functional teams, including Enterprise Architects, Business Analysts, Product Owners, and Solution Architects, to gather comprehensive business requirements and translate them into scalable data cloud and Enterprise Data Warehouse (EDW) solutions that precisely align with organizational needs.
  • Play a hands-on role in conducting data modeling, ETL (Extract, Transform, Load) development, and data integration processes across all Snowflake environments.
  • Develop and implement comprehensive data governance policies and procedures to fortify the accuracy, security, and compliance of Snowflake data assets across all environments.
  • Capable of independently conceptualizing and developing innovative ETL and reporting solutions, driving them through to successful completion.
  • Create comprehensive documentation for database objects and structures to ensure clarity and consistency.
  • Troubleshoot and resolve production support issues post-deployment, providing effective solutions as needed.
  • Devise and sustain comprehensive data dictionaries, metadata repositories, and documentation to bolster governance and facilitate usage across all Snowflake environments.
  • Remain abreast of the latest industry trends and best practices, actively sharing knowledge and encouraging the team to continuously enhance their skills.
  • Continuously monitor the performance and usage metrics of the Snowflake database and Enterprise Data Warehouse (EDW), conducting frequent performance reviews and implementing targeted optimization efforts.
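
To give candidates a concrete sense of the "complex query writing using analytical functions" the role calls for, here is a minimal sketch of a windowed running total, the kind of pattern common in reconciliation-style reporting. The table and column names are hypothetical, and SQLite stands in for Snowflake here; both support the same ANSI window-function syntax.

```python
import sqlite3

# In-memory stand-in for a warehouse table; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (account TEXT, trade_date TEXT, amount REAL);
    INSERT INTO trades VALUES
        ('A', '2024-01-01', 100.0),
        ('A', '2024-01-02', 150.0),
        ('B', '2024-01-01',  80.0),
        ('B', '2024-01-03',  20.0);
""")

# Analytical (window) function: running total per account, ordered by date.
rows = conn.execute("""
    SELECT account,
           trade_date,
           SUM(amount) OVER (
               PARTITION BY account
               ORDER BY trade_date
           ) AS running_total
    FROM trades
    ORDER BY account, trade_date
""").fetchall()

for account, trade_date, running_total in rows:
    print(account, trade_date, running_total)
```

In Snowflake, tuning a query like this typically means checking the partitioning and ordering columns against the table's clustering, which is where the performance-tuning duties above come in.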

Skills Required:

  • Familiarity with big data concepts and with data warehouse architecture and design principles
  • Strong understanding of database management systems, data modeling techniques, data profiling and data cleansing techniques
  • Expertise in Snowflake architecture, administration, and performance tuning.
  • Experience with Snowflake security configurations and access controls.
  • Knowledge of Snowflake's data sharing and replication features.
  • Proficiency in SQL for data querying, manipulation, and analysis.
  • Experience with ETL (Extract, Transform, Load) tools and processes.
  • Ability to translate business requirements into scalable EDW solutions.
  • Experience with streaming technologies such as AWS Kinesis
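
The ELT experience listed above (as opposed to classic ETL) means loading raw records first and transforming them with SQL inside the warehouse afterward. A minimal sketch, with an in-memory SQLite database standing in for the warehouse and hypothetical table names:

```python
import sqlite3

# Extract step: raw values arrive untrimmed, as text (hypothetical source).
raw_records = [
    ("acct-1", " 100.50 "),
    ("acct-2", "  75.25 "),
    ("acct-1", "  24.50 "),
]

wh = sqlite3.connect(":memory:")

# Load step: land the raw data as-is, no cleansing yet.
wh.execute("CREATE TABLE raw_positions (account TEXT, amount TEXT)")
wh.executemany("INSERT INTO raw_positions VALUES (?, ?)", raw_records)

# Transform step: cleanse and aggregate entirely in SQL, post-load --
# in practice this is the layer a tool like dbt would manage.
wh.execute("""
    CREATE TABLE positions AS
    SELECT account, SUM(CAST(TRIM(amount) AS REAL)) AS total
    FROM raw_positions
    GROUP BY account
""")

totals = dict(wh.execute("SELECT account, total FROM positions"))
print(totals)
```

The design point is that transformations live as version-controlled SQL (the DBT CLI / GitHub skills above) rather than in an external transformation engine.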

Qualifications:

  • Bachelor's degree in Computer Science, Information Systems, or a related field. 
  • 4-10 years of hands-on experience in data warehousing, ETL development, and data modeling, with a strong track record of designing and implementing scalable Enterprise Data Warehouse (EDW) solutions.
  • 3+ years of extensive hands-on experience with Snowflake, demonstrating expertise in leveraging its capabilities.
  • Proficiency in SQL and deep knowledge of various database management systems (e.g., Snowflake, Azure, Redshift, Teradata, Oracle, SQL Server).
  • Experience with ETL tools and technologies such as dbt, Informatica, SSIS, and Talend.
  • Expertise in data modeling techniques, with a focus on dimensional modeling and star schema design.
  • Familiarity with data governance principles and adeptness in implementing security best practices.
  • Excellent problem-solving and troubleshooting abilities, coupled with a proven track record of diagnosing and resolving complex database issues.
  • Demonstrated leadership and team management skills, with the ability to lead by example and inspire others to strive for excellence.
  • Experience in the finance industry will be a significant advantage.
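
The dimensional modeling and star-schema design expertise listed above boils down to one fact table keyed to surrounding dimension tables. A hypothetical sketch (not a Clearwater schema), again using SQLite as a stand-in:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables: descriptive attributes, one row per entity.
    CREATE TABLE dim_security (
        security_key  INTEGER PRIMARY KEY,
        ticker        TEXT NOT NULL
    );
    CREATE TABLE dim_date (
        date_key      INTEGER PRIMARY KEY,
        calendar_date TEXT NOT NULL
    );
    -- Fact table: one row per holding per day, foreign-keyed to dimensions.
    CREATE TABLE fact_holdings (
        security_key INTEGER REFERENCES dim_security(security_key),
        date_key     INTEGER REFERENCES dim_date(date_key),
        market_value REAL
    );
""")

conn.execute("INSERT INTO dim_security VALUES (1, 'AAPL')")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
conn.execute("INSERT INTO fact_holdings VALUES (1, 20240101, 1000.0)")

# A typical star join: facts filtered and labelled through their dimensions.
row = conn.execute("""
    SELECT s.ticker, d.calendar_date, f.market_value
    FROM fact_holdings f
    JOIN dim_security s USING (security_key)
    JOIN dim_date d USING (date_key)
""").fetchone()
print(row)
```

The same layout carries over to Snowflake, where the narrow fact table and integer surrogate keys keep star joins cheap at warehouse scale.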


Region: Asia/Pacific
Country: India
