Technical Lead - Data Engineering

INDIA - HYDERABAD - BIRLASOFT OFFICE, IN

Birlasoft

At Birlasoft, we combine the power of domain, enterprise, and digital technologies to reimagine business potential, surpassing expectations and breaking convention.

Area(s) of responsibility

About Us: Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company’s consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group’s 170-year heritage of building sustainable communities.


Job Summary
We are seeking a skilled Snowflake Developer with 8+ years of data engineering experience, including designing, developing, and optimizing Snowflake data solutions. The ideal candidate will have strong expertise in Snowflake SQL, ETL/ELT pipelines, and cloud data integration. This role involves building scalable data warehouses, implementing efficient data models, and ensuring high-performance data processing in Snowflake.

Key Responsibilities

1. Snowflake Development & Optimization
•    Design and develop Snowflake databases, schemas, tables, and views following best practices.
•    Write complex SQL queries, stored procedures, and UDFs for data transformation.
•    Optimize query performance using clustering keys, micro-partition pruning, and materialized views.
•    Implement Snowflake features (Time Travel, Zero-Copy Cloning, Streams & Tasks).
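
To give a flavour of the Streams & Tasks, Time Travel, and Zero-Copy Cloning work described above, a minimal Snowflake SQL sketch follows; every object name (raw.orders, analytics.orders, transform_wh) is a placeholder rather than part of any real environment.

    -- Capture row-level changes on a source table
    CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

    -- Scheduled task that merges the captured changes into a reporting table
    CREATE OR REPLACE TASK merge_orders
      WAREHOUSE = transform_wh
      SCHEDULE  = '5 MINUTE'
    AS
      MERGE INTO analytics.orders t
      USING raw.orders_stream s ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.status = s.status
      WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status);

    ALTER TASK merge_orders RESUME;

    -- Time Travel and Zero-Copy Cloning
    SELECT * FROM analytics.orders AT (OFFSET => -3600);        -- state one hour ago
    CREATE TABLE analytics.orders_dev CLONE analytics.orders;   -- metadata-only copy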

2. Data Pipeline Development
•    Build and maintain ETL/ELT pipelines using Snowflake, Snowpark, Python, or Spark.
•    Integrate Snowflake with cloud storage (Amazon S3, Azure Blob Storage) and data ingestion tools such as Snowpipe (see the sketch below).
•    Develop CDC (Change Data Capture) and real-time data processing solutions.
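
As an indicative sketch of the Snowpipe integration mentioned above (the bucket, stage, and table names are placeholders, and a real setup would also need a storage integration or credentials):

    -- External stage over a cloud storage location
    CREATE OR REPLACE STAGE raw.events_stage
      URL = 's3://example-bucket/events/'
      FILE_FORMAT = (TYPE = 'JSON');

    -- Pipe that continuously loads new files as they land in the stage
    CREATE OR REPLACE PIPE raw.events_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO raw.events
      FROM @raw.events_stage
      FILE_FORMAT = (TYPE = 'JSON');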

3. Data Modeling & Warehousing
•    Design star schema, snowflake schema, and data vault models in Snowflake.
•    Implement data sharing, secure views, and dynamic data masking (see the sketch below).
•    Ensure data quality, consistency, and governance across Snowflake environments.
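
A minimal sketch of the dynamic data masking and secure view items above, again with placeholder schema, table, and role names:

    -- Mask e-mail addresses for everyone outside a privileged role
    CREATE OR REPLACE MASKING POLICY pii.email_mask AS (val STRING)
      RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('DATA_ADMIN') THEN val ELSE '***MASKED***' END;

    ALTER TABLE analytics.customers
      MODIFY COLUMN email SET MASKING POLICY pii.email_mask;

    -- Secure view hides its definition and underlying details from consumers
    CREATE OR REPLACE SECURE VIEW analytics.customers_v AS
      SELECT customer_id, email, country FROM analytics.customers;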

4. Performance Tuning & Troubleshooting
•    Monitor and optimize Snowflake warehouse performance (scaling, caching, resource usage); see the sketch below.
•    Troubleshoot data pipeline failures, latency issues, and query bottlenecks.
•    Work with DevOps teams to automate deployments and CI/CD pipelines.
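
For the warehouse scaling and query troubleshooting items above, typical statements look roughly like the sketch below (the warehouse name is a placeholder, and the multi-cluster settings assume an edition that supports them):

    -- Resize the warehouse and let it scale out under concurrency
    ALTER WAREHOUSE transform_wh SET
      WAREHOUSE_SIZE    = 'LARGE'
      MIN_CLUSTER_COUNT = 1
      MAX_CLUSTER_COUNT = 3
      AUTO_SUSPEND      = 60;   -- suspend after 60 idle seconds

    -- Longest-running queries of the last day, from the account usage views
    SELECT query_id, total_elapsed_time / 1000 AS elapsed_seconds, query_text
    FROM snowflake.account_usage.query_history
    WHERE start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 20;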

5. Collaboration & Documentation
•    Work closely with data analysts, BI teams, and business stakeholders to deliver data solutions.
•    Document data flows, architecture, and technical specifications.
•    Mentor junior developers on Snowflake best practices.

Required Skills & Qualifications

•    8+ years in database development, data warehousing, or ETL.
•    4+ years of hands-on Snowflake development experience.
•    Strong SQL or Python skills for data processing.
•    Experience with Snowflake utilities (SnowSQL, Snowsight, Snowpark).
•    Knowledge of cloud platforms (AWS/Azure) and data integration tools (Coalesce, Airflow, dbt).
•    Certifications: SnowPro Core Certification (preferred).

Preferred Skills
•    Familiarity with data governance and metadata management.
•    Familiarity with dbt, Airflow, SSIS, and IICS.
•    Knowledge of CI/CD pipelines (Azure DevOps).
