Azure Data Architect

Bengaluru, Karnataka, India

Weekday

At Weekday, we help companies hire engineers who are vouched for by other software engineers. We enable engineers to earn passive income by leveraging and monetizing the unused information in their heads about the best people they have worked...


This role is for one of Weekday's clients.

Salary range: Rs 15,00,000 – Rs 40,00,000 (i.e., INR 15–40 LPA)

Min Experience: 8 years

Location: Bengaluru, Pune, Chennai, Kolkata, Gurgaon

Job type: Full-time

Requirements

Primary Roles and Responsibilities

  • Design and implement modern data warehouse solutions leveraging Databricks along with Azure and/or AWS cloud ecosystems.
  • Deliver innovative and scalable data engineering and analytics solutions.
  • Collaborate with Data Warehouse and BI teams to gather and translate ETL pipeline requirements.
  • Troubleshoot existing data pipelines by identifying root causes and implementing fixes.
  • Work closely with business stakeholders to understand reporting needs and design appropriate data models.
  • Mentor junior team members and support them in resolving technical challenges.
  • Lead technical discussions with client architects and internal teams to ensure alignment on solution design.
  • Manage and orchestrate data pipelines using tools like Apache Airflow.
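
For context on the orchestration work named in the last point, here is a minimal, illustrative Airflow DAG wiring a daily extract-transform-load sequence. The DAG id, task ids, and callables are hypothetical placeholders rather than details of the client's pipelines, and the sketch assumes Airflow 2.4+ (for the schedule keyword).

    # Minimal, illustrative Airflow DAG for a daily ETL run.
    # All names below are hypothetical examples, not the client's pipelines.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        """Pull raw records from a source system (placeholder)."""


    def transform():
        """Apply cleansing and business rules (placeholder)."""


    def load():
        """Write results to the warehouse (placeholder)."""


    with DAG(
        dag_id="example_daily_etl",   # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",            # Airflow 2.4+ keyword
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        # Linear dependency chain: extract -> transform -> load.
        t_extract >> t_transform >> t_load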

Skills and Qualifications

  • Bachelor’s or Master’s degree in Computer Science or a related field.
  • Minimum 6 years of overall IT experience, with at least 3 years in data warehouse/ETL development.
  • Strong understanding of dimensional data modeling, including Star and Snowflake schemas.
  • Deep knowledge of data management best practices and governance.
  • Hands-on expertise with Databricks Data & AI platform and Delta Lake architecture.
  • Proficient in SQL, Python, and Spark (especially PySpark).
  • Practical experience with Azure or AWS cloud stack, including services for data processing and orchestration.
  • Familiarity with batch and real-time data processing using tools such as AWS Kinesis.
  • Experience in developing data transformation and ETL pipelines.
  • Exposure to streaming technologies such as Apache Kafka.
  • Working knowledge of big data tools such as Hadoop, Hive, Pig, and Impala.
  • Experience with NoSQL databases such as MongoDB, Cassandra, or Neo4j.
  • Comfortable working with structured and unstructured data, including imaging and geospatial formats.
  • Experience with DevOps tools like Terraform, CircleCI, and Git for CI/CD implementation.
  • Proficient in writing complex SQL queries, PL/SQL, Unix shell scripting, and performance tuning.
  • Databricks certification (Associate or Professional Data Engineer) is a plus.
  • Agile methodology experience is preferred.
  • Excellent verbal and written communication skills.
  • Strong analytical thinking and problem-solving skills with a high attention to detail.

Mandatory Skills

  • Python
  • PySpark
  • Spark
  • Azure or AWS Databricks
  • Azure Data Factory
  • Azure Data Lake
  • Medallion architecture (see the sketch after this list)
  • Workflow orchestration (e.g., Airflow)
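
Because the list calls out Medallion architecture alongside PySpark and Delta Lake, here is a compact sketch of the bronze/silver/gold layering it refers to. All paths, column names, and quality rules are made-up stand-ins, and the sketch assumes a Spark session with Delta Lake available (e.g., a Databricks cluster).

    # Illustrative medallion (bronze/silver/gold) flow on Delta Lake.
    # Paths and columns are hypothetical, not the client's actual data.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

    # Bronze: land raw source data as-is, stamped with ingestion metadata.
    bronze = (
        spark.read.json("/landing/orders/")  # hypothetical source path
        .withColumn("_ingested_at", F.current_timestamp())
    )
    bronze.write.format("delta").mode("append").save("/lake/bronze/orders")

    # Silver: cleanse and conform (dedupe, basic quality rules).
    silver = (
        spark.read.format("delta").load("/lake/bronze/orders")
        .dropDuplicates(["order_id"])
        .filter(F.col("amount") > 0)
    )
    silver.write.format("delta").mode("overwrite").save("/lake/silver/orders")

    # Gold: business-level aggregates ready for BI and reporting.
    gold = (
        spark.read.format("delta").load("/lake/silver/orders")
        .groupBy("customer_id")
        .agg(F.sum("amount").alias("lifetime_value"))
    )
    gold.write.format("delta").mode("overwrite").save("/lake/gold/customer_value")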