Fabric Developer

Telangana, India

Zensar

Zensar is a global organization that conceptualizes, builds, and manages digital products through experience design, data engineering, and advanced analytics for over 200 leading companies. Our solutions leverage industry-leading platforms to...



Job Title: Fabric Developer

Experience: 3-7 years

No. of positions: 1

  • Understand requirements and perform data analysis.
  • Set up Microsoft Fabric and its components.
  • Build secure, scalable solutions across the Microsoft Fabric platform.
  • Create and manage Lakehouses.
  • Implement Data Factory processes for scalable ETL and data integration.
  • Design, implement, and manage comprehensive data warehousing solutions for analytics using Fabric.
  • Create and schedule data pipelines using Azure Data Factory.
  • Build robust data solutions using Microsoft data engineering tools.
  • Create and manage Power BI reports and semantic models.
  • Write and optimize complex SQL queries to extract and analyze data, ensuring efficient data processing and accurate reporting.
  • Work closely with customers, business analysts, and technology and project teams to understand business requirements; drive the analysis and design of quality technical solutions that align with business and technology strategies and comply with the organization's architectural standards.
  • Understand and follow change management procedures to implement project deliverables.
  • Coordinate with support groups to resolve issues with a quick turnaround.


Mandatory

  • Bachelor’s degree in Computer Science or a similar field, or equivalent work experience.
  • 3+ years of experience working with Microsoft Fabric.
  • Expertise in working with OneLake and Lakehouses.
  • Strong understanding of Power BI reports and semantic models in Fabric.
  • Proven record of building ETL and data solutions using Azure Data Factory.
  • Strong understanding of data warehousing concepts and ETL processes.
  • Hands-on experience building data warehouses in Fabric.
  • Strong skills in Python and PySpark.
  • Practical experience implementing Spark in Fabric, scheduling Spark jobs, and writing Spark SQL queries.
  • Knowledge of real-time analytics in Fabric.
  • Experience using Data Activator for effective data asset management and analytics.
  • Ability to flex and adapt to different tools and technologies.
  • Strong learning attitude.
  • Good written and verbal communication skills.
  • Demonstrated experience working in a team spread across multiple locations.

Preferable

  • Knowledge of AWS services.
  • Knowledge of Snowflake.

 

Location: NOIDA

Timings: 2:00 PM to 10:30 PM

Cab Facility provided: Yes



Category: Engineering Jobs


Region: Asia/Pacific
Country: India