PySpark Data Engineer

Jaipur, India

Marktine Technology Solutions Pvt Ltd



Roles and Responsibilities:

  • Develop and maintain applications with PySpark.
  • Contribute to the overall design and architecture of the applications developed and deployed.
  • Perform performance tuning with respect to executor sizing and other environment parameters, code optimization, partition tuning, etc.
  • Interact with business users to understand requirements and troubleshoot issues.
  • Implement projects based on functional specifications.
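The partition-tuning duty above can be sketched with a small helper. This is an illustrative example, not part of the role description: the function name, the 128 MiB target (which mirrors Spark's default `spark.sql.files.maxPartitionBytes`), and the commented `repartition` call are all assumptions for the sketch.

```python
def estimate_partitions(total_size_bytes: int,
                        target_partition_bytes: int = 128 * 1024 * 1024,
                        min_partitions: int = 1) -> int:
    """Return a partition count that keeps each partition near the target size."""
    if total_size_bytes <= 0:
        return min_partitions
    # Ceiling division so the final, smaller chunk still gets its own partition.
    n = -(-total_size_bytes // target_partition_bytes)
    return max(min_partitions, n)

# In a PySpark job, one might repartition before a wide operation, e.g.:
#   df = df.repartition(estimate_partitions(input_size_bytes))

print(estimate_partitions(10 * 1024**3))  # 10 GiB at 128 MiB per partition -> 80
```

Keeping partitions near a uniform target size is one common heuristic for avoiding both tiny-task overhead and oversized shuffle blocks.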

Must-Have Skills:

  • Relevant Experience: 3-6 Years 
  • SQL - Mandatory
  • Python - Mandatory
  • SparkSQL - Mandatory
  • PySpark - Mandatory
  • Hive - Mandatory
  • HDFS and Spark - Mandatory
  • Scala - Advantage
  • Apache Airflow - Advantage  
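The executor-sizing parameters mentioned in the responsibilities are typically set at submission time. A minimal `spark-submit` sketch follows; the resource values, shuffle-partition count, and job path are placeholders, not requirements from this posting.

```shell
# Illustrative executor-sizing flags for a YARN cluster deployment.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 10 \
  --executor-cores 4 \
  --executor-memory 8g \
  --conf spark.sql.shuffle.partitions=200 \
  etl_job.py
```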


Requirements

3-6 Years of Experience
Must Have: PySpark/Spark, Python, SQL, knowledge of the Hadoop ecosystem
Good to have: Airflow, Scala

Category: Engineering Jobs

Tags: Airflow Architecture Hadoop HDFS PySpark Python Scala Spark SQL

Region: Asia/Pacific
Country: India
