Performance Testing Specialist – Databricks – Pune

Hyderabad, IN


Description

Performance Testing Specialist – Databricks Pipelines

Key Responsibilities:

  • Design and execute performance testing strategies specifically for Databricks-based data pipelines.
  • Identify performance bottlenecks and provide optimization recommendations across Spark/Databricks workloads.
  • Collaborate with development and DevOps teams to integrate performance testing into CI/CD pipelines.
  • Analyze job execution metrics, cluster utilization, memory/storage usage, and latency across various stages of data pipeline processing.
  • Create and maintain performance test scripts, frameworks, and dashboards using tools like JMeter, Locust, or custom Python utilities.
  • Generate detailed performance reports and suggest tuning at the code, configuration, and platform levels.
  • Conduct root cause analysis for slow-running ETL/ELT jobs and recommend remediation steps.
  • Participate in production issue resolution related to performance and contribute to RCA documentation.

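As an illustration of the "custom Python utilities" mentioned above, here is a minimal, hypothetical sketch (stdlib only; all names are illustrative, not part of any required toolchain) of a timer that collects wall-clock durations across repeated pipeline runs and reports percentile latency, the kind of metric a performance report would include:

```python
import time
from contextlib import contextmanager


class RunTimer:
    """Collects wall-clock durations for repeated pipeline runs."""

    def __init__(self):
        self.samples = []

    @contextmanager
    def measure(self):
        # Record one run's duration using a monotonic clock.
        start = time.perf_counter()
        yield
        self.samples.append(time.perf_counter() - start)

    def percentile(self, p):
        """Return the p-th percentile of recorded durations (seconds)."""
        ordered = sorted(self.samples)
        idx = max(0, int(round(p / 100 * len(ordered))) - 1)
        return ordered[idx]


timer = RunTimer()
for _ in range(20):
    with timer.measure():
        time.sleep(0.01)  # stand-in for one pipeline run

print(f"p95 latency: {timer.percentile(95):.3f}s")
```

In practice a utility like this would wrap Databricks job invocations rather than a sleep, and feed its samples into a dashboard.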

Matrix is a global, dynamic, and fast-growing leader in technical consultancy and technology services, employing over 13,000 professionals worldwide. Since its founding in 2001, Matrix has expanded through strategic acquisitions and significant ventures, cementing its position as a pioneer in the tech industry.

We specialize in developing and implementing cutting-edge technologies, software solutions, and products. Our offerings include infrastructure and consulting services, IT outsourcing, offshore solutions, training, and assimilation. Matrix also proudly represents some of the world's leading software vendors.

With extensive experience spanning both private and public sectors—such as Finance, Telecom, Healthcare, Hi-Tech, Education, Defense, and Security—Matrix serves a distinguished clientele in Israel and an ever-expanding global customer base.

Our success stems from a team of talented, creative, and dedicated professionals who are passionate about delivering innovative solutions. We prioritize attracting and nurturing top talent, recognizing that every employee’s contribution is essential to our success. Matrix is committed to fostering a collaborative and inclusive work environment where learning, growth, and shared success thrive.

Join the winning team at Matrix! Here, you’ll find a challenging yet rewarding career, competitive compensation and benefits, and opportunities to be part of a highly respected organization—all while having fun along the way.

 

To Learn More, Visit: www.matrix-ifs.com

 

EQUAL OPPORTUNITY EMPLOYER: Matrix is an Equal Opportunity Employer and Prohibits Discrimination and Harassment of Any Kind. Matrix is committed to the principle of equal employment opportunity for all employees, providing employees with a work environment free of discrimination and harassment. All employment decisions at Matrix are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion or belief, family or parental status, or any other status protected by the laws or regulations in our locations. Matrix will not tolerate discrimination or harassment based on any of these characteristics. Matrix encourages applicants of all ages.


Requirements

Technical Skills:

Mandatory:

  • Strong understanding of Databricks, Apache Spark, and performance tuning techniques for distributed data processing systems.
  • Hands-on experience in Spark (PySpark/Scala) performance profiling, partitioning strategies, and job parallelization.
  • 2+ years of experience in performance testing and load simulation of data pipelines.
  • Solid skills in SQL, Snowflake, and analyzing performance via query plans and optimization hints.
  • Familiarity with Azure Databricks, Azure Monitor, Log Analytics, or similar observability tools.
  • Proficient in scripting (Python/Shell) for test automation and pipeline instrumentation.
  • Experience with DevOps tools such as Azure DevOps, GitHub Actions, or Jenkins for automated testing.
  • Comfortable working in Unix/Linux environments and writing shell scripts for monitoring and debugging.

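To make the partitioning point concrete: a common source of slow Spark stages is key skew, where one hot key sends most records to a single partition. The following purely illustrative sketch (plain Python, no Spark; all names are hypothetical) shows how hash partitioning distributes keys and how a simple skew ratio flags the imbalance a tester would otherwise spot in stage metrics:

```python
from collections import Counter


def partition_sizes(keys, num_partitions):
    """Assign each key to a partition by hash; return records per partition."""
    sizes = Counter(hash(k) % num_partitions for k in keys)
    return [sizes.get(i, 0) for i in range(num_partitions)]


def skew_ratio(sizes):
    """Largest partition divided by the mean size; ~1.0 means balanced."""
    mean = sum(sizes) / len(sizes)
    return max(sizes) / mean if mean else 0.0


# A skewed workload: one hot key dominates the dataset.
keys = ["hot"] * 900 + [f"k{i}" for i in range(100)]
sizes = partition_sizes(keys, 8)
print(f"partition sizes: {sizes}")
print(f"skew ratio: {skew_ratio(sizes):.2f}")
```

In Spark terms, a high skew ratio is the motivation for strategies such as salting the hot key or using adaptive query execution, and it shows up directly in per-task duration histograms.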
Good to Have:

  • Experience with job schedulers like Control-M, Autosys, or Azure Data Factory trigger flows.
  • Exposure to CI/CD integration for automated performance validation.
  • Understanding of network/storage I/O tuning parameters in cloud-based environments.

Behavioral Skills:

  • Proactive and detail-oriented, with a strong analytical mindset.
  • Excellent troubleshooting and root cause analysis capabilities.
  • Strong communication and collaboration skills with cross-functional teams.
  • Able to prioritize and manage multiple tasks effectively under tight deadlines.
  • Willingness to adapt, learn new tools, and take initiative.







Region: Asia/Pacific
Country: India
