Azure Databricks Engineer

Ciudad de México, CDMX, MX

Sequoia Connect

Discover global tech talent through our IT headhunting services, connecting companies with top talent in digital transformation and IT advisory.



Description

Our client represents the connected world, offering innovative and customer-centric information technology experiences, enabling Enterprises, Associates, and Society to Rise™.

They are a USD 6 billion company with 163,000+ professionals across 90 countries, serving 1,279 global customers, including Fortune 500 companies. They focus on leveraging next-generation technologies, including 5G, Blockchain, Metaverse, Quantum Computing, Cybersecurity, Artificial Intelligence, and more, to enable end-to-end digital transformation for global customers.

Our client is one of the fastest-growing brands and among the top 7 IT service providers globally. They have consistently emerged as a leader in sustainability and are recognized among the 2021 Global 100 Most Sustainable Corporations in the World by Corporate Knights.

We are currently searching for an Azure Databricks Engineer:

Responsibilities:

  • Design, implement, and optimize solutions on the Databricks platform.
  • Develop and maintain data pipelines using Delta Lake and Delta Live Tables.
  • Integrate Azure Purview with data cataloging and governance strategies.
  • Apply big data processing techniques using Apache Spark.
  • Collaborate with cross-functional teams to establish CI/CD pipelines and implement Infrastructure as Code with Terraform.

Requirements:

  • Proven expertise in the Databricks platform, including Unity Catalog and Delta Lake.
  • Proficiency in Python, SQL, and Scala.
  • Strong knowledge of Azure Purview and its integration with data cataloging and governance workflows.
  • Solid experience with Apache Spark and big data processing.
  • Familiarity with CI/CD pipelines and Terraform.
  • Databricks Certified Data Engineer Professional certification.
  • Experience with Azure cloud services.

Desired:

  • Knowledge of data mesh principles.
  • Experience in developing data quality frameworks.
  • Skills in data product lifecycle management.
  • Expertise in performance optimization and cost management.
  • Experience supporting MLOps and advanced analytics.

Languages

  • Advanced Oral English.
  • Native Spanish.

Note:

  • Fully remote

If you meet these qualifications and are pursuing new challenges, start your application to join an award-winning employer. Explore all our job openings | Sequoia Careers Page: https://www.sequoia-connect.com/careers/.



Category: Engineering Jobs

Tags: Azure Big Data Blockchain CI/CD Databricks Data pipelines Data quality Machine Learning Pipelines Python Scala Spark SQL Terraform

Regions: Remote/Anywhere, North America
Country: Mexico
