Senior Data Engineer

Manila, Philippines

Globant Commerce Studio

Our digital commerce solutions transform customer experiences and unlock new digital sales growth with AI and next-gen omnichannel ecosystems.

Company Description

Globant’s Commerce Studio helps organizations create best-in-class commerce-enabled experiences, with engineering and design at its core. The goal is to meet the demands of tomorrow's customers, leveraging long-standing expertise with large and complex commerce transformations across the B2B, B2C, and D2C domains.

As an award-winning partner of enterprise-class platforms such as Salesforce Commerce Cloud and Adobe Magento Commerce, as well as API-first, headless commerce surround solutions, we help clients create a competitive advantage with commerce at its core.

Our mission is to empower companies to succeed and thrive in the ever-changing digital landscape by building best-in-class, future-ready digital commerce solutions globally.

Job Description

  • Design, build, and maintain scalable data pipelines using PySpark and Databricks (a brief illustrative sketch follows this list)
  • Optimize data processing and storage for maximum performance and efficiency
  • Troubleshoot and debug data-related issues, and implement solutions to prevent recurrence
  • Collaborate with data scientists, software engineers, and other stakeholders to ensure that data solutions are aligned with business goals
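
For context, here is a minimal sketch of the kind of PySpark pipeline on Databricks these responsibilities describe. The ADLS path, table names, and columns are hypothetical placeholders, not part of the role's actual stack.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

    # Read raw JSON landed in ADLS (placeholder path).
    raw = spark.read.json("abfss://raw@examplestorage.dfs.core.windows.net/orders/")

    # Light cleansing and typing before the write.
    cleaned = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("order_total") > 0)
    )

    # Persist as a partitioned Delta table for downstream consumers.
    (
        cleaned.write.format("delta")
               .mode("overwrite")
               .partitionBy("order_date")
               .saveAsTable("commerce.orders_daily")
    )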

Qualifications

  • Strong experience in Python programming, PySpark, and Spark SQL
  • Clear understanding of Spark data structures: RDD, DataFrame, and Dataset
  • Expertise in Databricks and Azure Data Lake Storage (ADLS)
  • Expertise handling data types, from dictionaries, lists, tuples, sets, and arrays to pandas DataFrames and Spark DataFrames
  • Expertise working with complex data types such as structs and JSON strings
  • Clear understanding of Spark broadcast, repartitioning, and Bloom filter indexes (see the sketch after this list)
  • Experience with ADLS optimization, partitioning, shuffling, and shrinking
  • Experience with disk caching (nice to have)
  • Experience with the cost-based optimizer (nice to have)
  • Experience with data modeling, data warehousing, data lakes, Delta Lake, and ETL/ELT processes in Azure Data Factory (ADF)
  • Strong analytical and problem-solving skills
  • Excellent documentation, communication, and collaboration skills
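
As a brief illustration of a few of the Spark concepts listed above (broadcast joins, parsing JSON strings into structs, and repartitioning), here is a hypothetical sketch; the table names, columns, and schema are assumptions for illustration only.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("spark-concepts-sketch").getOrCreate()

    orders = spark.table("commerce.orders_daily")      # large fact table (placeholder)
    countries = spark.table("reference.countries")     # small dimension (placeholder)

    # Broadcast the small dimension so the join avoids shuffling the large table.
    enriched = orders.join(F.broadcast(countries), on="country_code", how="left")

    # Parse a JSON string column into a struct, then extract a nested field.
    line_schema = "STRUCT<sku: STRING, qty: INT>"
    enriched = (
        enriched.withColumn("line_item", F.from_json("line_item_json", line_schema))
                .withColumn("sku", F.col("line_item.sku"))
    )

    # Repartition by the write key to keep output partitions evenly sized.
    (
        enriched.repartition("order_date")
                .write.format("delta")
                .mode("append")
                .saveAsTable("commerce.orders_enriched")
    )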

Additional Information

  • Work with professionals who have created some of the most revolutionary solutions in their fields.
  • Make an impact. Work on large-scale projects globally.
  • Develop your career in our Studios. Each Studio represents deep pockets of expertise on the latest technologies and trends and delivers tailored solutions focused on specific challenges.
  • Develop your career within an industry or multiple industries.
  • Work in the city you want, and be nourished by cultural exchanges.
  • Be empowered to choose your career path: we have more than 600 simultaneous projects, so you can choose where and how to work.
  • Be part of an agile pod. Driven by a culture of self-regulated teamwork, each team (or POD) works directly with our customers, following a full maturity path that evolves as they increase speed, quality, and autonomy.