Intermediate Application Developer - Azure (Databricks, Data Factory, Data Lake Storage, Synapse Analytics)
IN - TDC 1 (IN110), India
UPS
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.
Job Description:
Job Summary
This position provides input and support for full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She performs tasks within planned durations and established deadlines. This position collaborates with teams to ensure effective communication and support the achievement of objectives. He/She provides knowledge, development, maintenance, and support for applications.
Responsibilities:
- Generates application documentation.
- Contributes to systems analysis and design.
- Designs and develops moderately complex applications.
- Contributes to integration builds.
- Contributes to maintenance and support.
- Monitors emerging technologies and products.
Technical Skills:
- Cloud Platforms: Azure (Databricks, Data Factory, Data Lake Storage, Synapse Analytics).
- Data Processing: Databricks (PySpark, Spark SQL), Apache Spark.
- Programming Languages: Python, SQL.
- Data Engineering Tools: Delta Lake, Azure Data Factory, Apache Airflow.
- Other: Git, CI/CD.
Professional Experience:
- Design and implement a scalable data lakehouse on Azure Databricks, optimizing data ingestion, processing, and analysis for improved business insights.
- Develop and maintain efficient data pipelines using PySpark and Spark SQL for extracting, transforming, and loading (ETL) data from diverse sources (Azure and GCP).
- Develop SQL stored procedures to enforce data integrity, ensuring data accuracy and consistency across all layers.
- Implement Delta Lake for ACID transactions and data versioning, ensuring data quality and reliability.
- Create frameworks using Databricks and Data Factory to process incremental data for external vendors and applications.
- Implement Azure functions to trigger and manage data processing workflows.
- Design and implement data pipelines to integrate various data sources and manage Databricks workflows for efficient data processing.
- Conduct performance tuning and optimization of data processing workflows.
- Provide technical support and troubleshooting for data processing issues.
- Migrate legacy data infrastructure to Azure Databricks, improving scalability and reducing costs.
- Collaborate with data scientists and analysts to build interactive dashboards and visualizations on Databricks for data exploration and analysis.
- Effective oral and written management communication skills.
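The incremental-processing pattern named in the experience list above (extract only rows newer than a stored watermark, then apply an idempotent merge into the target, as a Delta Lake MERGE would) can be sketched independently of any one engine. The snippet below is a minimal illustration in plain Python, not Databricks code: the `extract_incremental` and `merge_upsert` names, and the dictionary standing in for a Delta table, are hypothetical simplifications.

```python
from datetime import datetime

def extract_incremental(source_rows, last_watermark):
    """Return only rows modified after the stored watermark (hypothetical helper)."""
    return [r for r in source_rows if r["modified_at"] > last_watermark]

def merge_upsert(target, rows, key="id"):
    """Idempotent upsert on a business key, mirroring what a MERGE statement does."""
    for r in rows:
        target[r[key]] = r  # update if the key exists, insert otherwise
    return target

# Simulated source system and target table
source = [
    {"id": 1, "value": "a", "modified_at": datetime(2024, 1, 1)},
    {"id": 2, "value": "b", "modified_at": datetime(2024, 1, 3)},
]
target = {1: {"id": 1, "value": "old", "modified_at": datetime(2023, 12, 1)}}

changed = extract_incremental(source, last_watermark=datetime(2024, 1, 2))
merge_upsert(target, changed)
# Advance the watermark so the next run skips already-processed rows
new_watermark = max(r["modified_at"] for r in source)
```

In a real pipeline the watermark would be persisted (e.g., in a control table) and the upsert expressed as a Delta Lake MERGE keyed on the same business key, so that re-running a failed batch cannot duplicate rows.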
Qualifications:
- Minimum 5 years of relevant experience
- Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field
Employee Type:
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.