Data Engineer - GCP
IN - TDC 1 (IN110), India
UPS
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.
Job Description:
JOB SUMMARY
This position develops batch and real-time data pipelines using various data analytics processing frameworks in support of Data Science and Machine Learning practices. It assists in integrating data from internal and external sources; performs extract, transform, load (ETL) data conversions; facilitates data cleansing and enrichment; and carries out full systems life cycle management activities such as analysis, technical requirements, design, coding, testing, and implementation of systems and applications software. The position also assists in synthesizing disparate data sources to create reusable and reproducible data assets, and supports the Data Science community with analytical model feature tuning.
RESPONSIBILITIES
• Contributes to data engineering projects and builds solutions by leveraging foundational knowledge of software/application development, programming languages used for statistical modeling and analysis, data warehousing and Cloud solutions, and building data pipelines.
• Collaborates effectively, produces data engineering documentation, gathers requirements, organizes data and defines the scope of a project.
• Performs data analysis and presents findings to stakeholders to support business needs.
• Participates in the integration of data for data engineering projects.
• Understands and utilizes analytic reporting tools and technologies.
• Assists with data engineering maintenance and support.
• Assists in defining the data interconnections between the organization’s operational and business functions.
• Assists in backup and recovery and utilizes technology solutions to perform proof-of-concept (POC) analysis.
QUALIFICATIONS
Requirements:
• Understanding of database systems and data warehousing solutions.
• Understanding of data life cycle stages: data collection, transformation, analysis, secure storage, and data accessibility.
• Understanding of the data environment to ensure that it can scale for the following demands: data throughput and increasing pipeline throughput, analysis of large amounts of data, real-time predictions, insights and customer feedback, data security, and data regulations and compliance.
• Contributes to the following: Building a data platform, ensuring data is secure in motion and at rest, automating data compliance and auditing, data warehousing solutions for scalable analytics.
• Familiarity with analytics reporting technologies and environments (e.g., Power BI, Looker, Qlik).
• Basic knowledge of algorithms and data structures to assist in understanding the big picture of the organization’s overall data function, along with knowledge of data filtering and data optimization.
• Familiarity with a Cloud services platform (e.g., GCP, Azure, or AWS) across all the data life cycle stages.
• Understanding of ETL tool capabilities; ability to pull data from various sources, transform it, and load it into a database or business intelligence platform.
• Familiarity with machine learning algorithms that help data scientists make predictions based on current and historical data.
• Knowledge of algorithms and data structures, with the ability to organize data for reporting, analytics, and data mining and to perform data filtering and data optimization.
• Ability to build data APIs to enable data scientists and business intelligence analysts to query the data.
• Ability to code in a programming language used for statistical analysis and modeling, such as Python, Java, Scala, or C++.
• Understanding of the basics of distributed systems.
• A Bachelor’s degree in MIS, mathematics, statistics, or computer science (or international equivalent), or equivalent job experience.
Employee Type:
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.