Data Engineer III

(USA) AR BENTONVILLE Home Office ISD Office - DGTC, United States

Walmart

What’s a career at Walmart or Sam’s Club like? To find out, explore our culture, our opportunities and the difference you can make.



What you'll do...

Position: Data Engineer III

Job Location: 702 SW 8th Street, Bentonville, AR 72716

Duties:

- Problem Formulation: Identifies possible options for addressing business problems within one's discipline through analytics, big data analytics, and automation.
- Applied Business Acumen: Supports the development of business cases and recommendations. Owns delivery of project activities and tasks assigned by others. Supports process updates and changes. Solves business issues.
- Data Governance: Supports the documentation of data governance processes and the implementation of data governance practices.
- Data Strategy: Understands, articulates, and applies principles of the defined strategy to routine business problems involving a single function.
- Data Transformation and Integration: Extracts data from identified databases. Creates data pipelines and transforms data into a structure relevant to the problem by selecting appropriate techniques. Develops knowledge of current data science and analytics trends.
- Data Source Identification: Supports the understanding of the priority order of requirements and service-level agreements. Helps identify the most suitable, fit-for-purpose data sources. Performs initial quality checks on extracted data.
- Data Modeling: Analyzes complex data elements, systems, data flows, dependencies, and relationships to contribute to conceptual, logical, and physical data models. Develops logical and physical data models, including data warehouse and data mart designs. Defines relational tables, primary and foreign keys, and stored procedures to create the data model structure. Evaluates existing data models and physical databases for variances and discrepancies. Develops efficient data flows. Analyzes data-related system integration challenges and proposes appropriate solutions. Creates training documentation and trains end users on data modeling. Oversees the tasks of less experienced programmers and provides system troubleshooting support.
- Code Development and Testing: Writes code to develop the required solution and application features, determining the appropriate programming language and leveraging business, technical, and data requirements. Creates test cases to review and validate the proposed solution design. Creates proofs of concept. Tests the code using the appropriate testing approach. Deploys software to production servers. Contributes code documentation, maintains playbooks, and provides timely progress updates.

Minimum education and experience required: Bachelor’s degree or the equivalent in Computer Science or a related field plus 2 years of experience in software engineering or a related field; OR Master’s degree or the equivalent in Computer Science or a related field; OR 4 years of experience in software engineering or a related field.

Skills required: Must have experience with:

- designing and performing transformations (cleaning and filtering) on imported data using Hive, MapReduce, and Spark, and loading the final data into HDFS;
- using Spark SQL and DataFrames to develop functional programming code involving complex transformations that use Spark's in-memory computing capabilities;
- converting Hive queries into Spark actions and transformations by creating RDDs from the required files in HDFS;
- HDFS, Hadoop architecture, MapReduce concepts, and big data ingestion frameworks for loading data onto the cloud;
- designing and implementing an Acceptance Test Driven Development automation framework with Java Spring Boot libraries;
- Agile and Lean methodologies, and JIRA;
- writing data movement scripts, procedures, and triggers between Oracle and Teradata and publishing to end users with SQL;
- optimizing load and query performance for ETL jobs by tuning the SQL used in transformations and fine-tuning the database;
- handling scenarios such as file watchers and automated data quality validations with shell scripts;
- designing and developing UNIX scripts for creating and dropping tables and for splitting a file into Header, Detail, and Trailer files; and
- coding in functional programming using Scala, with code coverage tests using JUnit and other Spark/Scala libraries.

Employer will accept any amount of experience with the required skills.
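As a rough illustration of the UNIX file-splitting skill described above (the file name, record layout, and delimiters here are hypothetical assumptions, not taken from the posting), a feed file whose first line is a header record and whose last line is a trailer record can be split into Header, Detail, and Trailer files like this:

```shell
#!/bin/sh
# Hypothetical sketch: split a delimited feed file into Header, Detail,
# and Trailer files. Names and layout are illustrative assumptions.

INPUT=feed.dat

# Create a small sample feed file for the demonstration:
# one header (H), two detail records (D), one trailer (T) with a record count.
printf 'H|20240101|DAILY_FEED\nD|1001|widget|9.99\nD|1002|gadget|4.50\nT|2\n' > "$INPUT"

head -n 1 "$INPUT" > header.dat     # first line          -> Header file
tail -n 1 "$INPUT" > trailer.dat    # last line           -> Trailer file
sed '1d;$d' "$INPUT" > detail.dat   # everything between  -> Detail file

# Sanity check: the detail record count should match the trailer's count field.
wc -l < detail.dat
```

In practice a script like this would also validate that the detail count matches the trailer's count field and fail the load otherwise, which is the kind of automated data quality check the posting mentions.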

#LI-DNP #LI-DNI

Wal-Mart is an Equal Opportunity Employer.




Region: North America
Country: United States
