Senior Data Engineer

(USA) Global People Center (GPC) AR Bentonville Home Office, United States

Walmart

What’s a career at Walmart or Sam’s Club like? To find out, explore our culture, our opportunities and the difference you can make.


What you'll do...

Position: Senior Data Engineer 

 

Job Location: 702 SW 8th Street, Bentonville, AR 72716 

 

Duties:

Data Strategy: Understands, articulates, and applies principles of the defined strategy to routine business problems that involve a single function.

Data Transformation and Integration: Extracts data from identified databases. Creates data pipelines and transforms data to a structure that is relevant to the problem by selecting appropriate techniques. Develops knowledge of current data science and analytics trends.

Data Source Identification: Supports the understanding of the priority order of requirements and service level agreements. Helps identify the most suitable source for data that is fit for purpose. Performs initial data quality checks on extracted data.

Data Modeling: Analyzes complex data elements, systems, data flows, dependencies, and relationships to contribute to conceptual, physical, and logical data models. Develops the Logical Data Model and Physical Data Models, including data warehouse and data mart designs. Defines relational tables, primary and foreign keys, and stored procedures to create a data model structure (an illustrative sketch follows this list). Evaluates existing data models and physical databases for variances and discrepancies. Develops efficient data flows. Analyzes data-related system integration challenges and proposes appropriate solutions. Creates training documentation and trains end users on data modeling. Oversees the tasks of less experienced programmers and provides system troubleshooting support.

Code Development and Testing: Writes code to develop the required solution and application features by determining the appropriate programming language and leveraging business, technical, and data requirements. Creates test cases to review and validate the proposed solution design. Creates proofs of concept. Tests the code using the appropriate testing approach. Deploys software to production servers. Contributes code documentation, maintains playbooks, and provides timely progress updates.

Problem Formulation: Translates business problems within one's discipline to data-related or mathematical solutions. Identifies which methods (for example, analytics, big data analytics, automation) would provide a solution for the problem. Shares use cases and gives examples to demonstrate how the method would solve the business problem.

Applied Business Acumen: Provides recommendations to business stakeholders to solve complex business issues. Develops business cases for projects with a projected return on investment or cost savings. Translates business requirements into projects, activities, and tasks and aligns them to the overall business strategy. Serves as an interpreter and conduit to connect business needs with tangible solutions and results. Recommends new processes and ways of working.

Data Governance: Establishes, modifies, and documents data governance projects and recommendations. Implements data governance practices in partnership with business stakeholders and peers. Interprets company and regulatory policies on data. Educates others on data governance processes, practices, policies, and guidelines. Provides recommendations on needed updates or inputs into data governance policies, practices, or guidelines.
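
For illustration only, here is a minimal sketch of the kind of relational structure the Data Modeling duty describes, expressed as Spark SQL DDL driven from Scala. The database, table, and column names (mart, dim_store, fact_sales, and so on) are hypothetical assumptions rather than Walmart's actual data model, and the primary/foreign key relationship is logical only, since Spark does not enforce key constraints.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical sketch: define a dimension table and a partitioned fact table
// for a small sales data mart using Spark SQL DDL.
object SalesMartDdl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sales-mart-ddl")
      .getOrCreate()

    spark.sql("CREATE DATABASE IF NOT EXISTS mart")

    // Dimension table; store_id acts as the logical primary key.
    spark.sql("""
      CREATE TABLE IF NOT EXISTS mart.dim_store (
        store_id   INT,
        store_name STRING,
        region     STRING
      ) USING parquet
    """)

    // Fact table; store_id is a logical foreign key to mart.dim_store.
    // Spark does not enforce key constraints, so they are kept by convention.
    spark.sql("""
      CREATE TABLE IF NOT EXISTS mart.fact_sales (
        sale_id   BIGINT,
        store_id  INT,
        sale_date DATE,
        amount    DECIMAL(12, 2)
      ) USING parquet
      PARTITIONED BY (sale_date)
    """)

    spark.stop()
  }
}
```

Partitioning the fact table by sale_date keeps daily loads and downstream queries pruned to the relevant partitions, a common choice for data warehouse and data mart fact tables.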

 

Minimum education and experience required: Master’s degree or the equivalent in Computer Science, Information Technology, Engineering, or a related field plus 1 year of experience in software engineering or related experience OR Bachelor's degree or the equivalent in Computer Science, Information Technology, Engineering, or a related field plus 3 years of experience in software engineering or related experience OR 5 years of experience in software engineering or related experience. 

 

Skills required: Must have experience with:

Coding in Scala using the Spark framework to develop ETL pipelines (a minimal sketch follows this list);
Coding in Python to develop Astronomer DAGs;
Continuous Integration and Continuous Delivery (CI/CD) using Jules/Jenkins/Looper, and designing data models using Erwin;
Developing big data ecosystems (Hadoop, HDFS) to store and process ETL pipelines;
Writing complex database queries in SQL on Oracle, Postgres, Redshift, and Hive databases;
Scheduling and monitoring ETL batch jobs using job schedulers, and maintaining the codebase in version control management tools like Git/Bitbucket;
Developing, optimizing, and delivering enterprise-level data pipelines, data lake frameworks, and analytical solutions using private and public clouds (Google/AWS/Azure);
Infrastructure design using high availability, resiliency, and disaster recovery procedures for critical applications;
Estimating cost, sizing, and implementation plans for ETL projects, and monitoring MPP architecture project implementation cost;
Preparing roadmaps and goals in Jira across teams to meet business objectives; and
Data migration from source systems like IBM DB2, transforming the data as required using ETL tools (Python, Scala, stored procedures, Dataproc), and loading it into a data lake.

Employer will accept any amount of experience with the required skills.
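
For illustration only, here is a minimal sketch of a Scala/Spark ETL step of the kind named in the first skill item: extract from a source table, apply a simple transformation, and load the result into a data lake path. The table name, columns, and output path are hypothetical assumptions, not details taken from the posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Hypothetical sketch of an extract-transform-load step with Spark in Scala.
object OrdersEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-etl")
      .getOrCreate()

    // Extract: read a raw source table (hypothetical name).
    val raw = spark.table("staging.orders_raw")

    // Transform: drop rows without an ID, derive a date column, de-duplicate.
    val cleaned = raw
      .filter(col("order_id").isNotNull)
      .withColumn("order_date", to_date(col("order_ts")))
      .dropDuplicates("order_id")

    // Load: write partitioned Parquet to a data lake location (hypothetical path).
    cleaned.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3a://example-data-lake/curated/orders")

    spark.stop()
  }
}
```

In practice a job like this would be packaged and submitted with spark-submit, scheduled by an orchestrator such as an Astronomer-managed Airflow DAG, and kept under version control in Git/Bitbucket, as the skills list describes.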

 

Wal-Mart is an Equal Opportunity Employer. 


Category: Engineering Jobs

Tags: Architecture AWS Azure Big Data Bitbucket CI/CD Computer Science Data Analytics Data governance Data pipelines Dataproc Data quality Data strategy Data warehouse DB2 Engineering ETL Git Hadoop HDFS Jenkins Jira MPP Oracle Pipelines PostgreSQL Python Redshift Scala Spark SQL Testing

Region: North America
Country: United States
