Senior Data Engineer

MLK Building, Bentonville, AR, United States (Home Office)

Walmart

What’s a career at Walmart or Sam’s Club like? To find out, explore our culture, our opportunities and the difference you can make.

What you'll do...

Position: Senior Data Engineer

Job Location: 702 SW 8th Street, Bentonville, AR 72716

Duties:

Data Strategy: Understands, articulates, and applies principles of the defined strategy to routine business problems that involve a single function.

Data Source Identification: Supports the understanding of the priority order of requirements and service level agreements. Helps identify the most suitable source for data that is fit for purpose. Performs initial data quality checks on extracted data.

Data Transformation and Integration: Extracts data from identified databases. Creates data pipelines and transforms data to a structure that is relevant to the problem by selecting appropriate techniques (a minimal pipeline sketch follows this duties list). Develops knowledge of current data science and analytics trends.

Tech Problem Formulation: Translates and co-owns business problems within one's discipline to data-related or mathematical solutions. Identifies appropriate methods and tools to be leveraged to provide a solution for the problem. Shares use cases and gives examples to demonstrate how the method would solve the business problem.

Understanding Business Context: Provides recommendations to business stakeholders to solve complex business issues. Develops business cases for projects with a projected return on investment or cost savings. Translates business requirements into projects, activities, and tasks, aligns them to the overall business strategy, and develops domain-specific artifacts. Serves as an interpreter and conduit to connect business needs with tangible solutions and results. Identifies and recommends relevant business insights pertaining to the area of work.

Data Modeling: Analyzes complex data elements, systems, data flows, dependencies, and relationships to contribute to conceptual, physical, and logical data models. Develops logical and physical data models, including data warehouse and data mart designs. Defines relational tables, primary and foreign keys, and stored procedures to create a data model structure. Evaluates existing data models and physical databases for variances and discrepancies. Develops efficient data flows. Analyzes data-related system integration challenges and proposes appropriate solutions. Creates training documentation and trains end users on data modeling. Oversees the tasks of less experienced programmers and provides system troubleshooting support.

Code Development and Testing: Writes code to develop the required solution and application features by determining the appropriate programming language and leveraging business, technical, and data requirements. Creates test cases to review and validate the proposed solution design. Creates proofs of concept. Tests the code using the appropriate testing approach. Deploys software to production servers. Contributes code documentation, maintains playbooks, and provides timely progress updates.

Data Governance: Establishes, modifies, and documents data governance projects and recommendations. Implements data governance practices in partnership with business stakeholders and peers. Interprets company and regulatory policies on data. Educates others on data governance processes, practices, policies, and guidelines.
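
The transformation-and-integration duty above typically amounts to an extract, transform, and load step of the kind sketched below. This is a minimal, hypothetical PySpark example only; the table and column names (staging.sales_raw, mart.sales_daily, order_id, and so on) are illustrative assumptions, not part of the posting.

```python
# Minimal, illustrative PySpark ETL step (hypothetical table/column names).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales_daily_etl").getOrCreate()

# Extract: read from an identified source table.
raw = spark.read.table("staging.sales_raw")

# Initial data quality check: fail fast if key columns contain nulls.
null_keys = raw.filter(F.col("order_id").isNull() | F.col("order_date").isNull()).count()
if null_keys > 0:
    raise ValueError(f"{null_keys} rows have null keys; aborting load")

# Transform: restructure the data to fit the downstream problem.
daily = (
    raw.withColumn("order_date", F.to_date("order_date"))
       .groupBy("store_id", "order_date")
       .agg(F.sum("amount").alias("daily_sales"),
            F.countDistinct("order_id").alias("order_count"))
)

# Load: write a partitioned table for consumption by analytics users.
daily.write.mode("overwrite").partitionBy("order_date").saveAsTable("mart.sales_daily")
```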

Minimum education and experience required: Bachelor’s degree or the equivalent in Computer Science or a related field plus 3 years of experience in software engineering or related experience; OR Master’s degree or the equivalent in Computer Science or a related field plus 1 year of experience in software engineering or related experience.

Skills required: Must have experience with: designing and implementing highly scalable, optimized, distributed big data pipelines/ETLs using frameworks such as Hadoop, Hive, and Spark/Databricks; following industry-standard coding, design, and development practices to implement applications and reusable components using object-oriented programming languages such as Scala, Python, and Java; writing code using design principles and algorithms to solve complex problems; cloud platforms/services such as Microsoft Azure, Google Cloud Platform (GCP), or Amazon Web Services (AWS) for implementations; orchestrating and automating implemented pipeline solutions using tools/frameworks such as Automic, Oozie, Conductor, or Apache Airflow; setting up CI/CD (Continuous Integration/Continuous Deployment) processes by leveraging Jenkins/Looper; data modeling of tables created on RDBMS and Apache Hive systems for better consumption; communicating technical solutions and ideas to both technical and non-technical team members, implementing ideas, and taking them to production scale; using shell scripting to automate and streamline manual processes and build small ETLs/frameworks; using version control systems such as Git and GitHub for collaborative code management, tracking changes, and code review, ensuring effective teamwork and codebase integrity; and designing and building data lakes, distributed system architectures, and distributed computing solutions. Employer will accept any amount of experience with the required skills.
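
The orchestration requirement above (Automic, Oozie, Conductor, or Apache Airflow) generally means declaring pipeline steps and their dependencies in a scheduler. The sketch below, assuming Apache Airflow 2.x, shows one hypothetical way to wire extract, transform, and load steps into a daily schedule; the DAG id, task names, and spark-submit commands are illustrative assumptions, not Walmart specifics.

```python
# Minimal, illustrative Airflow 2.x DAG (hypothetical DAG/task/command names).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="sales_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="spark-submit extract.py")
    transform = BashOperator(task_id="transform", bash_command="spark-submit transform.py")
    load = BashOperator(task_id="load", bash_command="spark-submit load.py")

    # Run the steps sequentially: extract, then transform, then load.
    extract >> transform >> load
```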

Wal-Mart is an Equal Opportunity Employer.

Category: Engineering Jobs

Tags: Airflow Architecture AWS Azure Big Data CI/CD Computer Science Databricks Data governance Data pipelines Data quality Data strategy Data warehouse Distributed Systems Engineering ETL GCP Git GitHub Google Cloud Hadoop Java Jenkins Oozie Pipelines Python RDBMS Scala Shell scripting Spark Testing

Perks/benefits: Team events

Region: North America
Country: United States
