Senior Systems and Infrastructure Engineer
(USA) Bentonville Global Tech AR BENTONVILLE Home Office, United States
Walmart
What’s a career at Walmart or Sam’s Club like? To find out, explore our culture, our opportunities, and the difference you can make.

What you'll do...
Position: Senior Systems and Infrastructure Engineer
Job Location: 702 SW 8th Street, Bentonville, AR 72716
Duties:

Technology Solution Automation: employs automation tools and technologies to automate repetitive tasks and releases for a more efficient workflow.

Requirement and Scoping Analysis: utilizes traceability matrices, risk analysis methodologies, cost analysis, business objectives, classification of requirements, and user stories to explore relevant products/solutions from an existing repertoire that can address business/technical needs; assesses gaps/updates/modifications between customer/business expectations and existing products/solutions (for the iteration, in the case of agile methodology); classifies requirements into applicable types; anticipates solution risks/issues during the requirements-gathering phase, informs relevant stakeholders, and recommends corrective steps; contributes to the creation of user stories for components/applications (for agile methodology).

Infrastructure Maintenance: performs routine maintenance tasks for infrastructure systems, such as backups, patch management, and hot fixes, using infrastructure maintenance tools and methodologies, maintenance plans and schedules, and infrastructure performance metrics; escalates any issue that occurs in the backup media; audits desktops for compliance with IT policies; conducts regular database integrity checks to minimize data loss.

Coding: employs coding standards and guidelines; coding languages, frameworks, tools, and platforms; quality, safety, and security standards; emerging tools and technologies; and telemetry to create/configure minimal code for an entire component/application and to ensure components meet business/technical requirements, non-functional requirements, and maintainability, high-availability, and scalability needs; assists in the selection of appropriate languages, development standards, and tools for software coding/configuration; builds scripts to automate repetitive and routine tasks in CI/CD (Continuous Integration/Continuous Delivery), testing, or other processes (as applicable); independently implements telemetry features as required; ensures security policy requirements are properly applied to components/applications during code development/configuration.

Issue Resolution: analyzes and prioritizes issues under moderate supervision for projects of moderate complexity; independently identifies and elaborates feasible solutions to raised issues; evaluates all available options to resolve raised issues for projects of moderate complexity.

Capacity Management: adjusts IT resources to meet current and future business requirements in a cost-effective manner within a domain/pillar; optimizes IT resource utilization; creates a model of infrastructure performance to manage current resource needs; manages demand for computing resources; produces a capacity plan covering current and forecasted needs.

Infrastructure Design: uses software architecture, distributed systems, scalability, design patterns, disaster recovery, tech stacks, non-functional requirements, and security standards, frameworks, and methodologies (System Security Plan (SSP), Security Risk and Compliance Review (SRCR), etc.) to assist in the creation of simple, modular, extensible, and functional designs for products/solutions in adherence to requirements; evaluates trade-offs while designing across multiple components in a system based on business requirements; converts high-level designs (HLD) into detailed designs for specific modules/components of a product/system.

Cloud Migration: supports decommissioning on-prem hardware or converting it into a hybrid or complete cloud setup per business needs, including monitoring the cloud's resiliency and reliability; supports the planning and execution of data-center consolidations, relocations, and migrations.
Minimum education and experience required: Bachelor's degree or the equivalent in Computer Science, Computer Engineering, Information Systems, Information Technology, or related field plus 3 years of experience in technology infrastructure engineering across areas such as compute, storage, network, mobility or virtualization-related technologies or related experience; OR 5 years of experience in technology infrastructure engineering across areas such as compute, storage, network, mobility or virtualization-related technologies or related experience.
Skills required: Must have experience with: designing and implementing highly scalable, optimized, and distributed big data pipelines and ETLs using frameworks such as Hadoop, Hive, Spark, and Databricks; developing ETL pipelines using Scala, Python, Spark, and SQL; querying with SQL and Spark SQL for data analysis and validation; Spark shell scripting for automated data validation; documenting data pipelines and workflows in tools such as Confluence, Jira, and Azure DevOps; orchestrating and automating pipeline solutions using tools and frameworks such as Automic, AutoSys, Oozie, and Apache Airflow; writing SQL queries against RDBMS and data warehousing frameworks such as Apache Hive and Snowflake; developing and maintaining modules in cloud ecosystems such as Google Cloud, AWS, and Azure; working with Parquet, Avro, CSV, ORC, and JSON file formats; working with file-based storage systems, including HDFS, GCS, Azure Blob Storage, and S3; creating and automating dashboards using Power BI, Tableau, and Looker; and leading large-scale projects and providing guidance to team members. Employer will accept any amount of experience with the required skills.
Wal-Mart is an Equal Opportunity Employer.