Data Engineer - Data Analytics
Issaquah, WA, US
Full Time Senior-level / Expert USD 130K - 160K
Costco Wholesale
Costco IT is responsible for the technical future of Costco Wholesale, the third largest retailer in the world with wholesale operations in fourteen countries. Despite our size and explosive international expansion, we continue to provide a family, employee-centric atmosphere in which our employees thrive and succeed. As proof, Costco ranks eighth on Forbes’ “World’s Best Employers” list.
This is an environment unlike anything in the high-tech world and the secret of Costco’s success is its culture. The value Costco puts on its employees is well documented in articles from a variety of publishers including Bloomberg and Forbes. Our employees and our members come FIRST. Costco is well known for its generosity and community service and has won many awards for its philanthropy. The company joins with its employees to take an active role in volunteering by sponsoring many opportunities to help others.
Come join the Costco Wholesale IT family. Costco IT is a dynamic, fast-paced environment, working through exciting transformation efforts. We are building the next generation retail environment where you will be surrounded by dedicated and highly professional employees.
Data Engineers are responsible for developing and operationalizing data pipelines/integrations to make data available for consumption (i.e. Reporting, Data Science/Machine Learning, Data APIs, etc.). This includes data ingestion, data transformation, data validation/quality, data pipeline optimization, and orchestration, as well as deploying code to production via CI/CD.
The Data Engineer role requires knowledge of software development/programming methodologies, various data sources (Relational Databases, flat files (csv, delimited), APIs, XML, JSON, etc.), and data access (SQL, Python, etc.), as well as expertise in data modeling, cloud architectures/platforms, data warehousing, and data lakes. This role will also partner closely with Product Owners, Data Architects, Platform/DevOps Engineers, etc. to design, build, test, implement, and maintain data pipelines.
The Data Engineer - Data Analytics is responsible for end-to-end data pipelines that power analytics and data services. This role is focused on data engineering to build and deliver automated data pipelines from a wide range of internal and external data sources. The Data Engineer will partner with Product Owners, Engineering, and Data Platform teams to design, build, test, and automate data pipelines that are relied upon across the company as the single source of truth.
If you want to be a part of one of the BEST companies in the world to work for, simply apply and let your career be reimagined.
ROLE
● Develops complex SQL queries and Python code against a variety of data sources.
● Implements streaming data pipelines using event/message-based architectures.
● Demonstrates ability to communicate technical concepts to non-technical audiences both in written and verbal form.
● Works in tandem with Data Architects to align on data architecture requirements provided by the requestor.
● Defines and maintains optimal data pipeline architecture.
● Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery/orchestration.
● Demonstrates strong understanding of coding and programming concepts to build data pipelines (e.g. data transformation, data quality, data integration, etc.).
● Analyzes data to spot anomalies and trends, and correlates data to ensure data quality.
● Develops data pipelines to store data in defined data models/structures.
● Demonstrates strong understanding of data integration techniques and tools, e.g. Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT).
● Demonstrates strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing).
● Performs peer reviews of other Data Engineers’ work.
● Assembles large, complex data sets to meet business requirements.
● Implements big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights to the organization.
● Supports development of Data Dictionaries and Data Taxonomy for product solutions.
● Builds data models with Data Architect and develops data pipelines to store data in defined data models and structures.
● Conducts ad-hoc data retrieval for business reports and dashboards.
● Assesses the integrity of data from multiple sources.
● Manages database configuration, including installing and upgrading software and maintaining relevant documentation.
● Monitors database activity and resource usage.
● Develops and operationalizes data pipelines to make data available for consumption (BI, Advanced analytics, Services).
● Works in tandem with data architects and data/BI engineers to design data pipelines and recommends ongoing optimization of data storage, data ingestion, data quality, and orchestration.
● Designs, develops, and implements ETL/ELT processes using IICS (Informatica Intelligent Cloud Services).
● Uses Azure services, such as Azure SQL DW (Synapse), ADLS, Azure Event Hub, and Azure Data Factory, to improve and speed up delivery of our data products and services.
REQUIRED
● 5+ years’ experience engineering and operationalizing data pipelines with large and complex datasets.
● 5+ years of hands-on experience with Informatica PowerCenter.
● 5+ years’ experience with Data Modeling, ETL, and Data Warehousing.
● 2+ years of hands-on experience with Informatica IICS.
● 3+ years’ experience working with Cloud technologies, such as ADLS, Azure Data Factory, Azure Databricks, Delta Live Tables, Spark, Azure Synapse, Cosmos DB, and other big data technologies.
● Extensive experience working with various data sources (SQL, Oracle database, flat files (csv, delimited), Web API, XML).
● Advanced SQL skills. Solid understanding of relational databases and business data; ability to write complex SQL queries against a variety of data sources.
● Strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing).
● Able to work in a fast-paced agile development environment.
● Scheduling flexibility to meet the needs of the business including weekends, holidays, and 24/7 on call responsibilities on a rotational basis.
RECOMMENDED
● BA/BS in Computer Science, Engineering, or equivalent software/services experience.
● Azure Certifications.
● Experience implementing data integration techniques, such as event/message-based integration (Kafka, Azure Event Hub), ETL.
● Experience with Git / Azure DevOps.
● Experience delivering data solutions through agile software development methodologies.
● Exposure to the retail industry.
● Excellent verbal and written communication skills.
● Experience working with SAP integration tools including BODS.
● Experience with UC4 Job Scheduler.
● Proficient in Google Workspace applications, including Sheets, Docs, Slides, and Gmail.
Required Documents
● Cover Letter
● Resume
California applicants, please click here to review the Costco Applicant Privacy Notice.
Pay Range: $130,000 - $160,000
We offer a comprehensive package of benefits including paid time off, health benefits - medical/dental/vision/hearing aid/pharmacy/behavioral health/employee assistance, a health care reimbursement account, a dependent care assistance plan, short-term and long-term disability insurance, AD&D insurance, life insurance, a 401(k), and a stock purchase plan to eligible employees.
Costco is committed to a diverse and inclusive workplace. Costco is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or any other legally protected status. If you need assistance and/or a reasonable accommodation due to a disability during the application or the recruiting process, please send a request to IT-Recruiting@costco.com.
If hired, you will be required to provide proof of authorization to work in the United States. In some cases, applicants and employees for selected positions will not be sponsored for work authorization, including, but not limited to, H-1B visas.
Tags: Agile APIs Architecture Azure Big Data CI/CD Computer Science Cosmos DB CSV Data Analytics Databricks Data pipelines Data quality Data Warehousing DevOps ELT Engineering ETL Git Informatica JSON Kafka Machine Learning NoSQL Oracle Pipelines Privacy Python RDBMS Spark SQL Streaming XML
Perks/benefits: Career development Equity / stock options Health care Insurance