Data Engineer
California
Full Time Mid-level / Intermediate USD 103K - 143K
CDC Foundation
The CDC Foundation is a global nonprofit, managing public health programs that impact chronic and infectious diseases and emergency threats like COVID-19.

Job Highlights
- Location: Remote; must be based in the United States.
- Work Hours: 8am-5pm Pacific time is preferred, but working hours are flexible.
- Salary Range: $103,500-$143,500 per year, plus benefits. Individual salary offers will be based on experience and qualifications unique to each candidate.
- Position Type: Grant funded, limited-term opportunity.
- Position End Date: June 30, 2025.
Overview
The Data Engineer will play a crucial role in advancing the CDC Foundation's mission by designing, building, and maintaining data infrastructure for a public health organization. This role is aligned to the Workforce Acceleration Initiative (WAI). WAI is a federally funded CDC Foundation program with the goal of helping the nation's public health agencies by providing them with the technology and data experts they need to accelerate their information system improvements.
Working within the Science Branch in the Santa Clara County Public Health Department (SCC PHD), the Data Engineer will deliver the architecture needed for data generation, storage, processing, and analysis. The Data Engineer will collaborate with data content experts, analysts, epidemiologists, data scientists, data modelers, warehouse architects, IT staff, and other organizational staff to design and implement proposed solutions and architectures that meet the needs of the public health agency.
The Data Engineer will be hired by the CDC Foundation and assigned to the Science Branch, SCC PHD. This position is eligible for a fully remote work arrangement for U.S. based candidates.
Responsibilities
- Create and manage the systems and pipelines that enable efficient and reliable flow of data, including ingestion, processing, and storage.
- Collect data from various sources, transform and clean it to ensure accuracy and consistency, and load it into storage systems and/or data warehouses.
- Optimize data pipelines, infrastructure, and workflows for performance and scalability.
- Monitor data pipelines and systems for performance issues, errors, and anomalies, and implement solutions to address them.
- Implement security measures to protect sensitive information.
- Collaborate with data scientists, analysts, and other partners to understand their data needs and requirements, and to ensure that the data infrastructure supports the organization's goals and objectives.
- Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
- Implement and maintain ETL processes to ensure the accuracy, completeness, and consistency of data (see the sketch following this list).
- Design and manage data storage systems, including relational databases, NoSQL databases, and data warehouses.
- Stay current on industry trends, best practices, and emerging technologies in data engineering, and incorporate them into the organization's data infrastructure.
- Provide technical guidance to other staff.
- Communicate effectively with partners at all levels of the organization to gather requirements, provide updates, and present findings.
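To illustrate the ingest, transform, and load pattern referenced in the list above, here is a minimal sketch assuming a CSV case-report extract, pandas, SQLAlchemy, and a PostgreSQL warehouse; the file, table, column, and connection names are hypothetical and not part of the SCC PHD environment.

```python
# Minimal ETL sketch: ingest a CSV extract, clean it, and append it to a
# warehouse table. All names below are illustrative assumptions.
import pandas as pd
from sqlalchemy import create_engine


def extract(path: str) -> pd.DataFrame:
    """Read a raw case-report extract from disk."""
    return pd.read_csv(path, parse_dates=["report_date"])


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Standardize column names, drop duplicates, and enforce basic validity."""
    df = df.rename(columns=str.lower).drop_duplicates()
    df = df.dropna(subset=["case_id"])                    # require a case identifier
    df["county"] = df["county"].str.strip().str.title()   # normalize categorical text
    return df


def load(df: pd.DataFrame, table: str, conn_uri: str) -> None:
    """Append the cleaned frame to a warehouse table."""
    engine = create_engine(conn_uri)
    df.to_sql(table, engine, if_exists="append", index=False)


if __name__ == "__main__":
    frame = transform(extract("daily_cases.csv"))
    load(frame, table="stg_daily_cases", conn_uri="postgresql://user:pass@host/db")
```

In practice, logic like this would run inside an orchestrated, monitored framework on a schedule rather than as a one-off script, consistent with the automation expectations described in the Qualifications below.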
Qualifications
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- Minimum of 5 years of relevant professional experience.
- Proficiency in programming languages commonly used in data engineering, such as Python, Java, Scala, or SQL. The candidate should be able to implement data automations within existing frameworks rather than writing one-off scripts.
- Experience with big data technologies and frameworks like Hadoop, Spark, Kafka, and Flink.
- Strong understanding of database systems, including relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with engineering best practices such as source control, automated testing, continuous integration and deployment, and peer review.
- Knowledge of data warehousing concepts and tools.
- Experience with cloud computing platforms.
- Expertise in data modeling, ETL (Extract, Transform, Load) processes, and data integration techniques.
- Familiarity with agile development methodologies, software design patterns, and best practices.
- Strong analytical thinking and problem-solving abilities.
- Excellent verbal and written communication skills, including the ability to convey technical concepts to non-technical partners effectively.
- Flexibility to adapt to evolving project requirements and priorities.
- Outstanding interpersonal and teamwork skills, and the ability to develop productive working relationships with colleagues and partners.
- Experience working in a virtual environment with remote partners and teams.
- Proficiency in Microsoft Office.
All qualified applicants will receive consideration for employment and will not be discriminated against on the basis of race, color, religion, sex, national origin, age, mental or physical disabilities, veteran status, and all other characteristics protected by law.
We comply with all applicable laws including E.O. 11246 and the Vietnam Era Readjustment Assistance Act of 1974 governing employment practices and do not discriminate on the basis of any unlawful criteria in accordance with 41 C.F.R. §§ 60-300.5(a)(12) and 60-741.5(a)(7). As a federal government contractor, we take affirmative action on behalf of protected veterans.
The CDC Foundation is a smoke-free environment. Relocation expenses are not included.