Data Engineer (Remote)
Nationwide Remote Office (US99)
ICF
We make big things possible for our clients. We provide the data, insights, and deep implementation expertise they need to deliver results that matter. ICF is growing, and we are looking for a talented, highly motivated Data Engineer to join ICF’s Digital Modernization and Experience (DMX) Division. The Data Engineer will support our projects by collaborating with a cross-functional team to deliver quality, innovative, and highly scalable applications to end clients. The ideal candidate will have prior experience working with cross-disciplinary teams in the government space.
You are a team player who can embrace ambiguity, translating client requirements through an agile delivery framework into technical solutions that achieve mission value and measurable outcomes. You are a strategic problem-solver with a willingness to learn and grow with our team and be part of our client’s digital transformation journey.
At ICF, we implement smart solutions to drive digital transformation at scale and speed, relentless in making sure it pays off for our clients and committed to helping them achieve mission outcomes. Join our community of mission-driven project managers, data experts, strategists, and technologists to build agile solutions that meet our clients’ changing needs.
What will you be doing?
Lead and develop data engineering design and build efforts for our client initiative, working as part of an extended system development team on project execution
Develop extract, transform, and load (ETL) processing routines and data engineering pipelines, creating the data structures and data models needed to support data at all stages
Perform extensive data profiling and analysis based on client data
Work with UI, UX, and business analysis team members and the client to define data and reporting requirements
Design and implement custom data analytics and BI/reporting products, custom reports, and data visualization products
Work closely with the client to understand and address their needs, ensuring seamless communication and delivery
Write, optimize, and maintain complex SQL queries for querying large datasets efficiently
Quickly learn and navigate databases to understand structure, optimize performance, and ensure data quality
Develop and maintain SAS scripts and stored processes, ensuring their accuracy and performance
Work with cloud-based platforms such as Snowflake and AWS (e.g., S3, Lambda) to integrate, process, and manage data
Troubleshoot and resolve data-related issues, optimizing processes for speed and efficiency
Provide insights and recommendations on improving the performance and architecture of data systems
Document technical processes, queries, and database configurations for internal and client use
What you must have:
US Citizenship or Green Card status (required by the federal government for this position)
Bachelor’s degree (e.g., Computer Science, Engineering or related discipline)
2+ years of hands-on experience with AWS and Big Data technologies such as S3, Lambda, Glue, Athena, Python, RStudio, SageMaker, EventBridge, Lake Formation, Redshift, RDS, and ECS
3+ years of experience developing ETL and data engineering pipelines using AWS services and Python
3+ years of experience writing complex SQL and performance-tuning queries
3+ years of hands-on experience with programming languages such as Python, SQL, and SAS
2+ years of experience with AWS Redshift or Snowflake
What we’d like you to know:
Demonstrated experience showing strong critical thinking and problem-solving skills paired with a desire to take initiative.
Experience building CI/CD pipelines with tools such as Jenkins.
Experience with Agile development processes.
Strong proficiency in SQL, including the ability to write complex queries and optimize them for performance.
Hands-on experience with SAS programming (reading/writing SAS scripts) and creating/managing SAS stored processes.
Knowledge of cloud-based platforms, particularly Snowflake and AWS services (e.g., S3, Redshift, EC2, Lambda).
Ability to quickly learn and adapt to new databases and technologies.
Excellent communication skills, with the ability to work closely with clients and internal teams to gather requirements and deliver results.
Strong problem-solving abilities and attention to detail. Ability to handle large datasets and work with data in various formats (structured and unstructured).
Experience with data modeling, ETL processes, and database optimization techniques is a plus.
Familiarity with data warehousing concepts and modern cloud data architectures.
Prior experience in client-facing roles is a plus.
Good to know: Python programming for data manipulation, automation, or additional scripting tasks.
Technologies you'll use in this role (including but not limited to):
AWS Glue, Lambda, Athena, S3, Redshift/Postgres, ETL, SQL, Python/R, Agile, CI/CD
Working at ICF
ICF is a global advisory and technology services provider, but we’re not your typical consultants. We combine unmatched expertise with cutting-edge technology to help clients solve their most complex challenges, navigate change, and shape the future. We can only solve the world's toughest challenges by building an inclusive workplace that allows everyone to thrive. We are an equal opportunity employer, committed to hiring regardless of any protected characteristic, such as race, ethnicity, national origin, color, sex, gender identity/expression, sexual orientation, religion, age, disability status, or military/veteran status. Together, our employees are empowered to share their expertise and collaborate with others to achieve personal and professional goals. For more information, please read our EEO & AA policy.
Reasonable Accommodations are available, including, but not limited to, for disabled veterans, individuals with disabilities, and individuals with sincerely held religious beliefs, in all phases of the application and employment process. To request an accommodation please email Candidateaccommodation@icf.com and we will be happy to assist. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. Read more here: Requesting an Accommodation for the ICF interview process.
Read more about workplace discrimination rights, the Pay Transparency Statement, or our benefit offerings, which are included under the Transparency in Coverage rule.
Pay Range - There are multiple factors that are considered in determining final pay for a position, including, but not limited to, relevant work experience, skills, certifications and competencies that align to the specified role, geographic location, education and certifications as well as contract provisions regarding labor categories that are specific to the position.
The pay range for this position based on full-time employment is:
$69,862.00 - $118,765.00