Senior Data Engineer (Python) Remote
Nationwide Remote Office (US99)
ICF
We make big things possible for our clients. We provide the data, insights, and deep implementation expertise they need to deliver results that matter.
Our Health Engineering Systems (HES) team works side by side with customers to articulate a vision for success, and then make it happen. We know success doesn't happen by accident. It takes the right team of people, working together on the right solutions for the customer. We are looking for a seasoned Senior Data Engineer who will be a key driver in making this happen.
Responsibilities:
Design, develop, and maintain scalable data pipelines using Spark, Hive, and Airflow
Develop and deploy data processing workflows on the Databricks platform
Develop API services to facilitate data access and integration
Create interactive data visualizations and reports using AWS QuickSight
Build the infrastructure required for optimal extraction, transformation, and loading of data from a variety of data sources using AWS and SQL technologies
Monitor and optimize the performance of data infrastructure and processes
Develop data quality and validation jobs
Assemble large, complex data sets that meet functional and non-functional business requirements
Write unit and integration tests for all data processing code
Work with DevOps engineers on continuous integration/continuous delivery (CI/CD) and infrastructure as code (IaC)
Read specs and translate them into code and design documents
Perform code reviews and develop processes for improving code quality
Improve data availability and timeliness by implementing more frequent refreshes, tiered data storage, and optimizations of existing datasets
Maintain security and privacy for data at rest and while in transit
Other duties as assigned
Minimum Qualifications:
Bachelor's degree in computer science, engineering, or a related field
7+ years of hands-on software or data development experience (5 years with a master's degree)
4+ years of data pipeline experience using Python, Java, and cloud technologies
Candidate must be able to obtain and maintain a Public Trust clearance
Candidate must reside in the US, be authorized to work in the US, and work must be performed in the US
Must have lived in the US 3 full years out of the last 5 years
Preferred Qualifications:
Experience with Spark and Hive for big data processing
Experience building job workflows with the Databricks platform
Strong understanding of AWS products including S3, Redshift, RDS, EMR, AWS Glue, AWS Glue DataBrew, Jupyter Notebooks, Athena, QuickSight, and Amazon SNS
Familiarity with building processes that support data transformation, workload management, data structures, dependency management, and metadata
Experience with data governance processes to ingest (batch and streaming), curate, and share data with upstream and downstream data users
An experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up
Demonstrated experience with relevant software and tools, including NoSQL and relational SQL databases such as Cassandra and Postgres; workflow management and pipeline tools such as Airflow, Luigi, and Azkaban; stream-processing systems such as Spark Streaming and Storm; and functional and object-oriented languages including Scala, C++, Java, and Python
Familiarity with DevOps methodologies, including CI/CD pipelines (GitHub Actions) and infrastructure as code (Terraform)
Ability to obtain and maintain a Public Trust clearance while residing in the United States
Experience with Agile methodology and test-driven development
Job Location: This position requires that the work be performed in the United States. If you accept this position, note that ICF monitors employee work locations, blocks access from foreign locations/foreign IP addresses, and prohibits personal VPN connections.
Working at ICF
ICF is a global advisory and technology services provider, but we're not your typical consultants. We combine unmatched expertise with cutting-edge technology to help clients solve their most complex challenges, navigate change, and shape the future.
We can only solve the world's toughest challenges by building an inclusive workplace that allows everyone to thrive. We are an equal opportunity employer, committed to hiring regardless of any protected characteristic, such as race, ethnicity, national origin, color, sex, gender identity/expression, sexual orientation, religion, age, disability status, or military/veteran status. Together, our employees are empowered to share their expertise and collaborate with others to achieve personal and professional goals. For more information, please read our EEO & AA policy.
Reasonable Accommodations are available, including, but not limited to, for disabled veterans, individuals with disabilities, and individuals with sincerely held religious beliefs, in all phases of the application and employment process. To request an accommodation please email Candidateaccommodation@icf.com and we will be happy to assist. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. Read more here: Requesting an Accommodation for the ICF interview process.
Read more about workplace discrimination rights, the Pay Transparency Statement, or our benefit offerings, which are included in the Transparency in Coverage Act.
Pay Range - There are multiple factors that are considered in determining final pay for a position, including, but not limited to, relevant work experience, skills, certifications and competencies that align to the specified role, geographic location, education and certifications as well as contract provisions regarding labor categories that are specific to the position.
The pay range for this position based on full-time employment is:
$84,533.00 - $143,706.00