Data Integration Specialist - Department of Homeland Security
Washington, DC, US
Full Time · Mid-level / Intermediate · USD 85K–150K
The Leading Niche
**Description**
**Position Overview**
The Department of Homeland Security (DHS) seeks a **Data Integration Specialist** to lead the unification of siloed data systems for the **Business Resource Integration Dashboard for Government Efficiency (BRIDGE)** initiative. The specialist will architect and deploy scalable data pipelines that ensure seamless interoperability across financial, HR, procurement, and operational systems. The ideal candidate will bridge legacy platforms with modern AI-driven analytics while enforcing federal compliance standards.
**Key Responsibilities**
**1. Data Pipeline Development & Automation**
- Design and implement **ETL/ELT workflows** to ingest, cleanse, and harmonize data from 100+ disparate sources (e.g., SAP, ServiceNow, custom databases); a minimal sketch follows this list.
- Automate data validation rules to ensure accuracy for **budget execution, contract performance, and workforce analytics**.
- Optimize pipelines for performance (e.g., Apache Spark, Databricks) and compliance with **NIST SP 800-53, FISMA, and Privacy Act**.
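To make the workflow concrete, here is a minimal sketch of an extract-cleanse-validate-load pipeline in PySpark. Everything in it is illustrative rather than drawn from the posting: the S3 paths, the `contracts` schema, and the validation rules are hypothetical placeholders for whatever BRIDGE sources actually look like.

```python
# Minimal ETL sketch (PySpark). All paths, column names, and rules
# below are hypothetical placeholders, not actual BRIDGE sources.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bridge-etl-sketch").getOrCreate()

# Extract: read a hypothetical procurement extract.
raw = spark.read.option("header", True).csv("s3://example-bucket/procurement/*.csv")

# Transform: cleanse and harmonize field names and types.
clean = (
    raw.withColumn("obligation_amount", F.col("obligation_amount").cast("double"))
       .withColumn("fiscal_year", F.col("fiscal_year").cast("int"))
       .dropDuplicates(["contract_id"])
)

# Validate: keep rows passing simple accuracy rules; quarantine the rest.
valid = clean.filter(
    (F.col("obligation_amount") >= 0) & F.col("contract_id").isNotNull()
)
rejected = clean.subtract(valid)

# Load: write harmonized data and rejects to separate curated zones.
valid.write.mode("overwrite").parquet("s3://example-bucket/curated/procurement/")
rejected.write.mode("overwrite").parquet("s3://example-bucket/quarantine/procurement/")
```

Quarantining rejects rather than silently dropping them is one common way to preserve the audit trail that the compliance duties below call for.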
**2. System Interoperability**
- Develop APIs and middleware to connect legacy systems with cloud platforms (**AWS, Azure**) and the BRIDGE analytics environment (see the sketch after this list).
- Map data lineages and metadata schemas to enable cross-domain reporting (e.g., procurement spend vs. HR staffing levels).
- Partner with IT teams to enforce **FedRAMP** standards for cloud integrations.
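As an illustration of the middleware pattern only, the sketch below exposes a legacy table through a small read-only REST endpoint. FastAPI, the `legacy.db` SQLite file, and the `contracts` table are stand-ins; the posting does not specify the actual middleware stack or source systems.

```python
# Hypothetical middleware sketch: expose a legacy table as a REST API.
# FastAPI and the SQLite source are illustrative stand-ins only.
import sqlite3
from typing import Optional

from fastapi import FastAPI, HTTPException

app = FastAPI(title="BRIDGE legacy adapter (sketch)")

def query_legacy(contract_id: str) -> Optional[dict]:
    # Read-only lookup against a placeholder legacy store.
    conn = sqlite3.connect("legacy.db")
    try:
        row = conn.execute(
            "SELECT contract_id, vendor, amount FROM contracts WHERE contract_id = ?",
            (contract_id,),
        ).fetchone()
    finally:
        conn.close()
    if row is None:
        return None
    return {"contract_id": row[0], "vendor": row[1], "amount": row[2]}

@app.get("/contracts/{contract_id}")
def get_contract(contract_id: str) -> dict:
    record = query_legacy(contract_id)
    if record is None:
        raise HTTPException(status_code=404, detail="contract not found")
    return record
```

A thin adapter like this lets the analytics environment consume legacy data over a stable, documented interface without modifying the source system itself.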
**3. Governance & Compliance**
- Document data flows, security controls, and audit trails for **A-123, CFO Act, and GAAP** requirements.
- Conduct **FISMA-compliant** vulnerability scans on integrated systems.
- Train stakeholders on data stewardship protocols and access governance.
**4. Emerging Technology Adoption**
- Pilot **AI/ML models** (e.g., anomaly detection in financial transactions) using integrated datasets; a sketch of one such pilot appears below.
- Recommend modernization roadmaps to replace manual processes (e.g., contract oversight tracking).
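Below is a minimal sketch of a transaction anomaly-detection pilot using scikit-learn's `IsolationForest`. The data is synthetic and the features (amount, days-to-payment) and contamination rate are placeholders, not program specifics.

```python
# Anomaly-detection sketch using scikit-learn's IsolationForest.
# The data is synthetic; features and thresholds are placeholders.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)

# Synthetic "transactions": [amount, days-to-payment] per row,
# plus a small cluster of unusually large, fast-paying outliers.
normal = rng.normal(loc=[5_000, 30], scale=[1_500, 5], size=(1_000, 2))
outliers = rng.normal(loc=[90_000, 2], scale=[10_000, 1], size=(10, 2))
X = np.vstack([normal, outliers])

# Fit an unsupervised model; fit_predict returns -1 for anomalies, 1 otherwise.
model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(X)

flagged = X[labels == -1]
print(f"Flagged {len(flagged)} of {len(X)} transactions for review")
```

In a real pilot, flagged transactions would feed a human review queue rather than trigger automated action, which keeps the model advisory while its precision is evaluated.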
**Requirements**
- **Education:** Bachelor’s degree in Computer Science, Information Systems, or a related technical field.
- **Experience:**
  - 5+ years in data integration, including ETL development and API management.
  - Expertise in SQL, Python, Java, and integration tools (e.g., Informatica, MuleSoft).
  - Proven track record with federal data standards (NARA, FISMA, GAAP).