ETL Developer

Ashburn, VA (Non-specific Customer Site), United States

Leidos

Leidos is an innovation company rapidly addressing the world's most vexing challenges in national security and health. Our 47,000 employees collaborate to create smarter technology solutions for customers in these critical markets.


The National Security Sector within Leidos is seeking a highly skilled ETL Developer with expertise in Informatica PowerCenter, Informatica Intelligent Cloud Services (IICS), and Databricks to join our team. The ideal candidate will have extensive experience in designing, developing, and optimizing ETL pipelines, ensuring data integrity, and working with cloud-based data platforms. This role requires a strong background in data warehousing, cloud migrations, and performance tuning, along with a proven track record on complex projects with high-volume data transactions. This position supports the Passenger Systems Program Directorate (PSPD) Interface and Support Processes division within Customs and Border Protection (CBP). PSPD supports the critical missions of the Department of Homeland Security (DHS) and CBP, specifically screening and processing travelers at ports of entry (POEs) into the United States.

**This position REQUIRES onsite support in Ashburn, VA, twice a week**

Key Responsibilities:

  • Design, develop, and maintain ETL processes using Informatica PowerCenter, IICS, and Databricks (a minimal pipeline sketch follows this list).

  • Develop and optimize data pipelines to integrate structured and unstructured data from various sources.

  • Work closely with data architects and business analysts to understand data requirements and translate them into scalable ETL solutions.

  • Perform data profiling, quality checks, and implement best practices for data governance.

  • Optimize performance of ETL jobs and data pipelines to ensure efficiency and scalability.

  • Support cloud data migration efforts, including integrating Informatica with cloud platforms (AWS, Azure, or GCP).

  • Troubleshoot and resolve issues related to data integration, transformations, and workflow execution.

  • Document ETL designs, processes, and technical specifications.

  • Handle high-volume data transactions efficiently and ensure system scalability.
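
For illustration only, a minimal Databricks-style pipeline of the kind described above might look like the sketch below. The source path, schema fields, and table names (`/mnt/landing/crossings/`, `bronze.crossings`) are hypothetical, not part of the PSPD environment.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal sketch only: paths, columns, and table names are hypothetical.
spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw JSON records landed by an upstream feed.
raw = spark.read.json("/mnt/landing/crossings/")

# Transform: basic quality checks and normalization.
clean = (
    raw
    .filter(F.col("record_id").isNotNull())           # reject rows missing a key
    .withColumn("crossing_ts", F.to_timestamp("crossing_ts"))
    .dropDuplicates(["record_id"])                    # guard against replayed batches
)

# Load: append into a partitioned Delta table for downstream consumers.
(clean.write
    .format("delta")
    .mode("append")
    .partitionBy("port_code")
    .saveAsTable("bronze.crossings"))
```

Partitioning the Delta table by a low-cardinality column such as `port_code` is one common way to keep downstream reads efficient at high volume.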

Required Qualifications:

  • BA/BS with 12+ years of relevant experience, or a Master's degree with 10+ years of relevant experience; 4 additional years of experience may be considered in lieu of a degree

  • 12+ years of experience in ETL development, data integration, and data engineering

  • Must be able to obtain and maintain a CBP Background Investigation prior to start

  • Expertise in Informatica PowerCenter and Informatica IICS (Cloud Data Integration & Application Integration).

  • Hands-on experience with Databricks (PySpark, Delta Lake, Notebooks, Workflows).

  • Strong experience in SQL, stored procedures, and performance tuning (SQL Server, PostgreSQL, Redshift, Snowflake, etc.); a tuning sketch follows this list.

  • Experience working with cloud-based data platforms (AWS, Azure, or GCP).

  • Knowledge of data warehousing concepts, data modeling, and ETL best practices.

  • Familiarity with REST APIs, JSON, and XML data processing.

  • Experience with job scheduling tools like Control-M, Airflow, or similar.

  • Strong problem-solving skills and ability to work independently or in a team environment.

  • Excellent communication skills and ability to collaborate with cross-functional teams.

  • Proven experience working on large-scale, complex projects with high-volume data transactions.
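
As a sketch of the performance-tuning experience called for above, the following hypothetical PySpark job prunes partitions early and broadcasts a small lookup table so the large fact table is never shuffled; all table names and the filter value are assumptions for illustration.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

# Hypothetical tables illustrating a common high-volume tuning pattern.
spark = SparkSession.builder.getOrCreate()

facts = (
    spark.table("bronze.crossings")
    .filter(F.col("port_code") == "IAD")   # partition filter prunes input early
)
ports = spark.table("ref.ports")           # small dimension/lookup table

# Broadcasting the small side avoids shuffling the large fact table.
enriched = facts.join(broadcast(ports), "port_code", "left")

(enriched.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("silver.crossings_iad"))
```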

Preferred Qualifications:

  • Active CBP Background Investigation (BI)

  • Experience with CBP PSPD

  • Experience with Kafka, Spark Streaming, or other real-time data processing technologies (see the streaming sketch after this list).

  • Familiarity with CI/CD pipelines for ETL deployments.

  • Experience with Python, Scala, or Shell scripting for data transformation and automation.

  • Knowledge of Data Governance, Security, and Compliance standards.
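
The sketch below illustrates the kind of real-time processing named in the preferred qualifications, using Spark Structured Streaming's Kafka source. The broker address, topic, schema, and checkpoint path are all hypothetical.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# Illustrative only: broker, topic, schema, and paths are assumptions.
spark = SparkSession.builder.getOrCreate()

schema = StructType([
    StructField("record_id", StringType()),
    StructField("port_code", StringType()),
    StructField("crossing_ts", TimestampType()),
])

stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "crossings")
    .load()
    # Kafka delivers bytes; decode the value column and apply the schema.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("r"))
    .select("r.*")
)

# Write continuously into Delta; the checkpoint enables fault-tolerant restarts.
query = (stream.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/chk/crossings")
    .toTable("bronze.crossings_stream"))
query.awaitTermination()  # block so the job keeps consuming
```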

Original Posting:

March 28, 2025

For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.

Pay Range:

$126,100.00 - $227,950.00

The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.

Category: Engineering Jobs

Tags: Airflow APIs AWS Azure CI/CD Databricks Data governance Data pipelines Data Warehousing Engineering ETL GCP Informatica JSON Kafka Pipelines PostgreSQL PySpark Python Redshift Scala Security Shell scripting Snowflake Spark SQL Streaming Unstructured data XML

Perks/benefits: Equity / stock options

Region: North America
Country: United States
