Senior Data Engineer
USA-VA-Ashburn
Full Time · Senior-level / Expert · Clearance required · USD 130K - 241K * est.
General information
Requisition #: R61540 · Locations: USA-VA-Ashburn · Posting Date: 07/01/2025 · Security Clearance Required: Public Trust/Suitability · Remote Type: Hybrid · Time Type: Full time

Description & Requirements
Transform the future of federal services with ManTech! Join a vibrant, energetic team committed to enhancing national security and public services through innovative tech. Since 1968, we’ve partnered with Federal Civilian sectors to deliver impactful solutions. Engage in exciting projects in Digital Transformation, Cybersecurity, IT, Data Analytics, and more. Ignite your career and drive change. Your journey starts now: innovate and excel with ManTech!

ManTech seeks a motivated, career- and customer-oriented Senior Data Engineer to join our innovative team in Ashburn, VA. This is a hybrid position with 2 days onsite and 3 days remote.
Each day U.S. Customs and Border Protection (CBP) oversees the massive flow of people, capital, and products that enter and depart the United States via air, land, sea, and cyberspace. The volume and complexity of both physical and virtual border crossings require the application of “big data” solutions to promote efficient trade and travel.
Responsibilities include but are not limited to:
- Perform data analysis on large database tables to understand the data structures, definitions, and patterns that will be used to support various predictive models.
- Apply data analysis, problem solving, investigation, and creative thinking to massive amounts of data supporting a variety of operational scenarios; work closely with the client to manage data needs and support the collection of data for various operational requirements.
- Implement cloud techniques and workflows (on-prem to cloud platforms).
- Respond to data queries/analysis requests from various groups within an organization. Create and publish regularly scheduled and/or ad hoc reports as needed.
- Research and document data definitions for all subject areas and primary datasets supporting the core business applications.
- Responsible for source code control using GitLab.
- Demonstrate a strong practical understanding of application-relevant cargo and passenger data and the databases used to support analytic application development, functionality, and end-user (officer) targeting operations.
Minimum Qualifications:
- HS Diploma/GED and 15+ years or AS/AA and 13+ years or BS/BA and 7+ years or MS/MA/MBA and 5+ years or PhD/Doctorate and 3+ years
- Experience in application development/full life cycle on data warehouse engagements and at least 4 years’ experience in large (80TB+) and complex data warehousing architecture, design, and implementation/migration.
- Experience with one or more relational database systems such as Oracle, MySQL, PostgreSQL, SQL Server, etc., and experience in Extract-Transform-Load (ETL) development with knowledge of ETL concepts, tools, and data structures.
- Experience with cloud platforms like Amazon Web Services (AWS), Microsoft Azure, etc. and migrating customers/projects to the cloud.
- Experience working in Unix/Linux environments.
- Experience with shell scripting and scheduling cron jobs.
- Experience with a modern cloud data warehouse such as Redshift, Databricks, or BigQuery.
Preferred Qualifications:
- Experience with NoSQL databases (MongoDB, DynamoDB, or DocumentDB).
- Database query tuning and other performance-enhancement methodologies are a plus.
- Knowledge of Continuous Integration and Continuous Delivery (CI/CD) tools.
- Ability to multitask efficiently, work comfortably in an ever-changing data environment, and work well both in a team setting and independently.
- Excellent verbal/written communication and problem-solving skills, with the ability to communicate information to a variety of groups at different technical skill levels.
- Experience with relational databases and knowledge of query tools and/or BI tools like Power BI or Business Objects (BO) and data analysis tools.
- Experience with the Hadoop ecosystem, including HDFS, YARN, Hive, Pig, and batch-oriented and streaming distributed processing methods such as Spark, Kafka, or Storm.
Clearance Requirements:
- Must be a U.S. citizen with the ability to obtain DHS CBP suitability prior to starting this position.
Physical Requirements:
- The person in this position needs to occasionally move about inside the office to access file cabinets, office machinery, or to communicate with co-workers, management, and customers, which may involve delivering presentations.
ManTech International Corporation considers all qualified applicants for employment without regard to disability or veteran status or any other status protected under any federal, state, or local law or regulation.
If you need a reasonable accommodation to apply for a position with ManTech, please email us at careers@mantech.com and provide your name and contact information.
* Salary range is an estimate based on our AI, ML, Data Science Salary Index.