Data Engineer (Geospatial) (TS/SCI with Poly Required)

Tysons Corner, Virginia, United States

GCI

Founded in 1989, GCI is a premier Engineering and Analytics firm with a steadfast commitment to national security and intelligence, specializing in Data Analytics, Software Development, Engineering, Targeting and Analysis, Operations, Training, and Cyber Operations.

GCI embodies excellence, integrity, and professionalism. The employees supporting our customers deliver unique, high-value mission solutions while effectively leveraging the technological expertise of our valued workforce to meet critical mission requirements in the areas of Data Analytics and Software Development, Engineering, Targeting and Analysis, Operations, Training, and Cyber Operations. We maximize opportunities for success by building and maintaining trusted and reliable partnerships with our customers and industry.

At GCI, we solve the hard problems. As a Data Engineer, your typical day will include the following duties:

JOB DESCRIPTION

The Data Engineer will manipulate data and data flows for both existing and new systems. Their previous experience must include a geospatial and telemetry focus. Additionally, they will provide support in the areas of data extraction, transformation, and load (ETL), data mapping, analytical support, operational support, database support, and maintenance support of data and associated systems. As a member of the team, candidates will work in a multi-tasking, fast-paced, dynamic, process-improvement environment that requires experience with the principles of large-scale (terabytes) database development, large-scale file manipulation, data modeling, data mapping, data testing, data quality, and documentation preparation.
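For illustration only, the shape of that ETL work might look like the following minimal Python sketch; the file name, record schema, and SQLite staging table are hypothetical stand-ins for the program's actual geospatial/telemetry sources and datastores.

    import json
    import sqlite3

    def extract(path):
        """Yield raw telemetry records from a JSON-lines file (hypothetical format)."""
        with open(path, encoding="utf-8") as fh:
            for line in fh:
                if line.strip():
                    yield json.loads(line)

    def transform(record):
        """Keep only records with geographically valid coordinates."""
        lat, lon = float(record["lat"]), float(record["lon"])
        if not (-90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0):
            return None  # drop out-of-range positions
        return (record["platform_id"], record["ts"], lat, lon)

    def load(rows, db_path="telemetry.db"):
        """Load transformed rows into a local SQLite staging table."""
        con = sqlite3.connect(db_path)
        con.execute(
            "CREATE TABLE IF NOT EXISTS telemetry "
            "(platform_id TEXT, ts TEXT, lat REAL, lon REAL)"
        )
        con.executemany("INSERT INTO telemetry VALUES (?, ?, ?, ?)", rows)
        con.commit()
        con.close()

    if __name__ == "__main__":
        rows = (transform(r) for r in extract("telemetry.jsonl"))
        load([r for r in rows if r is not None])

In practice these stages run at terabyte scale in tools such as NiFi or Pentaho against Oracle or Elastic rather than SQLite, but the extract/transform/load structure is the same.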

QUALIFICATIONS

  • Bachelor’s Degree in Computer Science, Electrical or Computer Engineering, or a related technical discipline, or the equivalent combination of education, technical training, or work/military experience
  • Minimum eight (8) years of related software engineering and ETL experience

REQUIRED KNOWLEDGE/SKILLS

  • Experience building and maintaining data flows in NiFi, Pentaho, and Kafka (a minimal streaming sketch follows this list)
  • Experience with the following languages and data formats: Java/J2EE, C, C++, SQL, XML, XQuery, XPath, Python, JSON
  • Experience visualizing geospatial data with ESRI and related technologies (e.g., Elastic)
  • Knowledge of analytic and targeting methodologies related to geospatial data
  • Familiarity with NoSQL datastores
  • Excellent organizational, coordination, interpersonal, and team-building skills
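
As referenced above, the following is a minimal sketch, assuming the kafka-python client and hypothetical topic and broker names, of the kind of streaming data flow the role builds and maintains: consume raw geospatial telemetry from Kafka, filter out malformed positions, and republish clean records for downstream NiFi/ETL stages.

    import json
    from kafka import KafkaConsumer, KafkaProducer

    consumer = KafkaConsumer(
        "telemetry.raw",                    # hypothetical source topic
        bootstrap_servers="broker:9092",    # hypothetical broker address
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    producer = KafkaProducer(
        bootstrap_servers="broker:9092",
        value_serializer=lambda d: json.dumps(d).encode("utf-8"),
    )

    for msg in consumer:
        rec = msg.value
        lat, lon = rec.get("lat"), rec.get("lon")
        # Forward only records with geographically valid coordinates.
        if lat is not None and lon is not None and -90 <= lat <= 90 and -180 <= lon <= 180:
            producer.send("telemetry.clean", rec)  # hypothetical sink topic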

DESIRED KNOWLEDGE/SKILLS

  • Familiarity with executing jobs in big data technologies (e.g., Hadoop or Spark)
  • Knowledge of server operating systems (Windows, Linux), distributed computing, blade centers, and cloud infrastructure
  • Strong problem-solving skills
  • Ability to comprehend database methodologies
  • Focus on continual process improvement with a proactive approach to problem solving

KEY RESPONSIBILITIES

  • Research, design, and develop better ways of leveraging geospatial and telemetry data flows in our enterprise-wide systems and/or applications
  • Use Java to manage and improve current NiFi code
  • Troubleshoot Oracle and Elastic datastores in the event of an outage (see the query sketch after this list)
  • Develop complex data flows or make significant enhancements to existing pipelines
  • Resolve complex hardware/software compatibility and interface design considerations
  • Conduct investigations and tests of considerable complexity
  • Provide input to staff involved in writing and updating technical documentation
  • Troubleshoot complex problems and provide customer support for the ETL process
  • Prepare reports on analyses, findings, and project progress
  • Provide guidance and work leadership to less-experienced engineers
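
The query sketch referenced in the list above is a hedged illustration, assuming the elasticsearch Python client (8.x-style API) and a hypothetical "tracks" index with a geo_point "location" field, of the kind of geospatial query used when validating an Elastic datastore or feeding a map-based visualization.

    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")  # hypothetical cluster endpoint

    # Find all track points within 25 km of a reference coordinate.
    resp = es.search(
        index="tracks",
        query={
            "bool": {
                "filter": {
                    "geo_distance": {
                        "distance": "25km",
                        "location": {"lat": 38.92, "lon": -77.22},
                    }
                }
            }
        },
    )
    for hit in resp["hits"]["hits"]:
        print(hit["_source"])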

*Candidates must be US citizens and hold an active/current TS/SCI with Polygraph clearance.

