Data Engineer

Omaha, NE

Barbaricum

All-inclusive government contracting firm that develops innovative strategies

Barbaricum is a rapidly growing government contractor providing leading-edge support to federal customers, with a particular focus on Defense and National Security mission sets. We leverage more than 15 years of support to stakeholders across the federal government, with established and growing capabilities across Intelligence, Analytics, Engineering, Mission Support, and Communications disciplines. Founded in 2008, our mission is to transform the way our customers approach constantly changing and complex problem sets by bringing to bear the latest in technology and the highest caliber of talent.

Headquartered in Washington, DC's historic Dupont Circle neighborhood, Barbaricum also has a corporate presence in Tampa, FL, Bedford, IN, and Dayton, OH, with team members across the United States and around the world. As a leader in our space, we partner with firms in the private sector, academic institutions, and industry associations with a goal of continually building our expertise and capabilities for the benefit of our employees and the customers we support. Through all of this, we have built a vibrant corporate culture diverse in expertise and perspectives with a focus on collaboration and innovation. Our teams are at the frontier of the Nation's most complex and rewarding challenges. Join us.
Barbaricum is seeking a Data Engineer to support an emerging capability for the USSTRATCOM J2 at Offutt Air Force Base near Omaha, Nebraska. This individual will work to migrate existing ad hoc data flows to AWS on JWICS, making them available to the enterprise. Initially, the Data Engineer will use Python to automate data gathering and data cleaning efforts. Building on these foundational efforts, the Data Engineer will then develop, implement, and operate a data management system for the intelligence enterprise. Due to security requirements, this position must primarily be performed on-site; however, subject to project and customer requirements, team members may be given flexibility for limited remote support.
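For illustration, a minimal sketch of the kind of Python data-cleaning automation described above, assuming a simple CSV extract and pandas; the file names and cleaning rules are hypothetical, not specifics of the program:

```python
# Minimal illustrative sketch: automate a basic data-cleaning step with pandas.
# The input/output paths and cleaning rules below are hypothetical examples.
import pandas as pd


def clean_extract(src_path: str, dst_path: str) -> None:
    df = pd.read_csv(src_path)
    df.columns = [c.strip().lower() for c in df.columns]  # normalize column names
    df = df.drop_duplicates()                             # remove duplicate records
    df = df.dropna(how="all")                             # drop fully empty rows
    df.to_csv(dst_path, index=False)


if __name__ == "__main__":
    clean_extract("raw_report.csv", "clean_report.csv")
```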

Responsibilities

  • Design, implement, and operate data management systems for intelligence needs
  • Use Python to automate data workflows
  • Design algorithms, databases, and pipelines to access data and to optimize data retrieval, storage, use, integration, and management across different data regimes and digital systems
  • Work with data users to determine, create, and populate optimal data architectures, structures, and systems; and plan, design, and optimize data throughput and query performance
  • Participate in the selection of backend database technologies (e.g., SQL, NoSQL), their configuration and utilization, and the optimization of the full data pipeline infrastructure to support the actual content, volume, ETL, and periodicity of the data, the intended kinds of queries and analysis, and the expected responsiveness
  • Assist and advise the Government with developing, constructing, and maintaining data architectures
  • Research, study, and present technical information, in the form of briefings or written papers, on relevant data engineering methodologies and technologies of interest to or as requested by the Government
  • Align data architecture, acquisition, and processes with intelligence and analytic requirements
  • Prepare data for predictive and prescriptive modeling, deploying analytics programs, machine learning, and statistical methods to find hidden patterns, discover tasks and processes that can be automated, and make recommendations to streamline data processes and visualizations
  • Design, implement, and support scalable data infrastructure solutions that integrate with multiple heterogeneous data sources, aggregate and retrieve data quickly and safely, and curate data for use in reporting, analysis, machine learning models, and ad hoc data requests
  • Utilize big data technologies hosted in Amazon Web Services (AWS) to store, format, process, compute, and manipulate data in order to draw conclusions and make predictions (see the sketch after this list)
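For illustration, a minimal sketch of one S3-backed pipeline step of the kind described above, assuming boto3 and pandas; the bucket and key names are hypothetical placeholders rather than anything specified by the program:

```python
# Minimal illustrative sketch of one S3-backed pipeline step: read a raw object,
# curate it with pandas, and write the result back to S3 via boto3.
# The bucket and key names below are hypothetical placeholders.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")


def curate(bucket: str, raw_key: str, curated_key: str) -> None:
    # Pull the raw CSV object and load it into a DataFrame
    obj = s3.get_object(Bucket=bucket, Key=raw_key)
    df = pd.read_csv(io.BytesIO(obj["Body"].read()))

    # Example curation: standardize headers and keep only complete records
    df.columns = [c.strip().lower() for c in df.columns]
    df = df.dropna()

    # Write the curated result back as a new object
    s3.put_object(
        Bucket=bucket,
        Key=curated_key,
        Body=df.to_csv(index=False).encode("utf-8"),
    )


if __name__ == "__main__":
    curate("example-bucket", "raw/report.csv", "curated/report.csv")
```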

Qualifications

  • Active DoD Top Secret clearance required
  • 8+ years of demonstrated experience in software engineering
  • Bachelor’s degree in computer science or a related field. A degree in the physical/hard sciences (e.g., physics, chemistry, biology, astronomy) or other science disciplines (i.e., behavioral, social, or life sciences) may be considered if it includes a concentration of coursework (typically 5 or more courses) in advanced mathematics and/or other relevant experience
  • 8+ years of experience working with AWS big data technologies (S3, EC2) and demonstrated experience in distributed data processing, data modeling, ETL development, and/or data warehousing
  • Demonstrated mid-level knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations
  • 3+ years of experience using analytical concepts and statistical techniques
  • 8+ years of demonstrated experience across Mathematics, Applied Mathematics, Statistics, Applied Statistics, Machine Learning, Data Science, Operations Research, or Computer Science, especially around software engineering and/or designing and implementing machine learning, data mining, advanced analytical algorithms, programming, data science, advanced statistical analysis, and artificial intelligence

Preferred Experience

  • ArcGIS expertise
  • Experience using Python’s NumPy
  • Familiarity with Git-based revision control
  • Familiarity with DevSecOps analytics development

Additional Information

For more information about Barbaricum, please visit our website at www.barbaricum.com. We will contact candidates directly to schedule interviews. No phone calls please.

Category: Engineering Jobs

Tags: Agile Architecture AWS Big Data Biology Chemistry Computer Science Data management Data Mining Data Warehousing EC2 Engineering ETL Git Machine Learning Mathematics ML models NoSQL NumPy Physics Pipelines Python Research Security SQL Statistics Testing

Region: North America
Country: United States
