Cloud Data Engineer
6314 Remote/Teleworker US, United States
Full Time | Senior-level / Expert | Clearance required | USD 104K - 189K
Leidos
Leidos is an innovation company rapidly addressing the world's most vexing challenges in national security and health. Our 47,000 employees collaborate to create smarter technology solutions for customers in these critical markets.
Looking for an opportunity to make an impact?
Leidos is seeking a customer-experience-focused Cloud Data Engineer to work with a team of subject matter experts and developers to design and implement full-lifecycle data pipeline services for Azure cloud-based data lake, SQL, and NoSQL data stores. As a data engineer, you will translate business requirements into data engineering solutions that support an enterprise-scale, Microsoft Azure-based data analytics and reporting platform. You will support continued maintenance of legacy ETL operations and the transition to modern cloud-native solutions. Our ideal candidate is mission-focused and delivery-oriented, and applies critical thinking to create innovative solutions and resolve technical issues.
Who we are
Leidos is a Fortune 500® technology, engineering, and science solutions and services leader working to solve the world’s toughest challenges in the defense, intelligence, civil, and health markets. The Leidos Civil Group helps the government modernize operations with leading-edge AI/ML-driven data management and analytics solutions. We are a trusted partner to both government and highly regulated commercial customers looking for transformative solutions in mission IT, security, software, engineering, and operations. Our customers include the FAA, DOE, DOJ, NASA, the National Science Foundation, the Transportation Security Administration, Customs and Border Protection, airports, and electric utilities, and we work with them to make the world safer, healthier, and more efficient.
This is a fully remote telework position with occasional travel to the Washington, D.C. metro area.
In this role, you will:
- Maintain and operate legacy ETL processes utilizing Microsoft SSIS, SQL, and other technologies.
- Develop and operate full-lifecycle, cloud-native data pipelines in Azure.
- Work closely with client personnel and team members to understand data requirements and develop appropriate data solutions.
- Support the design and implementation of data models and data pipelines for relational, dimensional, data lakehouse (medallion architecture), data warehouse, data mart, SQL and NoSQL data stores.
- Utilize Microsoft Azure services, including Azure Data Factory, Synapse Pipelines, and Apache Spark notebooks, along with Python, SQL, and stored procedures, to develop high-performing data pipelines (a brief illustrative sketch follows this list).
- As appropriate, redevelop or migrate existing SSIS extract, transform, load (ETL) scripts to Azure Data Factory.
- Identify, create, and prepare data required for advanced analytics, visualization, reporting, and AI/ML.
- Implement data migration, data integrity, data quality, metadata management, and data security functions to optimize data pipelines.
- Monitor and troubleshoot data related issues to maintain high availability and performance.
- Implement governance, build, deployment, and monitoring processes to automate platform operations.
- Actively support Agile DevOps process, including Program Increment planning.
- Actively engage in continuous learning to increase relevant skills.
- Maintain strict versioning and configuration control to ensure integrity of data.
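As a rough illustration of the lakehouse work described above, the sketch below shows what a hypothetical medallion-style bronze-to-silver refinement step might look like in a Synapse Spark notebook using PySpark. The storage paths, table, and column names are assumptions invented for this example, not details of the role.

```python
# Illustrative sketch only: a hypothetical bronze-to-silver refinement step in an
# Azure Synapse Spark notebook. Paths, tables, and columns are made up for this example.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze_to_silver_sketch").getOrCreate()

# Read raw (bronze) records previously landed in the data lake by an ingestion pipeline.
bronze = spark.read.format("delta").load(
    "abfss://lake@exampleaccount.dfs.core.windows.net/bronze/orders"
)

# Typical silver-layer conforming: enforce types, drop duplicates, filter bad rows.
silver = (
    bronze
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
)

# Write the refined (silver) table for downstream analytics, reporting, and AI/ML.
silver.write.format("delta").mode("overwrite").save(
    "abfss://lake@exampleaccount.dfs.core.windows.net/silver/orders"
)
```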
For this position, you must possess:
- BS degree in Computer Science or a related field with 8+ years of experience, or a Master's degree with 6+ years of experience
- 4+ years of experience developing and maintaining ETL processes using Microsoft SSIS, SQL, and related scripting languages.
- 4+ years of experience with more than one of the following scripting languages: SQL, T-SQL, Python, PySpark
- Experience working with Microsoft database and business intelligence tools, including SQL Server (with stored procedures), SSIS, SSRS, SSAS (cubes), and scripting languages such as PowerShell
- Experience designing and building ETL/data engineering solutions utilizing cloud services such as Azure Data Lake Services, Azure Synapse Analytics, Azure Data Factory, and Integration Runtime.
- Administrative, system-level experience with data management, database creation, user management/access control, change data capture, ETL package deployment, data modeling, scheduling, debugging, monitoring, security controls, and O&M for both on-premises and cloud-based data stores.
- Experience with data engineering solutions for SQL database, data warehouse, data mart, multi-dimensional models (e.g. SSAS).
- Demonstrated experience in supporting production, testing, integration, and development environments
- Open mindset and the ability to quickly adopt new technologies to solve customer problems
- Experience working on Agile projects with a cross-functional team.
- Detail-oriented and able to support multiple projects and tasks.
- Demonstrated commitment to continuous learning to increase relevant skills.
- Ability to successfully obtain a government-issued Public Trust clearance.
- US Citizenship
Not required, but additional education, certifications, and/or experience are a plus:
- Experience working on Federal government projects
- Experience working with data in the law, HR, financial management, inventory, property, and management domains
- Experience working with Azure DevOps.
- Knowledge of and experience with configuring ETL pipelines in the cloud, including VPC/VNet configuration, Integration Runtime, Gateway, and EC2/bastion host access settings
- Microsoft certification in Azure fundamentals, data engineering, or AI, or an AWS Certified Data Engineer certification.
Original Posting:
July 10, 2025
For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days, with an anticipated close date no earlier than 3 days after the original posting date listed above.
Pay Range:
$104,650.00 - $189,175.00
The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.
Tags: Agile Architecture AWS Azure Business Intelligence Computer Science CX Data Analytics Data management Data pipelines Data quality Data warehouse DevOps EC2 Engineering ETL Machine Learning NoSQL Pipelines PySpark Python Security Spark SQL SSIS Testing T-SQL
Perks/benefits: Career development Equity / stock options Gear