Data Modeling and Analytics
Rigorous data gathering and analysis methods
Many sources, including sensors, log files, databases, and qualitative methods, are producing data at previously unimaginable scales and in unprecedented detail. The volume of data has outpaced the ability, in government and elsewhere, to use that data to make decisions, enable system adaptation, and generate input to train machine learning (ML) algorithms.
We apply cutting-edge techniques from academic and commercial data analytics, often in collaboration with experts at Carnegie Mellon University. We develop ML algorithms and train them on the ground-truth data sets that we curate as an FFRDC. In that way, we tailor these technologies to benefit the Department of Defense, supporting our work in software cost estimation and control, human-machine teaming, autonomy and counter-autonomy, and system verification and validation.
We offer an approach that reduces risk and simplifies the selection of big data technologies when you acquire and develop big data systems.
Costs for large new systems are hard to estimate. We developed a method to quantify uncertainty and increase confidence in a program's cost estimate.
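One common way to quantify that uncertainty is to model each cost driver as a distribution rather than a point value and to propagate the uncertainty with Monte Carlo simulation. The minimal sketch below illustrates only that general idea; the driver names, distributions, and cost formula are hypothetical, and the sketch is not the QUELCE method described in the technical note below.

```python
import random
import statistics

# Hypothetical cost drivers, each modeled as a triangular distribution
# (low, most likely, high) elicited from experts. Names and values are
# illustrative only, not actual QUELCE inputs.
DRIVERS = {
    "staff_cost_per_month": (80_000, 100_000, 140_000),
    "duration_months":      (18, 24, 36),
    "rework_multiplier":    (1.0, 1.15, 1.5),
}

def sample_total_cost() -> float:
    """Draw one scenario: sample each driver, combine into a total cost."""
    s = {name: random.triangular(lo, hi, mode)
         for name, (lo, mode, hi) in DRIVERS.items()}
    return (s["staff_cost_per_month"]
            * s["duration_months"]
            * s["rework_multiplier"])

def simulate(n: int = 100_000) -> list[float]:
    """Run n Monte Carlo trials and return the sorted cost distribution."""
    return sorted(sample_total_cost() for _ in range(n))

if __name__ == "__main__":
    costs = simulate()
    median = statistics.median(costs)
    p10, p90 = costs[len(costs) // 10], costs[(9 * len(costs)) // 10]
    print(f"median cost:  ${median:,.0f}")
    print(f"80% interval: ${p10:,.0f} .. ${p90:,.0f}")
```

Reporting a median and an 80% interval, rather than a single number, lets a program manager see how much confidence an estimate actually warrants.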
August 17, 2016 • Technical Note
This technical note introduces Quantifying Uncertainty in Early Lifecycle Cost Estimation (QUELCE), a method for estimating program costs early in development.
May 16, 2016 • Conference Paper
John Klein; Ross Buglak (Data to Decisions Cooperative Research Centre); David Blockow (Data to Decisions Cooperative Research Centre)
This paper presents a vendor-neutral reference architecture for big data systems that addresses typical national defense requirements.
December 09, 2015 • Conference Paper
Presented at the 2015 Supercomputing Conference, this paper shows that dynamic parallelism enables relatively high-performance graph algorithms for GPUs.
November 18, 2015 • Presentation
This presentation includes a brief demonstration of tools created by SEI staff that help scan, analyze, and prepare data for use in a weekly metrics report.
History of Innovation at the SEI in Data Modeling and Analytics
For almost 30 years, the SEI has performed innovative research in data modeling and analytics that has benefited government, industry, and academia. Learn more about a few of the highlights.
Taming Uncertainty in Software Cost Estimation
Early cost estimates rely on expert judgments about cost factors, but cost factors change throughout the program lifecycle. The SEI's approach helps program managers account for these changing factors.
Certifying the Software Architect Role
In 2009, the U.S. Army mandated that all Program Executive Offices appoint a chief software architect who had earned the Software Architecture Professional Certificate from the SEI or an equivalent certificate.
Defining Non-Functional System Qualities
The idea that quality attributes influence the shape of an architecture and that the architecture is fundamental to a system emerged from 2003 research at the SEI in rate monotonic analysis.
Transforming Software Quality Assessment
The SEI's publication of the Software Capability Maturity Model in 1991 provided an objective standard for software development and changed the view in government and industry about software quality.