2021 Research Review / DAY 2
Rapid Adjudication of Static Analysis Alerts During Continuous Integration
The DoD has directed a shift toward continuous integration/continuous deployment (CI/CD) to maintain a competitive edge [McMurry 2018]. It is now standard to run automated unit, integration, and stress tests during CI builds, but static analysis (SA) tools are often omitted from builds because CI time frames are too short. However, running SA tools during CI builds could detect code flaws early in the development process, when they are cheaper to fix.
It is increasingly common to use multiple SA tools and combine their alerts to maximize the identification of potential security flaws [Delaitre et al. 2018]. However, current SA tools produce false positive (FP) alerts that require humans to inspect the code and manually adjudicate each alert as true or false [Heckman 2011]. We use the term alertCondition to designate an alert from a tool mapped to a member of an external taxonomy of conditions (code flaws), for instance, CWE-190 from the CWE taxonomy. If SA is used within CI, alertConditions could stop a build and force human adjudication of true positive (TP) vs. FP, which slows development but might be an acceptable tradeoff if the slowdown is limited and/or occasional. Furthermore, many previously adjudicated FP alerts reappear each time an SA tool runs on a subsequent code version.
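To make the alertCondition notion concrete, here is a minimal Python sketch of mapping a tool alert to a taxonomy condition and suppressing re-adjudication of alerts already marked FP in a prior code version. All names and the fingerprint scheme are illustrative assumptions, not the project's actual design:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AlertCondition:
    """An SA tool alert mapped to a member of an external taxonomy (e.g., a CWE ID)."""
    tool: str        # e.g., "cppcheck" (hypothetical example tool)
    checker: str     # tool-specific checker name
    condition: str   # taxonomy member, e.g., "CWE-190"
    filepath: str
    line: int

def fingerprint(ac: AlertCondition) -> tuple:
    # Deliberately omit the line number so a prior adjudication survives
    # unrelated edits that shift code; real systems use more robust
    # fingerprints (e.g., hashes of the surrounding code).
    return (ac.tool, ac.checker, ac.condition, ac.filepath)

def needs_adjudication(ac: AlertCondition, prior_verdicts: dict) -> bool:
    """Return True unless a prior verdict already covers this alertCondition."""
    return fingerprint(ac) not in prior_verdicts

# An alert adjudicated as FP on an earlier code version...
prior_verdicts = {("cppcheck", "integerOverflow", "CWE-190", "src/parse.c"): "FP"}
# ...does not need re-adjudication when it reappears on the next version.
new_alert = AlertCondition("cppcheck", "integerOverflow", "CWE-190", "src/parse.c", 88)
print(needs_adjudication(new_alert, prior_verdicts))  # False: already adjudicated
```

Carrying verdicts forward this way addresses only the reappearing-alert problem; classifying never-seen alerts is where the project's machine learning comes in.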
To maintain development velocity, DoD organizations with a continuous authority to operate (ATO) process have been forced to make tradeoffs in their security development testing and evaluation processes. For example, one organization removed SA tools from the CI/CD process, substituting a more expensive, less agile, and later manual review. Another kept SA tools, but reduced their sensitivity and analyzed only a small subset of the alerts, which introduced false negatives. We take the latter approach as a starting point, our goal being to increase efficiency by automating this process.

This research project will use machine learning and semantic analysis of data generated during CI/CD to reduce the number of alerts requiring human adjudication by 50% in multiple SA tool deployments without slowing the development process. More specifically, this project will
- improve the state of the art in reducing false positives and integrating SA tools into CI/CD processes
- improve the state of the practice by delivering and validating a prototype system that implements the new algorithms and measures the effectiveness of the techniques
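One way such a system could cut human workload, sketched below in Python, is to band alerts by a classifier's predicted probability of being a true positive: auto-dismiss the low band, auto-report the high band, and queue only the uncertain middle band for adjudication. This is a minimal illustration with hypothetical thresholds and alert IDs, not the project's algorithm:

```python
def triage(scored_alerts, lo=0.2, hi=0.9):
    """Partition (alert_id, p_true) pairs into dismiss/confirm/human queues.

    p_true is a classifier's estimated probability that the alert is a
    true positive; lo and hi are tunable confidence thresholds.
    """
    dismissed, confirmed, to_human = [], [], []
    for alert_id, p_true in scored_alerts:
        if p_true < lo:
            dismissed.append(alert_id)    # likely FP: suppress automatically
        elif p_true >= hi:
            confirmed.append(alert_id)    # likely TP: report without review
        else:
            to_human.append(alert_id)     # uncertain: adjudicate manually
    return dismissed, confirmed, to_human

scored = [("a1", 0.05), ("a2", 0.95), ("a3", 0.50), ("a4", 0.10)]
d, c, h = triage(scored)
print(h)  # ['a3']: only 1 of 4 alerts reaches a human
```

Widening or narrowing the [lo, hi) band trades automation against the risk of false negatives (wrongly auto-dismissed TPs), the same tradeoff the reduced-sensitivity approach described above makes by hand.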
In Context
This FY20-22 Project
- builds on a number of previous projects, including “Rapid Construction of Accurate Automatic Alert Handling System: Model & Prototype” and “Running in the Cloud Without Breaking the Bank”
- aligns with the CMU SEI technical objective to make software trustworthy in construction, correct in implementation, and resilient in the face of operational uncertainties, including known and yet unseen adversary capabilities
Principal Investigator
Lori Flynn
Senior Software Security Engineer
SEI Collaborators
Tyler Brooks
MTS, Engineer
Lyndsi A. Hughes
Systems Engineer
Jeffrey Mellon
Machine Learning Research Scientist
Joseph Sible
Associate Software Engineer
David Svoboda
Software Security Engineer
Joseph D. Yankel
MTS, Senior Engineer
Rhonda Brown
Member of the Technical Staff
Ebonie McNeil
DevOps Engineer
Wei-Ren Murray
Software Engineer
Matt Sisk
Member of the Technical Staff
Dustin Updyke
MTS, Senior Engineer
Mentioned in this Article
[Delaitre et al. 2018]
Delaitre, Aurelien M.; Stivalet, Bertrand C.; Black, Paul E.; Okun, Vadim; Cohen, Terry S.; & Ribeiro, Athos. SATE V Report: Ten Years of Static Analysis Tool Expositions. Special Publication (NIST SP) 500-326. 2018.
[Heckman 2011]
Heckman, Sarah & Williams, Laurie. A systematic literature review of actionable alert identification techniques for automated static code analysis. Information and Software Technology. Volume 53. Number 4. 2011. Pages 363-387.
[McMurry 2018]
McMurry, Robert D. & Roper, William B. Establishment of Air Force Program Executive Officer (PEO) Digital. [Memorandum for all AFPEOs.] Washington, D.C., Department of the Air Force. August 29, 2018. https://www.hanscomreps.org/wp-content/uploads/2018/09/20180829-PEO-Digital-Establishment-Memo-Signed.pdf