CMMI—A Progress Report

NEWS AT SEI

Author

Mike Phillips

This library item is related to the following area(s) of work:

CMMI
Process Improvement

This article was originally published in News at SEI on: January 1, 2005

My last column focused on where the CMMI product suite is headed for the next version. Since then, a SCAMPI Lead Appraisers and CMMI Instructors Workshop and the CMMI Technology Conference and User Group have provided insights on a variety of topics that warrant further discussion in this issue. I’ve organized the topics by the session where the topic was most specifically addressed.

SCAMPI Lead Appraisers and CMMI Instructors Workshop

Progress on the New Introduction to CMMI (Staged and Continuous) Course

There was a beta delivery of the new Introduction to CMMI (Staged and Continuous) course at the SEI in September. At the SEI Partner Workshop in October, SEI Partners were given a summary of the differences between this new course and the Introduction to CMMI courses that cover only one representation. Authorized Introduction to CMMI instructors who attended the workshop’s Instructor Upgrade Course were the first to be approved to offer the new course.
Feedback from these instructors prompted a needed adjustment in the schedule for offering the single-representation Introduction to CMMI courses. The new course is designed to be used with the Addison-Wesley book, CMMI: Guidelines for Process Integration and Product Improvement, which covers both representations. Partners believed that using the book might be preferable in some locations but difficult in others. Since this concern must be addressed before use of the book can be assured for all course offerings, SEI Partners will continue offering the staged or the continuous courses until this difficulty is resolved. Partner contracts have already been modified to allow the delivery of all three courses through 2005. By late FY06, the V1.2 Introduction to CMMI course update will be available, as will an SEI technical report that includes both representations.

The Introduction to CMMI (Staged and Continuous) course was first offered publicly before the CMMI Technology Conference and User Group in Denver, Colorado. Feedback from that offering was positive and reflected a growing recognition that the two concepts of improvement are complementary. (There’s more on the complementary aspects of the staged and continuous representations later in this column.) As we make the transition to the new course, we will continue to offer opportunities for authorized instructors to attend upgrade training that covers the differences between the new course and the single-representation courses. Beginning in February, all new instructors will be trained in the Introduction to CMMI (Staged and Continuous) course, as that is our direction for the future.

The SEI Code of Professional Conduct

The importance of professional standards of behavior in the delivery of SEI products and services has led to the creation of a Code of Professional Conduct. The SEI Partner Workshop allowed for extensive discussions on this topic with the SEI Partners responsible for both CMMI training and appraisals. The Code of Professional Conduct applies to all SEI Partners, SEI Authorized and Certified Professionals and Candidates, as well as SEI personnel who deliver any of the products and services available through the SEI Partner Network. (Technologies such as TSP/PSP and OCTAVE are offered as part of the SEI Partner Network as well as CMMI.) SEI Partner contracts now include a commitment to the Code of Professional Conduct for all SEI Partners. All SEI-Authorized CMMI instructors, SCAMPI Lead Appraisers, and candidate instructors and lead appraisers have committed to institutionalizing this professional standard. The Code can be viewed at http://www.sei.cmu.edu/partners.

SCAMPI and ISO

SCAMPI Lead Appraisers often find themselves assisting organizations that must also measure performance against various ISO standards. Two of the ISO standards discussed at the workshop were ISO 9001:2000 and ISO 15504. In the first case, organizations that are committed to quality improvement wish to increase the value of the appraisal effort by satisfying both CMMI and ISO 9001:2000 in a combined effort. Attendees discussed opportunities to explore joint appraisals that recognize the specific needs of both the ISO certifying bodies (for ISO audit certificates) and the SEI (for authorized SCAMPI appraisals).

The ISO 15504 standard requires both appraisal methods and improvement models. Here the focus of effort is to assure that CMMI performance measured by SCAMPI appraisals meets the ISO 15504 requirements so that the results can be used in development domains such as the automotive industry. Currently the European automobile industry is considering an approach called AutoSPICE. Some companies in that industry, companies using the CMMI Product Suite as their process improvement approach, want to be sure that their efforts are understood in comparison with ISO 15504 results derived from other models. The potential for pilot appraisals was discussed on this theme. Attendees expressed the wish to be sure that there are no impediments that would prevent SCAMPI appraisals, using the CMMI model, from depicting results that satisfy the 15504 standard, preferably in organizations seeking to harmonize their CMMI results with other improvement approaches.

CMMI Technology Conference and User Group

Maturity Levels and Performance

In his lunchtime keynote address to the workshop attendees, Mark Schaeffer, the DoD sponsor for CMMI and Director, Systems Engineering for the Office of the Secretary of Defense, praised the impact that CMMI has had in broadening process improvement into a unified approach for all DoD software-intensive systems. However, he also noted that his office sees too many organizations focusing on maturity levels rather than on real improvement: “When achieving a level replaces the focus on continuous improvement, we’ve lost sight of the goal.” He held that appraisal results are too often an end in themselves instead of meeting the government’s expectations for actual performance on complex, software-intensive system development. Schaeffer continued by saying, “The government expects that if you have achieved high maturity, then the next program will perform at that maturity.” He believes that the information the continuous approach makes available to the acquiring organization – the profile of capabilities across all of the relevant development areas of concern – is more useful than an organization’s broad, single-digit maturity-level claim.

Schaeffer indicated that his office would be examining ways of encouraging government use of the continuous (capability-level) appraisal results. As I mentioned above, Mr. Schaeffer’s statements are well timed with the development of V1.2 and a “single-book, single-course” approach that assures that the value of the continuous representation is seen by those who may be more familiar with the maturity-level legacy from the Software CMM. Responses from attendees at the first public offering of the new Introduction to CMMI course indicated that a number of the attendees gained a new appreciation for the ability of the continuous representation to enhance capabilities based on business-critical areas as well as the ability of the staged representation to further organizational maturity.

The SEI is providing more information about appraisal results than ever before at http://seir.sei.cmu.edu/pars/. Although the majority of organizations that have agreed to have their information made publicly available (as of this writing, about 130) chose to use a staged appraisal that would result in a maturity level, the depiction of performance is shown for all CMMI process areas.

The continuous appraisals performed differ only in adding the capability level achieved in each of the process areas. (Ten of these results have been made publicly available on the PARS site.) For the vast majority of the mid-level appraisals—those emphasizing the maturity level 2 and 3 process areas (i.e., managed and defined)—there are no real differences in the results. The reason for this is that the capability level achieved in each of the process areas matches the staged level requirements for satisfaction of the relevant generic goals. For a staged appraisal, the process area is described as “satisfied.” For a continuous appraisal, the depiction would be “capability level 2” for appraisals equivalent to staged maturity level 2, and “capability level 3” for appraisals equivalent to staged maturity level 3.
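The staged/continuous correspondence described above can be sketched as a small function. This is an illustrative toy only: the function name, the process-area string, and the rule as coded are my own simplification of the equivalence discussed in this column, not part of any SEI tool or the SCAMPI method.

```python
# Illustrative sketch (not an SEI artifact) of how the same appraisal result
# is depicted under the staged vs. continuous representations.

def depict(process_area: str, capability_level: int, representation: str) -> str:
    """Return the depiction of one process area's result.

    Simplified rule from the column: in a staged appraisal, a process area
    whose generic goals are met is simply "satisfied"; a continuous appraisal
    reports the capability level (e.g., 2 or 3) achieved for that area.
    """
    if representation == "staged":
        status = "satisfied" if capability_level >= 2 else "not satisfied"
        return f"{process_area}: {status}"
    return f"{process_area}: capability level {capability_level}"

print(depict("Requirements Management", 2, "staged"))
print(depict("Requirements Management", 2, "continuous"))
```

For mid-level appraisals the two depictions carry the same information, which is the point made above: the continuous view merely makes the per-area capability explicit.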

Mr. Schaeffer’s comments about the continuous approach are similar to those we have been receiving from the Australian Defence Materiel Organization’s pioneering work with defense contractors in its country. Its acquisition organization has focused on a broad understanding of capabilities seen within Australia’s supply chain rather than seeking maturity levels.

Mr. Schaeffer’s points about performance expectations also reflect the SEI’s longstanding interest in depicting—and emphasizing—performance results rather than maturity level achievement. Our report on CMMI achievement shows the performance benefits of the commitment to and investment in CMMI-based process improvement. We are also researching better ways to link process capability as measured by appraisal results with performance improvement in projects and organizations.

CMMI Coverage for Service Organizations

In my last column, I described an adjustment to our architectural approach for CMMI models that would allow future “constellations” of CMMI models that address domains often adjacent to the development effort. At the CMMI Technology Conference and User Group, the industry chair of the CMMI Steering Group announced that a team was forming to develop the first set of these related models. The plan is to maximize the common model elements while recognizing the unique practices of—and the differences among—organizations that provide or deliver services and those that develop products and services. We anticipate that the major difference between this new constellation and the existing development constellation will be within the Engineering process areas. If we find that there is terminology that should be adjusted in the existing CMMI development models to assure commonality, we will seek to make any such changes in the V1.2 revisions.

Northrop Grumman Corporation has offered to provide the resources to develop a prototype of the Services constellation in 2005. The development project will be performed within the existing CMMI Product Team structure. The CMMI for Services team will proceed on a parallel path with the V1.2 effort. We currently envision that a CMMI-Services model could be released a few months after the V1.2 update, currently scheduled for the latter part of FY 2006. I’ll be providing more information as this effort takes shape.

Conference Topic Coverage

For the benefit of those unable to attend the CMMI Technology Conference and User Group, I want to provide you with a sense of what was presented over the four days. Tutorials included
  • the CMMI’s increased coverage of systems engineering
  • the expanded family of SCAMPI appraisals
  • the power of linking the Team Software Process with CMMI deployment
  • moving from the Software CMM to CMMI
  • adopting CMMI in small organizations
  • using process simulations for continuous improvement
  • managing technical people
  • balancing agility and discipline

Tracks with multiple presentations included

  • CMMI and Process Improvement
  • CMMI Extensions
  • CMMI for Small Projects and Organizations
  • Appraisals
  • High Maturity
  • Practical Guidance
  • Return on Investments
  • Six Sigma
  • Systems Engineering
  • Tools
  • Transitioning to CMMI

Several companies expressed a focus on mission assurance and mission success. This theme crosses CMMI process-area boundaries to encompass process areas such as PPQA, RSKM, SAM, ISM, PI, VER, and VAL. This attention is driven by mission-critical stakes most easily seen in a single missile launch, whether to position a satellite in orbit or as part of missile defense. This area appears to be one of growing interest and will provide an opportunity for SEI technical notes or at least a future column dedicated to the topic.

SCAMPI Appraisals for Source Selection and Contract Monitoring

One of the key elements of progress in CMMI appraisals was the DoD’s direction to include the external evaluation mode—for source selection and contract monitoring—within the SCAMPI method. This became available with the V1.1 release of SCAMPI, which was accompanied by a guide for use in such benchmark (Class A) appraisals. Earlier approaches, such as the Software Capability Evaluation V3.0, encouraged risk-based evaluations of key areas of concern for a program. With the new Class B and C appraisals, renewed interest in this approach has been evident. Concerns about the SCAMPI Class A approach were twofold: (1) the time required to visit multiple contractor sites and accomplish a full Class A appraisal (typically a staged maturity level 3) would be at least a week for each site, and (2) the maturity level focus, as noted above, was often not well tuned to the program of interest. However, the interest in the faster, but potentially less rigorous, Class B SCAMPI raises legitimate concerns among the organizations receiving the appraisal. One approach that merits brief discussion here is a three-phased approach to appraisals.

The first phase is to use a risk-based appraisal, as mentioned by Mr. Schaeffer, for the source-selection element. A subset of CMMI process areas might be chosen for investigation across the proposing contractors. This approach would look for strengths or weaknesses in the development processes that would relate to risks to the proposed development effort. Satisfying CMMI goals, process areas, or maturity or capability levels would not be the point—only risks to the future system’s development.

With this approach to source selection, the CMMI Product Suite would become one of the program office’s tools for reducing risk in the program, rather than a leveling tool, where all competitors must have a given maturity level or capability level to compete. Since levels are not part of the source-selection decision, this approach would encourage process improvement under the CMMI collection of best practices without the “gotta show a level” strategies.

Recall that the greater concerns are about how process discipline will be applied in a new program—not how it was applied in previous ones. Often the new program is an amalgam of multiple contractors at multiple locations with a mixture of experienced professionals and new hires. So the second phase in the appraisal process is to conduct a baseline appraisal once the full team is established. Again, no maturity-level results from any of the contributing contractors are of any particular importance. What is important is the establishment of a clear understanding of the process framework for the program. While this may involve a commitment to common processes, the decision must be considered carefully by the team (i.e., a process trade study). Again, a prudent focus is to use a risk-based analysis of the process strengths and weaknesses, establishing action plans for future checks.

This in turn leads to the third phase, the contract-monitoring phase. This is the periodic checking of progress against the risks catalogued in the baseline phase. Often award-fee provisions can provide desirable incentives for progress. In one prototype use of this approach, the government lead appraiser noted that 41 risks were significantly mitigated within 18 months on a complex multi-contractor DoD intelligence program. Some follow-on programs using this approach are including the government element of the team in the appraisals.
Note that none of these phases require benchmark appraisals, but they do require healthy customer-supplier relationships and effective contract structures. This represents a mature approach to software-intensive systems acquisition.

Summary

I’ve sought to give you some insights into a wide variety of the recent activities involving elements of the CMMI Product Suite. Please continue to join the 50,000 folks who visit our CMMI Web site each day as we seek to continuously improve these tools for your use!

About the Author

Mike Phillips is the Director of Special Projects at the SEI, a position created to lead the Capability Maturity Model Integration (CMMI) project for the SEI. He was previously responsible for transition-enabling activities at the SEI.

Prior to his retirement as a colonel from the Air Force, he managed the $36B development program for the B-2 in the B-2 SPO and commanded the 4950th Test Wing at Wright-Patterson AFB, OH. In addition to his bachelor’s degree in astronautical engineering from the Air Force Academy, Phillips has master’s degrees in nuclear engineering from Georgia Tech, in systems management from the University of Southern California, and in international affairs from Salve Regina College and the Naval War College.

Please note that current and future CMMI research, training, and information has been transitioned to the CMMI Institute, a wholly-owned subsidiary of Carnegie Mellon University.
