Discussion with Members of the CMM Integration (CMMI) Steering Group



Bill Pollak

This library item is related to the following area(s) of work:

Process Improvement

This article was originally published in News at SEI on: September 1, 1998

Moderated by Bill Pollak

In this article, CMMI Steering Group members Philip S. Babel (Aeronautical Systems Center, Air Force Materiel Command), Joan Weszka (Lockheed Martin Corporation), Hal Wilson ('Net Solutions, Litton PRC), and Michael G. Zsak, Jr. (Office of the Secretary of Defense, Acquisition & Technology) engage in a wide-ranging discussion about the CMMI Project. The views expressed in this article are those of the participants only and do not represent directly or imply any official position or view of the Software Engineering Institute or Carnegie Mellon University. This article is intended to stimulate further discussion about these topics.

The systems engineering context

Bill Pollak (moderator): A strong emphasis of the CMMI work is the need to   consider software engineering in a systems engineering context. Why do you   think this is so important?

Phil Babel: In the services and the OSD, we are involved in the   development of defense systems, which involve concurrent hardware, software,   and other implementation technologies. Since all of these have to be done in   parallel, some people call it concurrent engineering. The systems engineering   process is really the bigger process that oversees the rest of the   subprocesses, so software engineering could be thought of as a piece that has to   fit inside systems engineering. That’s especially important because the front   end of the software engineering process is initiated by the systems engineering   delivery of a good, solid set of requirements and top-level designs. And on the   back end of the development process, the testing and integration phase is   really a systems engineering process. So software engineering kind of fits in   the middle and runs in parallel with hardware engineering. And I think that’s   why it’s so important that we have an integrated process improvement strategy.

Joan Weszka: I think what we’re all trying to achieve is   enterprise-level process improvement. And to do that, we need to take advantage   of synergies across all the disciplines, including software and systems   engineering, in order to maximize the benefits of process improvement. Rather   than taking a stovepiped, single-discipline approach, taking an integrated,   enterprise-level approach to process improvement should yield much greater   efficiency as well as benefit. A first logical step for those involved in   software systems development is to adopt an integrated process and improvement   strategy for systems and software engineering.

BP: I assume that these conclusions are based on experience   with trying to apply the CMM in organizations.

JW: At Lockheed Martin, we have used the individual,   discipline-specific models, so we understand the cost and benefits gained. We   also have considerable experience in applying integrated product development   (IPD), where we deploy integrated product teams having representation from all   disciplines across the life cycle, and we have some tremendous success stories   in that arena. We expect to see analogous advantages accrue as a result of   using the integrated CMMI models since we'll be exploiting commonality and   leveraging across the disciplines just as we do with IPD.

Mike Zsak: I think if you look at the major change we made with our DoD policy documents back in ’96, we brought all of what used to be the stand-alone stovepipes, like software engineering and reliability and maintainability (R&M), under the umbrella of systems engineering. So actually, our policy architecture is now structured so that software engineering is one of the engineering disciplines considered as part of systems engineering. It fits in with configuration management, and R&M, and all of the other disciplines. There are a number of reasons we did this, the first of which is efficiency—we had some experience where folks would work their particular discipline, like R&M or others, and they would optimize their discipline, but the system would be suboptimized. Also, when you look at what goes on in software engineering and many other functional disciplines, you find that they have a lot in common with what is done at the system level—the general thrust and the objectives are very similar. For example, the approach used for risk management in software engineering is very similar to the approach used in systems engineering. You may get into unique implementation methodology, but the basic approach and concepts of risk management are the same in both cases.

BP: So there’s an opportunity to exploit those commonalities.

Hal Wilson: Let me shift the perspective to a more general view at   the process level. At Litton PRC, we measured our results and we started to   look at what went well and what didn’t. And we found that for many of the   problems that were occurring in very complex implementations, although the   problems were primarily software driven, they really stemmed from our having stopped   short of a complete, system-wide view. And, from a software perspective, as we   went across the boundaries and embraced more of the system-wide   considerations—when we brought the systems engineering environment together   with our software engineering environment—it became obvious that we needed to   have that wider perspective. Mature organizations that measure and verify   what’s going on and how to improve things find that the stovepiped view of a   pure software or pure systems engineering context is limiting. We found that we   needed to bring the two together. And that’s why we’re so supportive of this   whole CMMI activity.

PB: The key is that we are primarily in the systems   development business, and thus we need integrated systems engineering tools,   including process improvement support tools.

Adapting the CMMI framework to an organization’s needs

BP: The A-Spec lists eight deliverables from the CMMI   framework that include various combinations of staged and continuous capability   models (CMs), with and without integrated product and process development   (IPPD). How would an organization determine which of these combinations would   best meet its needs?

JW: First and foremost, an organization needs to understand   what its business is. When we talk about process improvement, we have to   consider an organization’s goals and business objectives. So for example, if an   organization is involved only in software engineering, then the software   stand-alone model might suffice. However, if an organization is involved in   systems development, then it makes sense to adopt one of the integrated models.

HW: I think that most of our organizations are large and very   diverse. There are elements within individual corporations that are going to   have different needs. In fact, depending on the type of project your   organization executes, you may need to tailor almost on a project basis. You   may not have a large enough project to do a full IPPD environment; or you may   have a software-only development project and you may want to tailor down your   processes from your full corporate set. The CMMI would allow an element of a   large organization to select a CMMI product that matches the organization’s   need. The product-selection technique would allow you to choose and get a   product that will allow you to make sure that you’re tailoring   correctly—especially to make sure that what you chose as a project-specific   single-discipline environment is compatible with a multi-discipline version   that another part of the organization might have selected. The CMMI product   suite would give you a verification of that. As Joan said, you need to know   what you’re doing, and what your business is, but at the same time, you might   have variations within your business that are facilitated by the ability to   select products within the CMMI.

Staged vs. continuous representations

BP: What about the choice between the staged and continuous   representations?

JW: I think that there will be guidance offered to the community on how they might decide which representation to use. For example, if an organization is using a staged model, such as the SW-CMM, then it might consider using one of the integrated models that is a staged representation. From a cultural perspective, it might be easier to transition to an integrated model having a familiar architecture. The fact is that, as in all technology transition, we have to deal with the cultural issues, which as we all know are the most difficult ones. If an organization has not yet adopted any of the legacy discipline-specific models for process improvement, and is just beginning a process improvement program, then the continuous model may be a logical choice. In the longer term, I think the continuous model has a lot more adaptability, flexibility, and room for growth, and facilitates extending the scope of a process improvement program.

BP: What is behind the decision to preserve both the staged   and continuous representations of process development? Wouldn’t it be simpler   to choose one or the other?

HW: I think there’s a characteristic of staged and continuous that has a lot to do with knowing what you’re getting into. The continuous model presumes that you know what would be the logical choices for your organization. One of the values of an early adoption of a staged model is the fact that it basically tells you how to get there, and shows you what steps to take. It’s very hard for organizations that are just starting to see their way clear to make decisions on “Which of these practices should we do first, how do we approach these, and what level should we be at in each one?” It does require a mature knowledge of what your business practices are, and what you need for your business. And I think a staged model is very comforting to an inexperienced organization judging risk, particularly to its management when they are making an investment without a lot of experience. As organizations mature, build a process-based culture, and learn how processes get adopted and adapted within their organizations, the tendency will be to move from a staged view to a more continuous one. Once the mechanisms become embedded in the organization, they are part of the organization’s culture. So I think that although the tendency has been that organizations new to the game are more comforted by a staged model, those of us who have been at it a while would say that the continuous model has more validity in the long term.

PB: I believe many engineering professional representatives   of the medium and large systems development companies, who are familiar with   multi-level staged models, see the advantages of continuous representation   models. These engineers fully understand the implications of sequenced process   improvement and have the experience and knowledge to make the appropriate   choices as to what to improve and in what order, consistent with their business   objectives. I think another part of the answer to this question is that   although it might have been simpler to choose one or the other representation   for the CMMI framework, it wouldn’t necessarily have been the better or more   effective choice. And I think that’s where we’ve had a lot of discussion, a lot   of different points of view, to suggest that we should implement and support   both. And even though it’s more difficult, we would have a lot more to offer in   the long run. I think it’s the right thing to do.

HW: I think you’re right, Phil. And while it would be simpler   to choose one or the other, we risk disenfranchising a portion of the industry   if we do. That would have a much greater impact than to take the effort to do   both.

PB: I think the other point, in case it’s not obvious, is   that we’re bringing together two models, systems engineering and software   engineering, which exist in the two different representations, continuous   (systems engineering) and staged (software engineering).

Investments in previous CMMs

BP: A major concern of users is likely to be the protection   of legacy investments in implementing previous CMMs. How is the CMMI effort   responding to these concerns?

HW: I’m not even sure that’s the right question in the sense of legacy. I know everybody’s concerned with preserving their legacy, but I don’t think that means restricting change. At Litton PRC we’ve been bringing together all of our software and systems engineering into a single integrated set over the last two and a half or so years. We predated the activity of the CMMI, and naturally made some missteps along the way. But the issue, I think, is not so much impacting or protecting the legacy, but rather improving the legacy. And when you begin to look at the process of continuous improvement, you know that some adjustments will be made to gain improvement. Starting as we did with the SW-CMM, which was a staged model, and integrating with it the continuous aspects of the systems engineering model, we had to make choices individually on the various practices. We had to decide how to make that come together just as the product development team (PDT) is now doing on the CMMI effort. Unfortunately, we had to do that without the help of the rest of the industry. So the CMMI effort is really trying to bring that together for organizations to keep the impact of having to make those decisions to a minimum—so when the PDT does make those decisions, companies won’t be at odds with the rest of the industry. Companies will have a way to validate that they are indeed doing the correct things. The reason it takes so long, when you’re doing it from your own perspective, is that you’re really trying to determine where this will be in the future. When no one else is giving you input, it’s very difficult to do it by yourself. That’s the value of a technical legacy, because if you’re going to improve your environment and your current disciplines, you’re going to have to make some choices. If you’re not confident that the choices you’re making are in the general direction that the industry is going, you may be in a backwater before you know it. So the CMMI is actually the way to protect your legacy and make sure that your organization will proceed the way that the industry is going.

MZ: We’ve heard this question a lot, and I still struggle with it. You have to be careful about your definition of “protecting legacy.” If your definition is that what you’ve done in the past is going to be just as applicable in the future, that there’s not going to be any change, then the answer is, we’re not protecting the legacy, because there is going to be change. Now what we’ve tried to do is to take into consideration what changes are going to occur, and make sure that the changes are worthwhile. We’ve been very clear in saying that when we come out with the software version of the CMM, it will not be identical to Version 2.0 that was in the draft stage. We are taking steps to provide the audit trail from the old Version 2.0 to the version where we eventually wind up. But, if protecting legacy means not making any changes, then we’re not going to protect legacy. If you’re going to improve, you’re going to make changes.

JW: I think one of the key enablers to protecting legacy   investments is the decision that we made to provide both the continuous and   staged model representations. I certainly agree with the point that, if an   organization is currently using the SW-CMM or one of the systems engineering   models, there will likely be changes involved when a CMMI model is adopted. The   changes may be due to additional or expanded practices in the CMMI model. The   community has come to expect that there will be evolutionary changes and model   improvements, and expects explanations and mappings to trace what has changed   over the previous versions. You will find the same representations or   architectures used in the CMMI models that were used in the CMMI source models   (the SW-CMM, SECM, and IPD-CMM). Thus, organizations will be able to adopt the   CMMI models in the representation (staged or continuous) they’re using. This   will allow legacy investment to be preserved. Over time, when an organization   is ready to transition from whatever representation it's using to another,   benchmarking can be used to facilitate the transition. We’ve said that the CMMI   assessment method will provide consistent results across the two   representations. So, if you’re currently using a staged representation, and you   conduct an appraisal, you could conduct another appraisal using a continuous   model and compare the results. So, as you transition from one representation to   another, the impact, if any, will be very clear. An important point to note is   that the input models to the CMMI effort are the existing legacy models   contained in the source documents of the A-Spec. Had the CMMI project   started with a clean sheet of paper, there would have been no notion of   preserving legacy investment in process improvement. I think our A-Spec   approach requiring use of legacy models is another indication of what we’re   trying to do to preserve legacy investment.

HW: I think that’s an excellent point. It’s the assumption   that we’ve made through all of this. When we speak to the legacy, my concept of   legacy is really the organization’s processes and practices that they have   built their organization and their maturity upon. That’s the legacy that you   consistently and constantly improve. So, in that sense, you expect to change   and migrate as you go forward. But you don’t want to throw everything away. I   think that’s been the underlying issue in protection—we start with things that   people are already doing, that have been well established and that they’re   comfortable with, and that the industry itself has formed a consensus about.   And that’s the protection. The improvement is natural, and that’s where I think   the distinction has been made between what you start with as a legacy and what   you would like to retain in terms of concept, procedure, and discipline—your   organizational heritage—and then move forward with that. The big concern that   everyone had was whether the CMMI would diverge completely from where they   were, and I think the answer to that is “no”—it’s going to start with where   they were and move forward.

BP: What about legacy investments in training and tools?

HW: I think if you look at the way organizations incorporate   practice and extend maturity … as they move from levels of maturity forward,   they modify their internal training materials. They don’t necessarily use a   standard set; they improve. And when you look at what the CMMI is doing,   particularly in the core areas, they’re really bringing together separate   training elements that might have to be trained from a different perspective   into one cohesive set. As you move forward, you pick a consolidated systems and   software engineering approach. If you pull one of those products from the CMMI,   whether it be staged or continuous, you will have a consolidation of the   essential elements of both, rather than having two totally separate training   activities, as you would today. And I think if anything, it should minimize the   impact. If you are truly going forward and incorporating the two disciplines,   you should see the benefit immediately. If you’re going to stay within just one   discipline, then you should not see a significant difference. But, in either   case, you will get the essential elements. And I think if there is an impact on   training, it will be on the organizations themselves. They will have to decide   how they take and improve their internal training processes to bring the   software-specific elements and move them slightly into the background, and to   move the core elements forward within the total view. That, I think, is a   tremendous value improvement, rather than a detriment. But the fact is, you still   have to go through it. There is going to be some cost to move forward, even in   going from one level to the next in the current model.

JW: I think we have to keep our eye on the return on   investment from a longer term perspective. So, yes, there may be some impacts   near term, to consolidate training, to integrate tools, and possibly to make   some changes to tools if in fact they don’t integrate. But the real objective   is to realize long-term savings. To achieve the return on investment, which we   expect will be achieved, we may need to invest in the near term. However, we’d   expect to see tremendous payoff in the longer term.

CMMs for other functional disciplines

BP: What are some other disciplines not currently   incorporated into the CMMI framework that you anticipate being incorporated in   the future?

PB: As you know, what we’re trying to do is develop a   framework and an architecture that allows us to add additional   development-related and enterprise-related disciplines. A couple have been   mentioned in some of our forums. The ones that we’re actually going to add will   come from the real development users and the technology and operational needs,   as they advance and grow. But I think potentially, some of them might be   artificial intelligence or the expert systems engineering area, complex   hardware development activities, security engineering/trusted systems   engineering, safety-critical systems, where extra processes have to be overlaid   on the baseline processes… All of those relate to technical engineering activities.   But if you go to a broader development perspective, we could even think about   broader coverage of the basic program management activities. It’s too early to   say now how this is all going to come together. It’s going to depend on what   the industry and what the actual developers believe as they begin the process.

JW: Another area for future CMMI model expansion that has   been suggested by the user community is systems acquisition.

PB: And in that sense, if you look at the full life cycle of engineering systems, you might say that maintenance and supportability, for example, might be appropriate disciplines.

Relationship with international standards

BP: I note that the A-Spec includes a requirement that   the CMMI product suite be compatible with ISO 15504. How is the relationship   with international standards evolving?

MZ: There’s certainly a sensitivity to make sure that what we do here takes into consideration what’s happening in the international arena. We don’t want to develop something that isolates the companies in the U.S. from the international marketplace. A number of folks on the team—on the Steering Group or the product development teams or stakeholder reviewers—are involved at the international level with ISO/IEC documents. I haven’t counted them, but I know there’s a fair number involved with the U.S. TAG (Technical Advisory Group) to JTC1/SC7, which is the international group under ISO/IEC that’s responsible for 15504, 12207, 15288, and other related documents. So I think that the folks who are working in both arenas are involved, which gives the CMMI effort first-hand knowledge of what’s happening and a view of what the draft documents are like. That, I think, is one of the main reasons for having it in the spec—so that we make sure we don’t ignore what’s happening in the international arena. We want to make sure that we don’t put our industry in a position of being unable to compete in a world marketplace.

HW: I think that sums up the whole issue. One of the things we want to make sure of is that the term “evolving” is really the key term. Even with a statement of compatibility, we have to recognize that over time, even in a relatively short time, each of these activities—on the international level, the national level, and the CMMI level—will be operating somewhat independently, and probably out of sync. You can expect that something will occur in one element that will affect the other, and they will adapt. And what you eventually would like to do—and what the intent of the A-Spec is—is to make sure that international standards are considered. What we don’t want is for the stake, once driven into the ground, to be fixed in place simply because we placed it there. It should move to the logical place that industry consensus will take it. It’s more important to recognize how to remain competitive, how to make sure you accommodate the things in our organizations that drive how we implement and develop. This is especially important because most of our organizations are international.

MZ: Hal brought up a very important point. Most of the things   we’re talking about are evolving. 15504 is not an international standard; it’s   a technical report that has not yet gotten to the status of an international   standard. The systems-level life-cycle model, 15288, is still at the working   draft level. That document hasn’t been circulated outside of the working group   yet for review and comment. So the only document we have to look at in terms of   a full standard in the international arena is 12207. And there’s inconsistency   between the international documents. So the challenge is not only to make sure   we’re aligned with the international standards, but also to resolve the   conflicts that exist within 12207 and 15504, for example. It’s not a very easy   thing to deal with.

Input from the user community

BP: What efforts are you making to keep the user community   informed of the continuing evolution of the work? How are you gathering   stakeholder review comments and adjustments?

PB: What we’ve done is gone out in a number of forums and explained, at the time, what the project was all about and given our status. We’ve put a number of those briefings and a set of frequently asked questions (FAQs) on the SEI Web site to inform the community about what is going on with the CMMI effort. We’ve established a stakeholder review group, which is a group of folks who are going to review as we go through the development effort. Members of the stakeholder review group represent and spread the word to their very large organizations. So we’re anticipating a lot of input, and that’s another way of informing the community of what’s going on with the project.

BP: How about commercial industry? Any special efforts going   on to involve commercial industry?

PB: The commercial industry raised this concern in Chicago a   while back, and we took names of those who were really interested. Our lead   industry person, Bob Rassa, then went through some effort to reach out to these   folks and invite them to participate, and we did get some participation in the   stakeholder review group.

HW: Bob Rassa also asked the commercial arm of the Electronic   Industries Alliance (EIA) to go forward and see if there was any interest in   commercial industry to participate. And I think the initial reaction was that   those organizations that were heavily involved and felt that it was in their   best interest have chosen to do so. In fact, there have been several offers for   individuals to participate on the product development teams.

JW: I think that one of the things we did to ensure that we   did reach out to the commercial community was take a hard look at the   distribution channels we were using to disseminate information on CMMI.   Specifically, we talked with the SEI about the distribution lists they had used   in the past for their various correspondence groups and advisory boards, and we   included those people in the distribution for CMMI information. In particular,   we included them in the invitation to participate in the CMMI effort by   providing product development team members. We realized that we needed to   expand our communication channels and our distribution list. Hopefully, with   the additional outreach, everyone who has been involved with capability models   in the past is now aware of the CMMI Project. And, we would like to think that   everyone is tuned in to the CMMI page on the SEI Web site, so they can keep up   to date in the future on what’s happening with the project.

Role of the Steering Group

BP: What role does the Steering Group play in guiding this   effort?

PB: The Steering Group has an important role in this   development effort. For example, we decided to take a systems engineering   approach in developing the CMMI product suite. As a result, we have written the   functional/performance requirements in the form of a formal specification (A-Spec).   The Steering Group is also responsible for configuration control of the A-Spec   and the top-level design products as they evolve. We have a role between the   sponsor and the users, and we’re representing the users in tracking progress   and approving products, so we’re providing an insight/oversight kind of   steering function to the product development teams. We’re resolving issues that   come up, and right now, we’re in the midst of planning transition—how we are   going to sustain, maintain, and support these product-suite-based products once   they’re developed. And we’re making efforts to disseminate information about   these developments as they occur.

MZ: From the very beginning, this was defined as a   collaborative effort between government, industry, and the SEI. The Steering   Group exists as a forum for having that collaborative effort to make sure that   the voices of the services, DoD, and industry are heard and help drive and   direct this effort.

HW: If you look at the way that development organizations are   set up, you find that most of those organizations have separate elements that   are concerned with each of the disciplines, and these separate elements have   perspectives that are self sustaining. The organizations that acquire and   implement systems or utilize systems within the DoD and within industry have a   lot of the same characteristics. So it’s pretty obvious when you bring a   complex set of environments and put them all together that there has to be some   coordinating and even controlling element that keeps the industry and   government needs and constraints in perspective. You could create tremendous   impacts if you just went willy-nilly. As Mike said, the Steering Group provides   a forum and a mechanism for not only bringing that discussion together, but   also bringing a consensus on what is really necessary to meet the needs of all   the constituents. We haven’t really had a means to do that in the past, and   part of the reason there were so many CMMs was that there wasn’t a means to   consider the needs of all the constituents in the past. So if we had not   brought the Steering Group into existence, the result would be far different.

PB: We have both industry and government representation on   the Steering Group, along with systems engineering and software engineering   perspectives. So in this sense, we have brought together these backgrounds and   representations, which I feel is very important. And from all of the members of   both industry and government, we reach into a number of standards bodies and   associations within the industry that are oriented toward particular   activities, and if we weren’t able to draw on this diversity, we wouldn’t be   able to bring together a consensus model.

Effect of   CMMI on assessments

BP: In what ways might the CMMI be beneficial in terms of how   it will affect assessments?

HW: Industry and government organizations have performed   assessments, either internally or externally, to benchmark where they are.   Generally, you want to go out and get someone other than yourself to let you   know if your self-assessment is valid. Our management certainly does—they don’t   believe that an organization claiming a particular level should be the one   making that assessment. For a third-party assessment, the issue is what   mechanism do you use? And I think that the way the CMMI is going is to bring   together a unified assessment approach.

JW: That’s an excellent point. I think the key is that the   mechanism and underlying model an organization uses for internal   self-assessments be consistent with the mechanism and model used for external   benchmarking.

HW: Today it would be impossible for an organization to bring   together software engineering, systems engineering, and IPD in a single   assessment approach on their own. They would have no way of arbitrating the   differences in the models to gain any result that industry could validate. What   the CMMI is going to provide is a single, unified, consensus approach. And   that’s valuable. That’s a major cost savings. The difficulty in bringing together   the different CMMs in an organization ahead of time is that you’re always forced   to go back and look to see if you can still pass any one of these divergent CMMs   or Software Capability Evaluations (SCEs) from a different   perspective. The CMMI will eliminate that. That’s one of the problems of being   an early adopter—having to make sure that you’re not jeopardizing your   evaluation in any independent SCE against one CMM versus another. Because the   characteristics are slightly different—in this case, they’re somewhat   divergent—one element gets more credence in one SCE than in another. The   training of the individual, and which SCE is being used for which CMM, can   greatly affect how a good basic practice is rated. The CMMI will eliminate a lot   of that. This is one of the greatest advantages of a unified model.

JW: If the same model and the same appraisal method are used   in both internal and external appraisals, then you can achieve comparable   results. If you use different appraisal methods or different models, then   comparison of results may be difficult. As long as everyone is aligned from a   model and method perspective, then the outcomes of internal and external   appraisals can synergistically support an organization’s process improvement   program.

HW: One of the inherent and underlying concerns about legacy   has been in the rating area. Many people have expressed the concern of “How do   we know if we’ll keep the rating we have?” If you’re only dealing with one of   the CMMs now, particularly software, you’ll have a rating based on a current   model. If you continue to operate within a staged model, while there will be an   improvement within the content of the software CMM product of the CMMI, there   would be some stability in that process, but there would be the same impact on   your rating as if you were moving to the next release of the model. Companies that   are already moving to bring together software and systems engineering realize   that the costs involved in trying to accommodate both models today are   enormous. And that’s really the issue. Most companies that are doing complex   operations realize that they are doing systems engineering as well as software   engineering, and they need to address both in order to be mature; so the   movement toward a common assessment model is critical. Today you can’t have a   combined rating for software and systems engineering maturity. The CMMI is the   only thing that’s going to give you that.

Maintenance   of CMMI products

BP: What are the plans for maintenance of the CMMI products?   What guarantee is there that this will not be a one-time product?

HW: It’s an issue that is being addressed by the Steering   Group. We recognize that it’s a Steering Group responsibility to define a   process for maintenance and improvement of the CMMI. And that will be one of   the things coming out of this in the future.

About the   moderator

Bill   Pollak is a senior   writer/editor, member of the technical staff, and team leader of the Technical   Communication team at the SEI. He is the editor and co-author of A Practitioner's Handbook for Real-Time     Analysis: Guide to Rate Monotonic Analysis for Real-Time Systems (Kluwer   Academic Publishers,  1993) and has   written articles for the Journal of the     Association for Computing Machinery (ACM) Special Interest Group for Computer     Documentation (SIGDOC), CROSSTALK,   and IEEE Computer.

About the   panelists

Philip S.   Babel is Technical Advisor for Embedded Computer Systems Software, Aeronautical   Systems Center (ASC), Air Force Material Command (AFMC), Wright-Patterson Air   Force Base, Ohio. His responsibilities include technical leadership for the   application of computer systems and software technology to aeronautical systems   acquisition and development. He defines and establishes policies, processes,   practices, and methods for the engineering of embedded computer systems   software. He has a BS in electrical engineering from the University of Detroit   and an MS in computer and information systems from the Ohio State University.

Joan Weszka has over 25 years of experience in software and   systems engineering, and program management of computer systems. At IBM, she   held management and technical positions in systems development of large-scale,   real-time commercial and government systems, and was Process Program Manager   at IBM Federal Systems Company Headquarters. She is currently Manager of   Process and Program Performance at the Lockheed Martin Enterprise Information   Systems' Software & Systems Resource Center, a service organization and   source of expert resources to Lockheed Martin companies in areas of   consultation, process improvement, training, and technology transition. Weszka   is Chairperson of the Enterprise Process Improvement Collaboration Steering   Group. She previously served as Chairperson of the SEI's SW-CMM Advisory Board   and was a member of the SEI's Software Acquisition CMM Steering Group, the SCE   Advisory Board, and the CMM-Based Appraisal Advisory Board.

Hal Wilson   has been designing and implementing computer information and computer-assisted   communications systems for over 30 years. He holds the position of vice   president and general manager, 'Net Solutions, the focal point for Internet/Intranet-related   customer support activities within Litton PRC. Since joining PRC in 1984,   Wilson has directed the design and implementation of two major systems   integration programs and has directed the design, competitive selection, and   development of two large system programs.    He also created and led the Systems and Process Engineering organization   that developed process and engineering policy and procedures for systems and   software engineering within Litton PRC. Wilson is currently the chairman of the   Systems Engineering Committee (G47) of the Electronic Industries Alliance,   which is responsible for the creation of two new systems engineering standards,   ANSI/EIA 632, Processes for Engineering a System, and EIA IS 731, Systems   Engineering Capability Model. He also serves as the vice chair of the National   Defense Industrial Association Systems Engineering Committee, chartered by the   Systems Engineering Directorate of the Office of the Under Secretary of Defense   for Acquisition and Technology. Wilson holds a Bachelor of Science degree in   physics from St. John's University in New York. He also is a graduate of the   Western Electric Graduate Engineering Education program. Wilson was awarded a   Federal 100 award by Federal Computer Week magazine in 1993.  He co-holds a patent for an Alarm Scanning   Mechanism designed for the Washington Metro Communications Control System.

Michael G.   Zsak, Jr. is a Systems Engineer for the Office of the Secretary of Defense   (Acquisition & Technology). He is responsible for providing technical   support to the Deputy Director, Systems Engineering and the Director, Test,   Systems Engineering, and Evaluation. In addition to his normal duties, he   serves as the DoD advisor to the International Symposium on Product Quality   & Integrity (RAMS), the DoD representative to the U.S. Technical Advisory   Group for ISO/IEC JTC 1/SC7, Chairman of the Reliability Analysis Center   Steering Committee, and Vice Chair of the Society of Automotive Engineers   International Division on Reliability, Maintainability, Logistics, and   Supportability. He is also a guest lecturer at the University of Maryland and   the Defense Systems Management College.

Please note that current and future CMMI research, training, and information has been transitioned to the CMMI Institute, a wholly-owned subsidiary of Carnegie Mellon University.
