NEWS AT SEI
This article was originally published in News at SEI on: March 1, 1999
What role do requirements play in effort estimation for net-centric applications? Are they more important in the net-centric domain? Less important? The same? This article provides an example of erratic effort estimation that can be traced, in part, to inadequate requirements engineering.
It is widely accepted that requirements play an essential role in the software engineering life cycle. One of the thorniest aspects of program development is getting customers and developers to understand what the other means. Requirements engineering is an area of great interest to industry, yet it receives relatively little emphasis in most academic institutions, especially with regard to the software engineering curriculum.
The context for the example is an undergraduate course on software engineering. For most students, this course is their first introduction to the basic tenets of software engineering, which include requirements engineering as it influences subsequent effort estimation.
The students were asked to estimate the effort required to implement several proposed changes to a Web-site analyzer program. The Web-site analyzer was used throughout the course as a framework for group projects. This type of net-centric application was relatively new to them, which added to the challenge of producing accurate estimates.
The course project was a mix of programming, testing, documenting, management, and other activities that are typical of a real-world software engineering effort. The project was to create a Web-site analyzer toolkit called “WSA.” The WSA application was created by enhancing an existing Web-log analyzer program, which was written in C and contains about 5,000 lines of code. It uses a graphics library package for creating GIF files that form the heart of the graphical reports viewable with a browser.
The course project was meant to closely resemble a realistic software engineering exercise. The students were told to imagine that they had been hired at a new software startup company. The company had just bought out a rival company because the rival had a great Web-site analyzer program, and the market was booming for that type of application. As a new employee, the student was put on a team that was given the responsibility of enhancing the newly acquired Web-site analyzer. Students were to imagine that their boss had said that WSA needed many new features to make it more competitive in the marketplace.
Some of the boss’s requested functionality could be found in other software packages, but the team’s budget was nonexistent for capital purchases. Moreover, there were fewer than 10 weeks to ship the product (coincidentally, the length of the academic quarter), and going through the formal procurement process would take longer than 10 weeks. The team members were left with just one choice: they had to learn how the current Web-site analyzer program worked and enhance it. The students were warned to expect that some of the desired features would change over the 10-week period, and that their boss was fickle. In other words, they were told to expect requirements creep.
Before the students were asked to provide effort estimates for proposed changes, they had been taught about the different types of software maintenance. However, they had been exposed to very little in the way of formal cost-estimation techniques. For example, they had not yet learned about function points as a means of measuring programmer productivity. They were asked to use the more accessible lines-of-code measure, even with its many limitations.
This was by design. The goal was for the students to use their own insights and methods for performing their first effort estimation. They would then learn more disciplined techniques, such as algorithmic cost modeling, and in subsequent exercises would compare the effort estimates arrived at using both methods.
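To illustrate the kind of disciplined technique the students would learn later, here is a sketch of one well-known algorithmic cost model, Basic COCOMO in organic mode. The coefficients are Boehm's published organic-mode values; the article does not say which specific model the course taught, so this is purely illustrative.

```python
# Basic COCOMO, organic mode: effort (person-months) = a * KLOC^b.
# Coefficients a=2.4, b=1.05 are Boehm's published organic-mode values.

def cocomo_effort(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    """Estimated effort in person-months for a project of `kloc` thousand lines."""
    return a * (kloc ** b)

# The existing 5,000-line analyzer is 5 KLOC:
print(f"{cocomo_effort(5):.1f} person-months")
```

A model like this trades the students' gut feel for a repeatable formula, which is precisely the contrast the exercise was designed to set up.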
As a team, the students were asked to complete a change request form (CRF) for five different proposed changes. Part of the CRF was focused on estimating the effort required to carry out the change. Effort in this context included the number of lines of commented source code required to be changed, the number of hours required to implement the change, the number of people required, and any other information that the students deemed to be relevant. They were also asked to characterize each change as corrective, adaptive, or perfective maintenance.
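A minimal sketch of how the information captured on such a change request form might be structured as a record. The field names and the example values are illustrative assumptions, not the actual form used in the course.

```python
from dataclasses import dataclass
from enum import Enum

class MaintenanceType(Enum):
    # The three maintenance categories the students used to classify changes.
    CORRECTIVE = "corrective"
    ADAPTIVE = "adaptive"
    PERFECTIVE = "perfective"

@dataclass
class ChangeRequest:
    description: str
    maintenance_type: MaintenanceType
    est_loc_changed: int   # lines of commented source code to be changed
    est_hours: float       # hours required to implement the change
    est_people: int        # number of people required
    notes: str = ""        # any other information deemed relevant

# Hypothetical example based on one request mentioned in the article:
crf = ChangeRequest(
    description="Port WSA from Linux to Windows NT",
    maintenance_type=MaintenanceType.ADAPTIVE,
    est_loc_changed=300,
    est_hours=40,
    est_people=2,
)
```

Classifying a port to a new operating system as adaptive maintenance follows the standard definition: the software is being adapted to a changed environment rather than corrected or enhanced.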
To put the effort-estimation exercise in the context of their hypothetical company, the students were told to think of their group as a consulting firm bidding on a lucrative contract. On the one hand, they would want to provide an accurate bid that was low enough for their firm to be awarded the contract. On the other hand, they did not want to come in with too low a bid, because the contract would be awarded for a fixed price. That meant the deadline for finishing all change requests was fixed: their firm would absorb the cost of any work required past the deadline, an outcome they obviously wanted to avoid and one to which they could readily relate.
The students were repeatedly told that it was in their best interest to try to be as accurate as possible with their estimates. The next assignment focused on implementing some of the changes, and potentially others not listed as one of the five requests. However, they did not know at the time which changes would be implemented.
Some of the change requests were worded in an intentionally vague manner. For example, one request said simply, “Port WSA from Linux to Windows NT.” Others provided more detail. One request even related to something they had already implemented in the previous assignment. In that case, it was more of a post-change effort analysis than effort estimation. The change request that they had to implement for their next assignment was worded as follows:
Add a graphical display to WSA that represents the information in the Web logs. For example, display each HTML page accessed, the accessing site, and the referring site (if available) as a series of nodes and edges. Assume the graphical display package is a prebuilt component that you just need to integrate with WSA.
At this point, the students did not know anything about the graphical display package that they had to integrate into their version of WSA, except that it would be provided to them as a working component. They did understand the structure of Web logs, since it was the focus of the implementation effort on their previous assignment. The estimations from the different groups on this change request made for very interesting reading.
Perhaps not surprisingly, the seven student groups provided effort estimations that were all over the map. Most groups did not appear to understand what the requirements meant, and therefore their estimates were built on inaccurate assumptions from the start. The students tended to give the change a low priority relative to the other change requests. This was yet another instance of not being able to predict what the customer really wants!
Being typical engineering students with heavy class loads, they tended to perform their effort-estimation exercise the night before the assignment was due, leaving little time for careful thought and group discussion. However, in that respect, they were not all that different from professional software engineers, who are often asked to provide an on-the-spot effort estimate based on ill-defined requirements and unknown project parameters.
The effort estimation included the number of lines of commented source code required to be changed, the number of hours required to implement the change, and the number of people required. Groups summarized the change using words such as "easy" and "long-term process," with many adjectives in between. They estimated the number of lines of code that would have to be changed in WSA at anywhere from 10 to 300. As the "boss" of their company, I know that considerably more source code would need to be changed and added. However, underestimating the impact of a change request is very common, even for experienced programmers. They estimated that one, two, or three people would be required (a good thing, since the groups had no more than four people).
The most surprising variation was in the time estimates. The estimated time required to implement the change ranged from 2 hours to 24 hours to 120 hours to 8 days. Quite a wide range!
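To make that spread concrete, the reported figures can be normalized to hours. The conversion of "8 days" assumes an 8-hour workday, which is a guess, since the original estimates mixed units.

```python
# Groups' time estimates for the same change request, normalized to hours.
# The 8-day figure is converted assuming an 8-hour workday (an assumption;
# the groups did not state their units consistently).
estimates_hours = [2, 24, 120, 8 * 8]  # 2 h, 24 h, 120 h, 8 days

spread = max(estimates_hours) / min(estimates_hours)
print(spread)  # 60.0: a 60-fold gap between the lowest and highest estimates
```

A 60-fold gap between bids for the identical requirement is a vivid measure of how much the ambiguous wording cost the teams.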
Why were the students’ estimates so varied? Since they were juniors and seniors, they did have some experience with programming projects, but very little experience working in teams. Their knowledge of real-world programming projects was also quite limited.
The application domain of WSA definitely played a role, since this type of net-centric application area was unfamiliar to them. Although the format of the Web logs was explained to them, and they had already implemented a minor change to WSA in their previous assignment, the application domain was not what they were used to. Other courses focus on traditional areas, such as compilers and operating systems. Even the networking course focuses on network fundamentals, not necessarily net-centric applications. Consequently, effort estimation without any previous experience was handicapped from the start.
Although the application domain was new to the students, the essential problem remained. Unclear requirements lead to unsatisfactory effort estimates. In other words, requirements engineering for net-centric applications is just as important as in other application domains—perhaps even more so, given the novelty of the domain and the technology. This should not be a surprising conclusion, since requirements drive so much of any software project.
Were the requirements stated fairly? Perhaps not, but they were realistic. I deliberately did not state the requirements in a formal specification, in part because the students had not yet learned about such notations. The requirements were stated in typically ambiguous prose. Consequently, most students did not really understand what I meant, and this was reflected in their erratic effort estimates.
Scott Tilley is a Visiting Scientist with the Software Engineering Institute at Carnegie Mellon University, an Assistant Professor in the Department of Computer Science at the University of California, Riverside, and Principal of S.R. Tilley & Associates, an information technology consulting boutique. He can be reached at email@example.com.
The views expressed in this article are the author’s only and do not represent directly or imply any official position or view of the Software Engineering Institute or Carnegie Mellon University. This article is intended to stimulate further discussion about this topic.
The Software Engineering Institute (SEI) is a federally funded research and development center sponsored by the U.S. Department of Defense and operated by Carnegie Mellon University.