NEWS AT SEI
This article was originally published in News at SEI on: March 1, 2000
Editor's Note: This column is adapted from testimony by Richard Pethia, director of the CERT Centers at the SEI, before the Senate Judiciary Subcommittee on Technology, Terrorism, and Government Information, March 28, 2000.
This column describes a number of issues that affect security on the Internet and outlines some of the steps I believe are needed to effectively manage the increasing risk of damage from cyber attacks. My perspective comes from the work we do at the CERT Centers, which include the CERT Coordination Center (CERT/CC) and the CERT Analysis Center (CERT/AC) [For a description of these centers, see About the Author]. In 1999, the staff of the CERT Centers responded to more than 8,000 incidents. Since it was established in 1988, the CERT/CC staff has handled well over 24,000 incidents and analyzed more than 1,500 computer vulnerabilities.
An Ever-Changing Problem
The recently publicized rash of attacks on Internet e-commerce sites reminds us once again of the fragility of many sites on the Internet and of our ongoing need to improve our ability to assure the integrity, confidentiality, and availability of our data and systems operations. While it is important to react to crisis situations when they occur, it is just as important to recognize that cyber defense is a long-term problem. The Internet and other forms of communication systems will continue to grow and interconnect. More and more people and organizations will conduct business and become otherwise dependent on these networks. More and more of these organizations and individuals will lack the detailed technical knowledge and skill that is required to effectively protect systems today. More and more attackers will look for ways to take advantage of the assets of others or to cause disruption and damage for personal or political gain. The network and computer technology will evolve and the attack technology will evolve along with it. Many information assurance solutions that work today will not work tomorrow.
Managing the risks that come from this expanded use and dependence on information technology requires an evolving strategy that stays abreast of changes in technology, changes in the ways we use the technology, and changes in the way people attack us through our systems and networks. The strategy must also recognize that effective risk management in any network like the Internet is unlikely to come from any central authority, but can only be accomplished through the right decisions and actions being made at the end points: the organizations and individuals that build and use our interconnected information infrastructures. Consider this:
- We have distributed the development of the technology. Today's networks are made up of thousands of products from hundreds of vendors.
- We have distributed the management of the technology. Management of information technology in today's organizations is most likely distributed, and the trend toward increased collaborations and mergers will make that more likely in the future.
- We have distributed the use of the technology. The average computer user today has little in-depth technical skill and is properly focused on "getting the job done" rather than learning the nuances and idiosyncrasies of the technology.
- We must distribute the solution to the information assurance problem as well. The technology producers, organization and systems managers, and systems users are the only ones who can implement effective risk management programs.
In the long run, effective cyber defense will require
- expanded research programs that lead to fundamental advances in computer security
- new information technology products with security mechanisms that are better matched to the knowledge, skills, and abilities of today's system managers, administrators, and users
- a larger number of technical specialists who have the skills needed to secure large, complex systems
- improved abilities to investigate and prosecute cyber criminals
- increased and ongoing awareness and understanding of cyber-security issues, vulnerabilities, and threats by all stakeholders in cyberspace
I will focus on removing barriers to the last of these: building an ongoing awareness and understanding of cyber-security issues.
Building Awareness and Understanding
Information technology is evolving at an ever-increasing rate with thousands of new software products entering the market each month. Increasingly, cyber security depends not just on the security characteristics and vulnerabilities of basic networking and operating system software, but also on the characteristics and vulnerabilities of software used to implement large, distributed applications (e.g., the World Wide Web). In addition, attack technology is now being developed in an open source environment where a community of interest is evolving this technology at a rapid pace. Several significant new forms of attack have appeared in just the past year (for example, the Melissa virus and Love Bug worm, which exploit the widespread use of electronic mail to spread at network speeds, and distributed denial-of-service tools that harness the power of thousands of vulnerable systems to launch devastating attacks on major Internet sites). It is likely that attack technology will continue to evolve in this "public" forum and that the evolution will accelerate to match the pace of change in information technology. Once developed, this attack technology can be picked up and used by actors with significant resources to hone and advance the technology, making it a much more serious threat to national security and the effective operation of government and business.
The overall picture of vulnerability and threat is complex, but it must be understood to develop effective cyber-defense strategies. Building this understanding requires collection and analysis of information on
- the security characteristics and vulnerabilities of information technology
- evolving attack technology
- cyber attacks
- cyber attackers
- the effectiveness of defensive practices and technologies
Using this understanding to develop effective defense strategies requires
- providing technology producers and the rapidly growing community of system operators with information from the analysis activities
- convincing this community to act on this information to reduce serious vulnerabilities and implement effective security controls
The tasks described above are currently being conducted by a loose-knit network of cooperating organizations. Each organization focuses on its area of expertise and the needs of its customers or constituents. Each organization shares as much information as it can with others. Many varied organizations participate in this network, including federal, state, and local investigative organizations, security incident response teams, government labs and federally funded research and development centers, security researchers in universities and industry, technology producing organizations, security product and service vendors, system and network operators, and government agencies chartered to conduct security improvement efforts. The work of these organizations would be facilitated if the roadblocks described in the next section were removed.
Roadblock: The Federal Debate Over Who's in Charge
The ongoing federal debate over who's in charge and whether or not the grand analysis center in the sky should be established is only detracting from the real work that is going on in qualified organizations. The Department of Defense must conduct data collection and analysis activities to operate and protect its networks. The FBI and NIPC (National Infrastructure Protection Center) must conduct data collection and analysis activities to carry out their missions of criminal investigation and infrastructure defense. The GSA (General Services Administration) and NIST (National Institute of Standards and Technology) must conduct data collection and analysis activities to carry out their missions of dealing with incidents and improving security in the civilian agencies. University and industry researchers are among the best resources available to understand the evolution of information technology, attack technology, and the interplay between them. Other organizations must conduct data collection and analysis activities to meet the needs of their customers and sponsors. Attempts to replace these activities with one central data collection and analysis activity are misguided and seemingly miss the following realities:
- If you build it, they won't come. Sharing of sensitive security information is dependent on the trust relationship established between the information sender and receiver. These relationships are fragile, often take years to establish, and cannot be replaced by changing mandates or reassigning responsibilities.
- It is not possible to build an overall, comprehensive picture of activity on the networks. In spite of the strong desire to "see it all" so we can "understand it all," it is simply not possible to build a comprehensive view of activity on the networks. They are too big; they are growing too quickly; they lack the needed sensors; and they are literally being reconfigured and re-engineered on the fly. The challenge is not to pull all the data together, but to ensure that the right data is at the right place at the right time to allow local decision-makers to take effective action.
- All the talent needed to perform the analysis cannot be collected in one place. The detailed analysis work that must be done requires a combination of talents and skills and the best people that we can find. Organizations are not willing to give up their best people to other organizations, and the people are not willing to move. It is much more effective and efficient to move the data than to move the people. What is needed is an information-sharing network where data can be shared among organizations and analysis conducted at different sites for different reasons. The challenge is not to pull all data together, but to push it out to meet the varying needs of the various audiences.
- Centralization is not more efficient. Any central organization, unfamiliar with the operational needs of any particular network operator, technology developer, or researcher, will only be able to perform generic analysis tasks that yield high-level results. The detailed work must still be done to develop the detailed strategies and plans needed to build an effective cyber defense. Centralization is more likely to increase costs rather than decrease them. What is needed is increased collaboration among all players able to contribute to and draw from a growing body of data and knowledge.
Roadblock: Inadequate Resources for the Work That Must Be Done
The federal government has studied and debated the cyber-security problem for years. The newest flurry of activity began with the Presidential Commission on Critical Infrastructure Protection in 1996 and has led to the establishment of the National Infrastructure Protection Center and the creation of the National Plan for Information System Protection. However, many of the views being discussed and debated today are echoes of earlier studies and conclusions. The 1991 study Computers at Risk1, which was funded by the Defense Advanced Research Projects Agency (DARPA), reached many of the same conclusions and recommended many of the same actions as the more recent studies. What has been missing is action and funding to take the steps needed to deal with this problem effectively. In spite of the nearly exponential growth of security incidents and security vulnerabilities over the last ten years, there has been little increase in budget to deal with these problems. Analysis centers must be resourced, information-sharing infrastructures must be established, and transition activities that move needed information and security solutions to their eventual users must be staffed. We will make progress when we invest in making progress.
Roadblock: Lack of Protection for Sensitive and Company-Proprietary Data
Information sharing between the private sector and the federal government is impeded by the lack of protection from the Freedom of Information Act and other forms of disclosure. Organizations that are the victims of cyber attacks can contribute greatly to the understanding of cyber defense by providing detailed information regarding the security incidents they have suffered: losses, methods of attack, configurations of systems that were successfully attacked, processes used by the organization that were vulnerable, etc. Much of this information is extremely sensitive and could be used to damage the corporation if it became public. In addition, corporations often have more to lose from damaged reputations than from the attacks themselves. These organizations will not share security incident or loss information unless they have a high degree of confidence that this information will be protected from public disclosure. The federal government must take steps to protect the sensitive data as a precursor to information sharing. Only then will it be possible to form the trust relationships and begin data-sharing activities.
Roadblock: Lack of Information on Threats
Any effective risk management strategy requires an understanding of three things:
- the value of the assets that must be protected and the consequences of loss of confidentiality or operational capability
- the vulnerabilities that could be exploited to bring about the losses
- the threats that exist: the actors who would exploit the vulnerabilities, and some indication of the probability that they would do so
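The three factors above can be combined into a simple prioritization exercise. The sketch below is a hypothetical illustration only; the systems, weights, and multiplicative scoring rule are invented for this example and are not part of any methodology described in the article. It shows the basic idea that a manager acts on the product of consequence, vulnerability, and threat, not on vulnerability alone.

```python
# Minimal sketch of three-factor risk prioritization.
# All names and numbers below are hypothetical assumptions,
# chosen only to illustrate the article's three-part breakdown.

def risk_score(asset_value, vulnerability_severity, threat_likelihood):
    """Combine the three factors the article lists: the value of the
    asset (consequence of loss), the severity of its exploitable
    vulnerabilities, and the likelihood that an actor will exploit them."""
    return asset_value * vulnerability_severity * threat_likelihood

# Hypothetical systems: (name, asset value, vulnerability, threat likelihood)
systems = [
    ("public web server",   8, 0.9, 0.7),
    ("internal file share", 5, 0.6, 0.2),
    ("payroll database",   10, 0.4, 0.3),
]

# Rank so that limited defensive resources go to the highest risk first.
ranked = sorted(systems, key=lambda s: risk_score(*s[1:]), reverse=True)
for name, value, vuln, threat in ranked:
    print(f"{name}: {risk_score(value, vuln, threat):.2f}")
```

Note that without the third factor (threat), the ranking collapses to value times vulnerability, which is exactly the incomplete picture the following paragraphs argue most organizations are working from today.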
Today we are awash in information regarding vulnerabilities in our technologies and our networked systems. Computer security incident response teams warn their constituents of vulnerabilities that are being exploited. Internet news groups routinely publish descriptions of vulnerabilities and methods to exploit them. Technology vendors alert their customers to vulnerabilities in their products and provide software upgrades to correct them. Conferences and training courses abound that focus on corrections to vulnerabilities.
At the same time, system and network operators are becoming increasingly aware of the value of their information assets and of their growing dependence on the Internet and other communications infrastructures. The current emphasis on electronic commerce and use of the Internet as a powerful marketing and sales tool is sure to accelerate this understanding. With all this focus on value and vulnerability, why are so many organizations taking so little action to improve their cyber-security? Because they have little hard data that convinces them that there are real threats to their operations. We all know that we are vulnerable to many things. Our cars are vulnerable to certain forms of attack. Our homes and places of business are vulnerable to certain forms of attack. As individuals, we are vulnerable to certain forms of attack. Yet we are not all driven to distraction by this sea of vulnerability. We first focus not on vulnerability but on threat. We act to correct vulnerabilities when we believe there is a significant probability that someone will take advantage of them. The same is true in cyberspace. Operational managers know that they cannot afford to eliminate every vulnerability in their operations. They need data to help them understand which ones are most critical, and which ones are likely to be exploited.
Our law enforcement and intelligence organizations must find ways to release threat data to the operational managers of information infrastructures to motivate these managers to take action and to help them understand how to set their priorities. In the absence of a smoking gun, it is unlikely that many organizations will have the motivation to invest in improved cyber defense.
1 National Research Council. Computers at Risk: Safe Computing in the Information Age. Washington, DC: National Academy Press, 1991.
About the Author
Richard D. Pethia is the director of the CERT Centers at the Software Engineering Institute (SEI), Carnegie Mellon University. The CERT Centers include the CERT Coordination Center and the CERT Analysis Center. The SEI has operated the CERT/CC since 1988, providing a central response and coordination facility for global information security incident response and countermeasures for threats and vulnerabilities. The recently established CERT Analysis Center addresses the threat posed by rapidly evolving, technologically advanced forms of cyber attacks. Working with sponsors and associates, this center collects and analyzes information assurance data to develop detection and mitigation strategies that provide high-leverage solutions to information assurance problems, including countermeasures for new vulnerabilities and emerging threats. The CERT Analysis Center builds upon the work of the CERT Coordination Center.
Prior to becoming director of the CERT Centers, Pethia managed the SEI Networked Systems Survivability Program, focusing on improving both the practices and understanding of security and survivability issues relating to critical information infrastructures. Before coming to the SEI, Pethia was director of engineering at Decision Data Computer Co., a computer system manufacturer in Philadelphia. There he was responsible for engineering functions and resource management in support of new product development. Pethia also was manager of operating systems development for Modular Computer Corp. in Fort Lauderdale, FL. While there, he led development efforts focused on real-time operating systems, networks, and other system software in the application areas of industrial automation, process control, data acquisition, and telecommunications.