NEWS AT SEI
This article was originally published in News at SEI on: December 1, 1999
The past two years have seen interest in net-centric computing wax, wane, and then wax again. At the moment, there appears to be a resurgence of interest in the area, driven in part by the proliferation of non-traditional computing devices, by dramatic changes in networking capabilities, and by renewed interest in centralized system administration. In this last Net Effects column of 1999, I take a look at what may be in store for net-centric computing in 2000.

When this column debuted over 18 months ago, I remarked that it is important to distinguish between network computers (NCs) and net-centric computing (NCC). That distinction is even more important now than it was then, in part because of the limited adoption of NCs in the marketplace. But NCC itself hasn't gone away; it has simply changed to better reflect current trends in technology.

In the book Saving Big Blue: Leadership Lessons and Turnaround Tactics of IBM's Lou Gerstner by Robert Slater (New York, NY: McGraw-Hill, 1999), IBM's continuing interest in network computing is covered at length. For IBM, NCC and the growth of the Internet jointly contribute toward what the company sees as its future: e-business. The emerging e-business economy plays to IBM's traditional strengths in sophisticated computer technology, systems integration, and large storage systems. Although IBM's sales of NCs have not been as high as the company had hoped, its overall vision of a net-centric computing future in 2000 and beyond appears undiminished.

As an aside, Y2K purists know that 2001, not 2000, is the first year of the new millennium. Equating 2000 with the new millennium has become so common that it no longer seems important to point out the discrepancy. Besides, the confusion gives us pundits two opportunities to predict the future, in case we get it wrong the first time.
Will there come a time when information appliances attached to the Internet outnumber personal computers (PCs)? Many people, including proponents of NCC such as Oracle's Larry Ellison, seem to think so. The past year has seen a proliferation of non-traditional computing devices that are network aware. Witness the popularity of Palm Computing's Palm VII, which provides wireless Internet access (albeit in selected cities) for mobile professionals. If Sun Microsystems' Jini architecture for connecting information appliances to a network in a seamless and simple manner is a success, we may indeed see a glut of new non-traditional computing devices in 2000 and beyond.
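The core idea behind Jini-style plug-and-play networking can be illustrated with a toy sketch: a device joins the network, registers a proxy object with a lookup service, and any client can then find and use it by service type, with no manual configuration. Real Jini is built on Java RMI and uses leased registrations and multicast discovery; the class and method names below are invented for this sketch, not part of the actual Jini API.

```python
class LookupService:
    """Stand-in for the registry that devices find via multicast in real Jini."""
    def __init__(self):
        self._services = {}  # service type -> list of registered proxies

    def register(self, service_type: str, proxy):
        """A device 'joins' the network by uploading a proxy for its service."""
        self._services.setdefault(service_type, []).append(proxy)

    def lookup(self, service_type: str):
        """Clients discover services by type, not by address or driver."""
        return self._services.get(service_type, [])


class PrinterProxy:
    """The proxy a printer registers; clients invoke it directly."""
    def __init__(self, name: str):
        self.name = name

    def print_page(self, text: str) -> str:
        return f"{self.name} printed: {text}"


# A printer appears on the network and registers itself.
registry = LookupService()
registry.register("printer", PrinterProxy("hall-printer"))

# A newly arrived client finds and uses it with zero setup.
printer = registry.lookup("printer")[0]
print(printer.print_page("hello"))  # -> hall-printer printed: hello
```

The point of the design is that the client never installs a driver or knows the printer's location; everything it needs arrives in the proxy handed back by the lookup service.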
There seems to be a change in the perception of what a computing device actually should be. Currently most PCs are general-purpose machines designed to do many things, although some would argue that they don't do any one of those things very well. For example, my notebook computer comes with fax software—software that I rarely use. I have found it to be difficult to use and buggy, causing odd interactions with my email software. Consequently, I rely on a dedicated "old-fashioned" fax machine. It may not be quite as convenient, but at least it works all the time, and in a predictable manner.

For the average user, the complexity of today's PCs often outweighs their potential advantages. Indeed, it has reached the point where even the venerable Wall Street Journal is running articles on the difficulty of using a mainstream PC. Columnist Walter Mossberg predicts that 2000 will signal the beginning of a new era of computing, one in which information appliances will begin to dominate the computing landscape. Rather than being general-purpose machines, information appliances are special-purpose machines designed to do one thing, but to do it well. Upgradability by the consumer is limited, but the tradeoff is enhanced usability and stability.

Such information appliances are in fact already here, and already in fairly widespread use, but we don't always consider them "computers" in the traditional sense. Nor should we. Sega's new Dreamcast gaming system, for example, is an information appliance that combines special-purpose hardware with items typically seen only in a PC, such as a modem for dialup Internet connections. It is no accident that WebTV now has over one million users, or that AOL is fast approaching twenty million.
Consumers in general opt for electronics that work, even if they are more restrictive than the typical PC favored by geeks who don't mind tinkering with their machines daily to keep them running. Manufacturers of traditional PCs are reacting to the move toward simpler computers and information appliances in general. AOL is readying a TV-based interface to its popular Web service. Oracle is rumored to be working on a Linux-based NC. Even Compaq and Hewlett-Packard are reported to be developing a new "e-PC" that will replace some of the legacy interfaces found in today's PCs (such as the ISA bus and the parallel port) with more limited, but hopefully more reliable, capabilities. Manufacturers are not doing this for purely altruistic reasons, of course; they see the adoption of information appliances as a new source of revenue. Because information appliances cannot easily be upgraded, they must be upgraded (read: replaced) more quickly than traditional PCs.
The role of the network in NCC is obvious. One of the more inevitable developments has been the increase in the number of homes with more than one personal computer (or information appliance, as described above). This has quickly created a new market, replete with products that provide inexpensive and (relatively) easy-to-manage home networks. These networks leverage the home's existing infrastructure to provide connectivity throughout the house.

There are essentially three types of home networks: telephone line, power line, and wireless. Telephone-line networks require only a standard telephone jack; plugging a cord into the jack and connecting the other end to the computer creates a simple yet acceptable network (when used with the accompanying software). Power-line networking operates similarly, using small adapters that plug into standard wall outlets and then into computers. Although the speed provided by such home networks is not (yet) spectacular, it is sufficient for sharing printers among several PCs, for sharing Internet connections, and for occasional file transfers between devices.

Perhaps the most promising of these maturing technologies is wireless networking. If you've ever used the infrared port on your PC to communicate with another similarly enabled device (say, another PC or a printer), you already appreciate the convenience that wireless connectivity provides. With networks unrestricted by physical connections, computing devices such as NCs can be used in a location-independent manner. Indeed, wireless networks can allow information appliances to roam from one location to another, much as cellular phone calls are passed from cell to cell. This type of wireless access is already available to users of digital PCS (Personal Communications Services) phones from several vendors, enabling them to surf the Web from their telephones at any time and from anywhere.
Ubiquitous computing has been the focus of research efforts at Xerox PARC and other institutes for some time. Central to the success of ubiquitous computing is the availability of inexpensive, high-speed wireless networking. The emerging IEEE 802.11 standard, which directly addresses wireless networking, will surely contribute to the widespread adoption of ubiquitous computing.
One of the biggest advantages of personal computers is their "personalization" capability, which permits end users to add their own hardware or software and generally change system configurations without the support or knowledge of the organization's information technology (IT) group. However, this advantage is also viewed as one of the PC's greatest shortcomings. From a support point of view, decentralized computing comes at a very high cost. Many IT veterans yearn for a return to centralized administration, where applications can be managed and deployed in a more cost-effective and timely manner. Until recently, most users were reluctant to adopt this model of computing, since it often came at the cost of giving up the computing environment they were familiar with. This is no longer the case, and the move to centralized administration in a net-centric setting is only expected to increase as network bandwidth and powerful servers become more abundant and more economical. Lucent Technologies' recent announcement of an all-optical terabit router offers a glimpse of the future of broadband networking, a future in which bandwidth appears sufficient to support the most likely usage scenarios of centralized administration of interconnected information appliances.

Given the pervasiveness of the Microsoft Windows operating system, a solution for managing common office software is a prerequisite for the success of centralized administration. Fortunately, both Microsoft and other companies (notably Citrix Systems) have addressed the problem through extensions to NT 4.0 Server (called Terminal Server Edition, or TSE); similar capabilities are built into the forthcoming Windows 2000 product line. Citrix's WinFrame and MetaFrame software gives relatively low-power information appliances the ability to provide users with a complete Windows NT session, but without the NT operating system and the associated application software packages installed on the thin client. This is done by migrating most of the computing to a few centralized hosts and communicating between client and host using a simple communication-and-display protocol. WinFrame provides these capabilities for devices running Windows, while MetaFrame provides similar capabilities for non-Windows devices. Given the current popularity of the freely available Linux operating system, the allure of a MetaFrame-like solution is obvious.
Because Linux has far lower resource requirements (in terms of CPU speed, memory, and disk) than a typical Windows installation, existing hardware that would otherwise be obsolete can serve as thin clients accessing servers running Windows NT TSE. In effect, this gives users the same Windows experience they are accustomed to, but at a much lower cost and with much reduced administrative overhead. The complexity of the Windows 2000 product is expected to further encourage the adoption of Linux as a thin-client solution in many organizations.
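The division of labor in such a communication-and-display protocol can be sketched in a few lines: the thin client sends only input events, the server runs the application and holds all state, and the server replies with display updates for the client to render. This is a minimal illustration in the spirit of protocols like Citrix's ICA, not a description of any real wire format; all names and message shapes below are invented for the sketch.

```python
import json


class ApplicationServer:
    """Runs the 'application' centrally; clients hold no program state."""

    def __init__(self):
        self.text = ""  # the only copy of the document lives on the server

    def handle_event(self, event: str) -> str:
        """Apply one input event and return a display-update message."""
        msg = json.loads(event)
        if msg["type"] == "keypress":
            self.text += msg["key"]
        elif msg["type"] == "backspace":
            self.text = self.text[:-1]
        # The update says only what to draw, not how it was computed.
        return json.dumps({"type": "draw_text", "content": self.text})


class ThinClient:
    """Encodes input and renders updates; contains no application logic."""

    def __init__(self, server: ApplicationServer):
        self.server = server
        self.screen = ""

    def press(self, key: str):
        update = self.server.handle_event(
            json.dumps({"type": "keypress", "key": key}))
        self.screen = json.loads(update)["content"]


client = ThinClient(ApplicationServer())
for ch in "NCC":
    client.press(ch)
print(client.screen)  # -> NCC
```

The client here could run on nearly any low-power device, which is exactly why recycled hardware running Linux makes an attractive thin client: all it must do is forward keystrokes and paint what the server sends back.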
Scott Tilley is a visiting scientist with the Software Engineering Institute at Carnegie Mellon University, an assistant professor in the Department of Computer Science at the University of California, Riverside, and principal of S.R. Tilley & Associates, an information-technology consulting boutique. He can be reached at firstname.lastname@example.org. The views expressed in this article are the author's alone and do not directly represent or imply any official position or view of the Software Engineering Institute or Carnegie Mellon University. This article is intended to stimulate further discussion about this topic. The Software Engineering Institute (SEI) is a federally funded research and development center sponsored by the U.S. Department of Defense and operated by Carnegie Mellon University.