The headline and accompanying page-one story in Network World were newsworthy because they focused on a perception that exists in some quarters: The mainframe is an endangered species that's about to become extinct. Once the cornerstone of all computing, the mainframe has been bloodied by the growth of client/server computing, the rise of open systems, and the lack of needed application development tools.
But reports of the mainframe's imminent demise appear to be greatly exaggerated. After a sales decline that bottomed out in 1993, mainframe sales turned around smartly and are expected to grow at a 27 percent annual rate over the next six years, according to International Data Corp., a research organization in Framingham, Mass.
"A lot of this growth is occurring in the replacement market as companies realize that moving to client/server was more difficult, time-consuming, and costly than had first been thought," says Charlie Burns, research director at Gartner Group, a technology consulting organization in Stamford, Conn. "After they had gone through the effort and additional expense, they frequently found that functionality and robustness were a step down from what they had been used to in mainframe environments.
"The U.S. learned that from the school of hard knocks," Burns adds, "so we'll see the Europeans move more carefully to client/server as a result."
There are four key reasons for the anticipated solid growth of mainframes, which are characterized by their ability to process large amounts of data at high speed, as well as to store and manage large quantities of data.
First, the mainframe offers a stable and well-understood computing environment. Second, despite its current undeserved reputation as a relic from computing's early days, the mainframe still offers powerful computing capabilities, data storage, and other benefits unmatched by lesser machines. Third, the mainframe is becoming more competitive as new tools -- and even new applications such as multimedia -- are added to leverage its strengths and expand its flexibility. And last, more and more businesses are learning how to use the mainframe as a key player in client/server environments.
"There were advantages in consolidating information on a single platform," Filipowski adds. "Moreover, IBM started to offer appropriate pricing schemes. As a result, open-minded organizations began to understand that all platforms have permanent viability within corporate networks."
This shift, considered by many as welcome and necessary, came at a price. In adapting to a new computing paradigm, companies have had to reinvent many of the time-tested tools and techniques developed for mainframes over the decades. These tools handle security, system management, fault tolerance, data integrity, disaster recovery, and other requirements. In many cases, such tools for client/server systems haven't evolved to the same level of sophistication as their mainframe counterparts.
Another disparity is the level of knowledge and expertise between mainframe and client/server operations. The IT professionals dedicated to mainframe operations usually have extensive training. In the client/server world, by contrast, critical functions such as daily backup sometimes depend on end users who have other business priorities and may be barely able to format a disk.
Companies have also found that, within a distributed environment, ensuring the same levels of data integrity, access, and administration adds significantly to the cost of computing over the long term. So do outlays for backup, software distribution, and storage management. Although not all of these costs appear on a financial statement, they can still significantly dent corporate profitability. In fact, such hidden charges as training, support, and management account for more than 70 percent of client/server network costs over five years, according to a Gartner Group study.
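To see what the Gartner figure implies, consider a quick back-of-the-envelope split. The five-year total used below is purely hypothetical; only the 70 percent hidden-cost share comes from the study cited above.

```python
# Split a five-year client/server network cost into visible and hidden
# portions, using the Gartner finding that hidden charges (training,
# support, management) exceed 70 percent of the total.
# The $1,000,000 total is a hypothetical figure for illustration only.

def cost_split(five_year_total, hidden_share=0.70):
    """Return (visible, hidden) portions of a five-year cost."""
    hidden = five_year_total * hidden_share
    visible = five_year_total - hidden
    return visible, hidden

visible, hidden = cost_split(1_000_000)
print(f"Visible (hardware/software):    ${visible:,.0f}")
print(f"Hidden (training/support/mgmt): ${hidden:,.0f}")
```

On these assumptions, the line items a budget typically tracks cover less than a third of what the network actually costs.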
So, despite all the talk of "scrapping big iron," companies have found it impractical to unplug their mainframes -- because they still hold the bulk of corporate data. And mainframes run proven applications that would be difficult, expensive, and time-consuming to recreate in another environment.
A key challenge, therefore, is to find cost-effective ways to take advantage of the mainframe's traditional strengths while integrating and adapting the mainframe to new computing environments -- turning the mainframe into a key player in client/server computing. Another task is to explore new applications, such as multimedia computing, which can exploit the mainframe's matchless data processing, throughput, and data-storage capabilities.
The mainframe is also being used as an "information warehouse." For example, the Bank of Boston, a super-regional bank headquartered in Boston, Mass., will have a mainframe supporting an advanced decision-support system as part of a three-tiered client/server system. It will aggregate data from multiple legacy systems, each devoted to a separate area such as retail or finance. Bank executives will be able to seamlessly access "big picture" information from desktop PCs, drill down through the data for details, then make a decision -- confident that they have considered all the appropriate data. "Volume and storage capacity issues require the information to be centralized on our host and distributed to departmental servers," says Jack Sweeney, director of information management resources.
A major benefit, says Sweeney, is that "we now can take advantage of tried-and-true approaches for ensuring data integrity, security, systems management, backup recovery, and other important requirements.
"We're not justifying this in terms of money saved," Sweeney adds. "Instead, it's enabled us to look at information collected from disparate legacy systems in new ways. You may be able to show hard dollar savings from a limited implementation, but an enterprisewide solution is like putting phones on every desk -- the benefits are intuitive."
Products such as IBM's LANRES (LAN Resource Extension and Services) essentially allow companies to put anything on a partitioned mainframe disk that they could put on a native file server running OS/2 LAN Server or Novell's NetWare. Such an application provides key advantages for both companies and individuals.
For example, when companies use mainframe tools such as Fast Dump/Restore (FDR) and Data Facility Data Set Services (DFDSS), a mainframe can back up a 1.5-gigabyte (GB) LANRES disk in about 20 minutes, compared with about four hours using LAN-based backup tools. Disaster recovery, of course, is much simpler when an enterprise backs up its data on a host rather than on multiple servers.
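The backup times quoted above imply a substantial throughput gap. A rough calculation, using only the approximate figures from the text:

```python
# Approximate backup throughput implied by the article's figures:
# a 1.5 GB LANRES disk in ~20 minutes on the host, versus ~4 hours
# with LAN-based backup tools.

MB_PER_GB = 1024

def throughput_mb_per_s(size_gb, minutes):
    """Average throughput in MB/s for a backup of size_gb taking `minutes`."""
    return (size_gb * MB_PER_GB) / (minutes * 60)

host = throughput_mb_per_s(1.5, 20)       # mainframe dump/restore
lan = throughput_mb_per_s(1.5, 4 * 60)    # LAN-based backup
print(f"Host backup: {host:.2f} MB/s; LAN backup: {lan:.2f} MB/s")
print(f"Speedup: {host / lan:.0f}x")
```

On these numbers, the host-based tools move data roughly twelve times faster, which is before counting the administrative simplicity of a single backup point.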
Even though the data is centralized, clients can transparently access it as rapidly as if it resided on their local disk, regardless of which operating system they are running. This method helps reduce the cost of distributing data across networks, mitigates the still-unsolved problems of data synchronization, and overcomes the need to continually add disk storage space to multiple servers. Users are also able to access large files (one GB or more) on a mainframe much more easily than on a server.
LANRES also supports simultaneous file access. If users need additional storage, the traditional solution is the simplest: add another DASD unit.
Important administrative and printing capabilities are provided by LANRES/VM. St. Elizabeth Health Center depends on the software to distribute centralized data files to multiple servers, and to administer user IDs and resource access within attached LANs. Some users at St. Elizabeth's take advantage of LANRES's "any-to-any" printing capabilities to route local print jobs through the mainframe, reducing 5-hour printout jobs to less than 10 minutes.
To further integrate mainframes into a client/server environment, IBM is offering an Open Systems Adapter (OSA) that will permit users to directly attach up to 80 Token-Ring or Ethernet LANs to a System/390 or ES/9000 processor. Fiber Distributed Data Interface (FDDI) users can connect as many as 32 LANs. The Open Systems Adapter -- an integrated card that includes a channel adapter, control unit, and LAN adapters -- supports Systems Network Architecture (SNA)/Advanced Peer-to-Peer Networking (APPN), Transmission Control Protocol/Internet Protocol (TCP/IP), and Internetwork Packet Exchange (IPX) communications protocols.
"In many respects, this product is symbolic of the transformation of large-scale computing because it's small, easy to use, supports client/server applications, and, ultimately, drives down the total cost of computing for customers," says Nick Donofrio, IBM senior vice president and group executive. "When teamed with our other client/server software products, OSA delivers large system muscle to end users. Yet it's transparent. End users need not learn anything new to gain access to host services."
Long-distance education has gone through several phases, according to Jim Emal, professor and computing coordinator at the university. The first phase used the Internet to access and download text-based information. Next, other Internet software brought audio and some video into the classroom. However, such multimedia applications weren't real-time. Usually, someone had to download the information, store it on a local disk drive, and then replay it. "Instructors were wasting classroom time with song-and-dance routines during the downloading," says Emal.
But now, a program called Eduport -- already demonstrated before a Congressional committee -- is spurring the next advance in education. By using an IBM ES/9000 running over a Token-Ring network, with MVS and LFS/ESA working in conjunction with a channel-attached OS/2 LAN Server for video streaming, the university can show students such material as President Franklin Roosevelt's "Day of Infamy" speech; NASA space footage; physics, geophysics and other scientific phenomena; programs from the Kennedy Center for the Performing Arts; and much, much more. The university is now adding its own digitized teaching material to its archives. The mainframe is necessary for storage because an hour of even compressed video requires 750 MB to 800 MB.
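The storage figure above can be sanity-checked with a little arithmetic: 750 MB to 800 MB per hour of compressed video corresponds to an average bit rate in the range of MPEG-1-era compression.

```python
# Sanity check on the article's storage figure: convert MB-per-hour
# of compressed video into an average bit rate in megabits per second.

def bitrate_mbps(mb_per_hour):
    """Average bit rate (Mbit/s) implied by a MB-per-hour storage figure."""
    return mb_per_hour * 8 / 3600  # 8 bits per byte, 3600 seconds per hour

for mb in (750, 800):
    print(f"{mb} MB/hour -> {bitrate_mbps(mb):.2f} Mbit/s")
```

The result, roughly 1.7 to 1.8 Mbit/s, is consistent with full-screen, 30-frames-per-second compressed video of the period, and it shows why hours of archived material quickly demand mainframe-class storage.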
"The data, compressed to quickly move it over the network, shows up on the computer as 30 frames-per-second, full-screen, TV-quality video," says Emal. "Students who have grown up with TV think it's perfectly normal, but everyone else's eyes light up at the possibilities offered by moving the power of TV into a multimedia situation."
One possibility is long-distance learning. The university is linked by a three-mile, fiber-optic cable to Lincoln High School. Teachers at the high school can choose and play the appropriate material to illustrate their lectures. Eventually, students and teachers will be able to integrate current and historical text, audio, and video data to make their own multimedia presentations.
"This not only breathes new life into our mainframe investment, but, more importantly, it means learning doesn't have to stop where the university ends," says Emal.
Another university that is exploring the frontiers of multimedia is California Polytechnic State University (CalPoly) in San Luis Obispo, Calif. It will be delivering multimedia courseware on business and engineering applications over a high-speed network to a development lab and classrooms. An IBM ES/9000 running LFS powers the application.
"The system is extremely transparent to users," says Anna Seu, director of multimedia development. "To the professors, it's like a big disk in the sky; they don't have to worry about system management or other details. It will be an important step in achieving education on demand; people will be able to educate themselves at their own pace and during their own time."
Both CalPoly's Seu and the University of Nebraska's Emal point out that the biggest obstacle to widespread multimedia access is not generating the applications, but delivering them. Both universities plan to upgrade their network infrastructure to accommodate multimedia's high-bandwidth demands.
So, it appears that the rebound in mainframe sales does have legs. The rebound is driven by transforming the mainframe into a server in client/server environments, and by combining its traditional strengths, such as powerful number crunching, with innovative new applications like multimedia. And the trend will continue, as long as IT managers build their computer plans on what's best for their businesses, instead of what's reported in the popular press.
See also:
Going Big Time With 'Big Iron'