Software Quarterly

Ease on Down the Road to Client/Server:
The Open Blueprint

By Diane S. Baron

Diane S. Baron is program manager, Open Distributed Strategy, in the Systems Software Structure directorate of IBM's Corporate Operations Staff. She wrote the white paper Introduction to the Open Blueprint: A Guide to Distributed Computing. Diane joined IBM in 1968 as an applications programmer and has served in a variety of field and headquarters staff and management positions, all concerning software. She holds a bachelor's degree in mathematics from New York University, New York.

Client/server. If you haven't heard or used that term recently, you've either been too busy running the family buggy-whip business or have just returned to earth from a sojourn on another planet ... or perhaps you're just not involved, even remotely, with information processing.

In the computer industry press, however, client/server isn't just mentioned regularly, it's often touted as the answer to many business problems.

Want to increase productivity? Try client/server. Need to reduce costs? Slash away with client/server. Determined to speed your products to market? Zip down the client/server fast lane.

But despite all the industry hype, client/server is NOT a panacea. So just what is it? What are the key benefits? And how might you get there?

Knowledge Is Power

It sometimes seems there are almost as many definitions of the term client/server as there are vendors selling information systems (IS) and organizations implementing client/server solutions. But the common thread in nearly all implementations is the movement of information and processing power away from the host-centric "glass house" into the network and onto users' desktops.

Basically, any client/server system consists of a number of computers, operating system platforms, application software, and data--all tied together by network software--to bring up-to-date business information directly to the user's desk. No matter where a client/server installation is located, the user appears to sit at the center of its powerful information and computing services. As a user you have, in effect, more than a computer on your desk; you have the capabilities of the entire network. And the network appears to you as if it were a single system!
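The exchange described above--a desktop client asking a networked server for up-to-date business information--can be sketched in a few lines. This is an illustrative sketch only, written in modern Python; the QUOTE request, the service, and the reply format are invented for the example and are not part of any product discussed in this article.

```python
# Minimal client/server sketch: the "server" holds the information,
# the "client" (the user's desktop) simply asks for it over the network.
import socket
import threading

def run_server(host="127.0.0.1", port=0):
    """Start a tiny one-shot information server; returns its port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))          # port=0 lets the OS pick a free port
    srv.listen(1)
    actual_port = srv.getsockname()[1]

    def serve():
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024).decode()
            # The server does the work; the client only sees the answer.
            if request == "QUOTE":
                conn.sendall(b"IBM: 73 1/8")
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return actual_port

def fetch_quote(port):
    """The desktop side: ask the network for current information."""
    with socket.create_connection(("127.0.0.1", port)) as cli:
        cli.sendall(b"QUOTE")
        return cli.recv(1024).decode()

port = run_server()
print(fetch_quote(port))
```

To the user at the keyboard, only `fetch_quote` is visible--where the answer actually came from is the network's business, which is the sense in which the network "appears as a single system."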

Another key benefit is that client/server implementations can speed the information where it's needed, when it's needed, and in the form it's needed--a key to success in today's globally competitive business environment.

In short, if knowledge is power, then client/server technology can be defined as almost instant knowledge and instant power to help you achieve your business goals.

How We Got Here

Client/server systems grew out of the efforts of empowered end users frustrated with the delays in gaining access to information technology (IT) in their companies. They grew out of end users' impatience with the unresponsiveness of IS departments that controlled, scheduled, and maintained the large systems in the "glass house." Even when end users gained access, they often concluded that the productivity gains weren't worth the investment.

So, in the 1980s, when computer prices sank while performance soared, end users began to develop their own solutions. They cobbled together hardware and operating systems and a large variety of software applications and business data. Sometimes these end users spent hour after hour manually inputting data from one program to another. For a long time, they did whatever had to be done. But it soon became clear that the products frequently wouldn't work with each other or with the data that had been accumulated. They needed help.

And so the client/server trailblazers are joining with the IT professionals experienced in IS and networks, this time determined to work together to address their major challenge: understanding and integrating the smorgasbord of tools and data on end users' desks with the expensive--but reliable and industrial-strength--assets in the glass house. Their second challenge: harnessing new and emerging technologies to make their businesses more competitive.

The newly refreshed partnership of end users and their IT departments is forging ahead, each side respecting the other's needs and skills. The end users need the IT department to make their tools and data work together in a reliable network, while allowing the users the flexibility to add new tools and data as business needs demand.

Architecture: A Foundation To Build On

Almost from the beginning it was apparent that the only way they could address the challenges was by developing an IT architecture. Architecture, according to Webster's New World Compact School and Office Dictionary, is "the science or profession of designing and constructing buildings," or "a style of design and construction."

Such a definition also applies to software: one designs and constructs software within the framework of an architecture. Without a plan, software development ends up like the Winchester Mystery House in San Jose, Calif., where stairs lead nowhere and rooms have no doors. (A psychic once told the owner, widow of the Winchester Arms Co. founder, that if she ever stopped building her house, she would die.)

An interesting variation of the definition of an architecture is offered by N. Dean Meyer, president of NDMA, a consulting firm in Ridgefield, Conn. "Think of IT architecture in terms of planning a city rather than building a house," Meyer says. "Architecture provides building codes that limit near-term design options for the sake of the community, but these codes do not tell individuals what kinds of buildings they need.

"Like building codes," he continues, "an IT architecture should consist of a set of standards, guidelines, and statements of direction that permit step-by-step, business-driven implementation without sacrificing integration."1

Other proponents of standards-based IT architectures are Don Tapscott and Art Caston, co-authors of the book Paradigm Shift: The New Promise of Information Technology. They state: "An enterprise architecture is required to realize the shift to network-centric computing. The architecture defines the essential components (software, information, and technology) required, and inevitably leads to the adoption of industry standards."2

Many organizations have long realized the value of an IT architecture, especially one that is standards-based. "We recently decided to migrate toward open systems," states Antonio Gualtieri, research officer of the Kemper National Insurance Companies. "As a result, we decided to develop a unified, consistent Kemper System Architecture that is standards- rather than product-based."

In the past, architectures were the first step in a long, involved process that took many years to plan, develop, and implement. In today's competitive environment, that's unacceptable; architecture development must be tempered by the need to be flexible and responsive to customers' demands.


"An architecture, such as

IBM's Open Blueprint, provides

a decision-making framework,"

explains Pat Howard, a principal

at IBM Consulting Group's Chicago

office.


The Open Blueprint

IBM developed its Open Blueprint as a guide for its own transition to developing products and solutions for the open, distributed or client/server environment. IBM reviewed the Open Blueprint with some of its customers and various consultants. In addition to valuable input, IBM received an overwhelmingly positive response to its direction for open, client/server computing. In fact, it was at the urging of these reviewers that IBM increased its communications about the Open Blueprint in April 1994.

The Open Blueprint defines a set of distributed functions or services required by applications in the open, distributed environment. By incorporating de jure and de facto standards, the Open Blueprint enables the interoperability and integration of products and solutions from different suppliers. The Open Blueprint serves as a set of "building codes" for the open, distributed client/server environment (the "community").

Like any architecture, the Open Blueprint can be viewed on three levels: as a world view, as a set of technologies, and as the basis for specific products.

An architecture reveals an organization's Weltanschauung, or perspective--its "world view" of its products and strategies. For example, the Open Blueprint views the world from a transactional perspective: It embraces the mission-critical, industrial-strength principles of that world, including security, reliability, and availability.

Once the perspective is known, and the functions--or the blocks in the diagram--are specified, the second level involves the nature of the technologies for each function or building block in the architecture. The choices are key, because they will indicate the degree of openness and interoperability inherent in the architecture. The technologies included in the Open Blueprint embrace open standards: for example, the Open Software Foundation's Distributed Computing Environment (DCE) and the Object Management Group's Common Object Request Broker Architecture (CORBA).

Finally, there are products that meet the organization's needs--products that exploit and use the architecture. Right now, there are many IBM and non-IBM products that implement the Open Blueprint, and more will come. Because they were all designed to meet the same architectural standards, products from a wide range of suppliers can be mixed and matched.
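The mix-and-match claim rests on a simple programming idea: applications depend on a standard interface, never on one supplier's product. A minimal sketch in Python follows; the DirectoryService interface and the two supplier products are invented for illustration (the Open Blueprint's actual service definitions are far richer than this).

```python
# Programming to a shared interface: the architectural "building code."
from abc import ABC, abstractmethod

class DirectoryService(ABC):
    """The standard interface any conforming product must honor."""
    @abstractmethod
    def lookup(self, name: str) -> str: ...

class SupplierAProduct(DirectoryService):
    """One vendor's implementation of the standard service."""
    def lookup(self, name: str) -> str:
        return f"A:{name}@host1"

class SupplierBProduct(DirectoryService):
    """A competing vendor's implementation of the same service."""
    def lookup(self, name: str) -> str:
        return f"B:{name}@host2"

def find_printer(directory: DirectoryService) -> str:
    # Application code depends only on the standard interface,
    # so either supplier's product can be plugged in unchanged.
    return directory.lookup("printer")

print(find_printer(SupplierAProduct()))
print(find_printer(SupplierBProduct()))
```

Because `find_printer` never mentions a supplier, swapping one product for another requires no change to the application--which is precisely what "designed to meet the same architectural standards" buys.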

That's the true value of an architecture like the Open Blueprint: It describes an environment in which products can work together, and it assists other companies to create their own architectures.

You Can Get There From Here

The transition to client/server can be eased by an architecture. A solid architectural foundation can provide, for example, a common language for the many different terms and expressions used in client/server technology--which helps clarify and unify the efforts of everyone involved in the transition.

"An architecture, such as IBM's Open Blueprint, provides a decision-making framework," explains Pat Howard, a principal at IBM Consulting Group's Chicago office. "It identifies all the 'moving parts' involved in distributed, client/ server systems, and it illuminates the decision points for executives." In fact, Howard and his team have developed the "Open Blueprint Consultant," a tool that they use as part of their process to assist clients in developing architectures.

One organization helped by Howard and his "Consultant" is Resort Condominiums International (RCI). RCI is the largest travel/leisure business in the world, with principal operations in the exchange of time-share properties. Responsive and reliable computing operations are fundamental to its competitive position in the business.

Dayne Dickerson, systems manager with RCI, understands the complexity and importance of designing a new architecture for today's environment. "Our Global Systems Management Team evaluated alternative methods for setting directions on our architecture," says Dickerson. "We adopted the Open Blueprint as a framework for guiding our teams, on a global basis, into the future."

In another example, a large telecommunications company engaged the IBM Consulting Group in a mobility project to help improve the effectiveness of its sales force. The Open Blueprint and architecture design process assisted the project teams in identifying improvements required in the company's existing architecture to support distributed, client/server systems for a mobile work force.

The Open Blueprint gave executives a better view of the commitment required to move toward this new paradigm. The executives involved were able to better plan the transition in sync with other planned business changes.

"The Blueprint is an important educational tool for business-oriented people," says consultant Howard. "With the crush of technology bearing down on people, they need an effective filter to help them understand, acquire, and develop applications. By understanding the technologies and standards involved in product implementation, it becomes easier to know which products will work with each other before they are installed or developed!"

Old Hands, New Potential

Skeptics may question IBM's ability to provide solutions in a client/server environment. Others, however, express the opinion that IBM, and other vendors with a long history, have a greater chance of success than do some of the newer vendors with less enterprise experience.

The "old hands at enterprise computing have the technical expertise, the implementation infrastructure, and the customer-supplier feedback loops necessary to realize the potential of open, client/server computing," said Nina Lytton recently in her Open Systems Today column3. Lytton noted with surprise that "the establishment is transforming itself faster than the new kids on the block are maturing."

As client/server consultant Judith Hurwitz summed it up in a recent PC Week article: "Client/server is perfect for IBM. It's very complex and ugly, and one thing IBM knows how to handle better than anyone else is complex and ugly."4 

References:

1. Meyer, N. Dean, "Rearchitecting the Architecture Concept," Beyond Computing, July/August 1994.

2. Tapscott, Don, and Caston, Art, Paradigm Shift: The New Promise of Information Technology, McGraw-Hill, 1993.

3. Lytton, Nina, "Surprise: 'Establishment' Vendors Are Earning High Marks From Client/Server Users," Open Systems Today, May 9, 1994.

4. Houston, Patrick, "True, Not Blue," PC Week, Aug. 22, 1994.
