Distributed Processing — Past, Present and Future
by Lloyd Borrett
Technical Cornucopia, March 1991
Remember distributed processing back in the late 1970s?
Intelligence at the workstation was primitive back then. All
of the benefits of distributed processing were based on
multiple host computers, each with its own set of dumb
terminals.
One benefit of distributed processing in the '70s was
lower processing costs, based on the fact that a number of
networked minicomputers could be acquired and maintained at
a lower cost than running a centralised mainframe. Another
benefit was reduced downtime. If one host went down, then
processing could be distributed among other host computers
in the network.
At the time, more efficient operations were promised,
since it was asserted that processing could be performed at
the location of the business transactions. Even a more
stable staff of MIS professionals was promised, since people
in remote areas could become an integral part of the
information systems effort in organisations.
Now, if these notions sound primitive, or to be more
kind, a little rough around the edges, one must remember
that distributed processing over a decade ago never achieved
the success that was heralded in the media at the time.
What was missing? Intelligent workstations: machines
that facilitate the use of fourth-generation languages,
graphical interfaces, and CASE tools, allowing the power of
computing to reside within the control of users.
The first attempt at distributed processing can be summed
up as follows: You can take an autocracy and replicate it.
What you end up with is more than one autocracy.
The concept of user "freedom to compute" has made
distributed processing legitimate, and intelligent
workstations are required for this concept to become a
reality.
Just what is an intelligent workstation? In the
scientific and engineering community, intelligent
workstations are often high-powered 32-bit standalone units,
capable of processing mathematical computations at great
speed, while at the same time displaying high-resolution
graphics. They usually run a multitasking operating system,
such as UNIX or one of its many derivatives, and generally
allow for interactive screen control. Workstations also
support very large amounts of RAM and disk storage.
Increasingly, high-powered DOS-based personal computers are
also being used as traditional workstations.
This is the classic definition of a workstation. These
characteristics, and the computing activities they support,
are independent of traditional host-based computing.
The classic scenario for computing in a host environment
is the use of dumb terminals that may communicate with one
or more host computers. Generally, there is no intelligence
in the terminal that can be accessed and manipulated by the
user. These terminals usually execute one program at a time;
the programs themselves run under a multitasking operating
system on a shared host processor using shared memory. A
negative aspect of this
environment is that if one program loads the processor or
memory, the performance of all programs running on the
system can suffer.
An accepted definition of an intelligent workstation is
this: A device that provides processing capability in the
local input/output device. An intelligent workstation
addresses one or more CPUs, thereby taking on the
characteristics of a communications platform. One of the
CPUs is dedicated to the workstation itself, whilst the
other CPUs are on remote hosts or servers.
Intelligent workstations can be described as single- or
multiuser multitasking devices that support active and
passive intelligence attributes.
An intelligent workstation implies a multitasking
environment. Multitasking activities may be supported by the
local CPU and/or remote processors.
In a limited sense, a host terminal emulation card (e.g. a
3270 emulation card) installed in a PC allows simultaneous
access to multiple host sessions. Sessions not currently in
view by the user may be running procedures on the host while
the user is tending to other host sessions or running
applications on the PC. Here, the multitasking environment
is supported by a remote CPU, while the local CPU
(microprocessor) is managing the emulation software as well
as other programs and data in the local device. This local
processing may itself be managed in a multitasking
environment.
An example would be to run both a local database and a
spreadsheet at the same time under Windows, on a PC-based
workstation. Another partition would support the resident
terminal emulation software.
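A rough sketch of that division of labour, in modern Python
(purely illustrative; the task names and timings are
invented, and threads here stand in for the partitions
described above):

    import threading
    import time

    def host_session():
        # Stands in for the background 3270 session: the remote
        # host does the real work, while the local CPU merely
        # services the emulation.
        for step in range(3):
            time.sleep(0.5)
            print("host session: remote procedure step", step, "done")

    background = threading.Thread(target=host_session)
    background.start()

    # Foreground: the local applications (the database and
    # spreadsheet of the example) carry on at the same time.
    for row in range(3):
        time.sleep(0.4)
        print("local apps: recalculated row", row)

    background.join()

The point is simply that the background work proceeds whether
or not the user is watching it, exactly as with an unviewed
host session.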
An intelligent workstation is generally a single-user
device, allowing one user to operate the input/output
(keyboard/display) functions at any given time. However,
the local CPU, memory, and storage devices may be shared by
other workstations, as in the case of a non-dedicated LAN
server.
Active and Passive Intelligence
Workstation intelligence can be described as either active
or passive. Active intelligence includes CPU cycles that
are under direct control of the workstation user. Personal
computing activities are the prime example of active
intelligence.
Passive intelligence allows CPU cycles in the workstation
to be used by other CPUs in the network. A workstation can
be a communications server, a print server, a file server,
or a database server. All can take part in network
activities independent of active intelligence applications,
and often without the knowledge of the local user. A
variation is the dedicated database server. When coupled
with a set of active intelligent devices, one ends up with a
platform that supports the client-server model.
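To make the client-server split concrete, here is a minimal
sketch, assuming a TCP connection and an invented one-record
lookup protocol; the record keys and values are hypothetical,
and Python is used purely for illustration:

    import socket
    import threading

    # Hypothetical records; a real server would front a database.
    RECORDS = {"CUST001": "Acme Pty Ltd", "CUST002": "Smith & Co"}

    # Passive intelligence: a server answers lookups for other CPUs.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))         # pick any free local port
    srv.listen(1)
    port = srv.getsockname()[1]

    def serve_one():
        conn, _ = srv.accept()
        key = conn.recv(64).decode().strip()
        conn.sendall(RECORDS.get(key, "NOT FOUND").encode())
        conn.close()

    server = threading.Thread(target=serve_one)
    server.start()

    # Active intelligence: the workstation issues the query and
    # keeps presentation of the result under the user's control.
    cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    cli.connect(("127.0.0.1", port))
    cli.sendall(b"CUST001")
    print(cli.recv(256).decode())      # prints: Acme Pty Ltd
    cli.close()
    server.join()
    srv.close()

The passive half could run on a dedicated machine or as a
background task on someone else's workstation; the active
half stays under the user's direct control.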
The functionality of intelligent workstations should be
designed to be consistent with the requirements of the
applications they'll support. External factors, such as
vendor strategies, also shape the functionality of
intelligent workstations and significantly influence their
design.
Workstation products must conform to connectivity
methods, which aren't always consistent between vendors. One
of the reasons that the PC has become so popular is the fact
that it has an open bus, and can be made to comply with
different vendors' connectivity policies through the use of
add-in board level products. Various LAN cards can be
installed, allowing the PC to attach to Ethernet, Token
Ring, or other types of LANs. Host emulation cards can be
installed allowing the PC to connect to various host
computers. The lowly RS-232 port, available in almost all
PCs as a standard connectivity device, can allow a PC to
connect to a large number of different computers.
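How simple that serial link is to drive can be sketched with
the third-party pyserial package (a modern stand-in, not 1991
software; the device name, line settings, and the WHO command
are all placeholder assumptions):

    import serial  # third-party package: pyserial

    # Line settings must match whatever the remote computer
    # expects; 9600 baud, 8 data bits, no parity, 1 stop bit
    # is a common default.
    port = serial.Serial("/dev/ttyS0", baudrate=9600, bytesize=8,
                         parity="N", stopbits=1, timeout=2)
    port.write(b"WHO\r\n")   # hypothetical command to the remote host
    reply = port.readline()  # one CR/LF-terminated line, or b"" on timeout
    print(reply.decode(errors="replace"))
    port.close()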
Workstations can be diskless. A diskless workstation is a
desktop computer with no local disk drives. It reads
required data from the hard disk of a shared server. Since a
diskless workstation has its own CPU, it is by definition an
intelligent workstation. If the CPU is located in a unit
that includes support for the monitor, keyboard, RAM, video
interface, and I/O interfaces, then it is designed to be
connected to a LAN. If the CPU and other supporting
components are located remotely from the monitor and
keyboard, then the diskless workstation is supported by a
clustered CPU architecture.
An intelligent workstation may or may not be a diskless
workstation, although, as noted above, a diskless
workstation is by definition an intelligent one.
Distributed processing is directly related to intelligent
workstations in the modern era of network computing.
This relationship has fostered models such as the
client-server database model. This model, in turn, has
fostered the introduction of application development tools
designed to optimise the distributed processing power of the
network.
Intelligent workstations and distributed processing are
the cornerstones of network computing. In fact, network
computing is about to replace host-based computing as the de
facto standard of computing models. If the cornerstones are
in place, we must examine the bricks and mortar required to
build upon them. These are the server platforms, operating
systems and application development tools.
However, choices must be made. A server platform must be
considered. Should it be the more traditional minicomputer
or the new genre of fault-tolerant file servers? Which
network operating system software should be used: OS/2 LAN
Server/Manager, LAN Manager/X, Novell NetWare, AppleTalk, or
one of the others?
Finally, which application software tools should be
utilised? Some organisations have been doing nothing but
waiting for the promised client-server software. Others have
been distributing data processing across networks for more
than five years.
Today, as we enter the '90s, the important issue is how
we can realise the benefits of current and future
technology. The solutions lie in the concepts of intelligent
workstations, distributed processing, and network computing.