The programmer entered his program through punched cards or paper tape and controlled its operation
through console switches. Debugging was performed by single-stepping through
the program, watching the console lights, and changing individual bits
directly. Once the program was operational, trained operators loaded programs
one at a time and mounted cards or tapes to supply the data. The programs
were run in a batch stream, one after the other. The concepts of an operating
system and an interactive user interface were non-existent.
Stage II: 1965-1985
In time, symbolic assemblers were developed to convert mnemonic codes and
labels to machine code, and later high-level languages were developed to
convert mathematical expressions into the language of the computer. As
computers became more sophisticated and popular, a profession that once
consisted entirely of scientists and engineers began attracting eager but
less computer-knowledgeable people. A buffer was needed between this new
breed of programmer and the machine. Also, to improve the efficiency of
resource utilization, it became desirable to handle more than one job at a
time. The process of assigning each job a specific amount of time, during
which it had total control of all the computer system resources, was no
longer effective as the systems became larger. Thus was born the executive,
or master program, to schedule jobs and to mediate among jobs contending for
resources. Next, the operating system evolved, together with libraries of
control and communication instruction sets, utility programs and development
tools.
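The translation step described above can be illustrated with a toy sketch. The two-pass structure (resolve labels to addresses, then translate mnemonics to opcodes) reflects how early symbolic assemblers worked, but the mnemonics, opcode values, and word format here are invented for illustration and do not correspond to any real machine:

```python
# Toy two-pass symbolic assembler. Mnemonics, opcodes, and the
# instruction format are hypothetical, chosen only to illustrate
# the idea of translating symbols into machine words.
OPCODES = {"LOAD": 0x1, "ADD": 0x2, "STORE": 0x3, "JMP": 0x4}

def assemble(lines):
    # Pass 1: record the address of each label.
    labels, addr = {}, 0
    for line in lines:
        if line.endswith(":"):
            labels[line[:-1]] = addr
        else:
            addr += 1
    # Pass 2: emit one machine word per instruction,
    # opcode in the high nibble, operand in the low byte.
    code = []
    for line in lines:
        if line.endswith(":"):
            continue
        mnemonic, operand = line.split()
        value = labels[operand] if operand in labels else int(operand)
        code.append((OPCODES[mnemonic] << 8) | value)
    return code

program = [
    "start:",
    "LOAD 10",
    "ADD 11",
    "STORE 12",
    "JMP start",
]
print([hex(word) for word in assemble(program)])
# → ['0x10a', '0x20b', '0x30c', '0x400']
```

Note that the label `start` in the final instruction is replaced by its numeric address (0), exactly the bookkeeping the programmer previously had to do by hand.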
The application programmer no longer had to worry about the details of
input/output or the scheduling of resources. He did not, however, gain
software portability, as the utilities and operating system features were (and still
are) largely vendor-specific. A degree of "upward compatibility" was
provided to enable a given application to operate on the newer, more powerful
computers introduced by each vendor. Proprietary operating systems are still
the order of the day. Although UNIX continues to gain popularity, especially
in academic circles, there is still no single standard. Thus, users are
encouraged to purchase ever more powerful machines without having to rewrite
their software, provided they stay with a particular hardware manufacturer.
Stage III: 1985-
In the past, most organizations using computers found it relatively easy to
make a central decision on which manufacturer's equipment to buy. The choice
of manufacturers may not have been easy, but the need to make the decision
centrally was clear. Computers were big and expensive, required extensive
internal support and ran a limited, well-defined set of applications. Now,
computers are smaller, less expensive, more powerful, more flexible and more
functional.
Large mainframes will continue to be needed as network controllers, central
database managers and superproblem solvers. However, there is some
displacement taking place by networks of personal computers. It is no longer
uncommon to find a network of hundreds of PCs served by a single large disk
for storage and a few central printers for output.
Today the industry is obsessed with the ideas of "user friendliness" and
"industry compatibility". It is becoming increasingly important that
software products be easy to learn without the need for instruction beyond
that contained in the manuals supplied with the software. Moreover, it is
more important than ever for software to run on as many computers as