An Encyclopedia of the History of Technology

1943; it used 1500 valves (called vacuum tubes in the US). While not a stored-
program machine, Colossus may have been the world’s first true automatic
digital computer.
Credit for this invention has become a matter of dispute, much of the
development work being hidden because of the Second World War. As early
as 1931, Konrad Zuse in Germany had constructed a mechanical calculator
using binary components. This ‘Z1’ was followed by a Z2 in 1939 and, in 1941, by
the Z3, which used electromechanical relays.
In the late 1930s, Dr John V.Atanasoff sought a device to perform
mathematical calculations for twenty degree candidates at Iowa State College.
Rejecting mechanical solutions, he outlined ideas for memory and logic
elements and built the first electronic digital computer with the assistance of
Clifford Berry. It was called the ABC—Atanasoff-Berry Computer. This very early
vacuum tube system seems to have had no influence on subsequent
developments, but in 1974 a Federal judge ruled Dr Atanasoff the true inventor
of the concepts required for a working digital computer.
In 1937, George R.Stibitz at the Bell Telephone Laboratories conceived a
binary calculator using telephone relays; working with S.B.Williams and other
colleagues, he completed the Complex Computer in 1939, and it performed
useful work at Bell Labs until 1949. Between 1939 and 1944, Howard
H.Aiken of Harvard University, with engineers of the International Business
Machines Corporation, built the electromechanical Automatic Sequence
Controlled Calculator, or Mark I.
Between 1942 and 1946, the valve-based Electronic Numerical Integrator
and Computer (ENIAC) was built at the Moore School of the University of
Pennsylvania to compute ballistic tables for the US Army Ordnance Corps.
This was followed by the Electronic Discrete Variable Automatic Computer
(EDVAC), built between 1947 and 1950. Also starting about 1947, J.Presper
Eckert and John Mauchly of the Moore School conceived the Universal
Automatic Computer (UNIVAC), which was built for the US Census
Bureau and put into operation in 1951; eventually 48 UNIVAC I computers
were made. Their Eckert-Mauchly Corporation was absorbed into
Remington Rand, and Sperry Rand’s Univac division became the main rival to IBM
for several decades.
John von Neumann, a Hungarian immigrant at the Institute for Advanced
Study in Princeton, became a consultant to the atomic bomb project at Los
Alamos. In 1944, Herman H.Goldstine, working for the Ordnance Corps, told
von Neumann about the Moore School’s work on ENIAC, and he began an
active collaboration with them. A report written by von Neumann with
Goldstine and Arthur W.Burks of the University of Michigan first described
the design of an electronic stored-program digital computer.
The Whirlwind computer was constructed between 1944 and 1945 at
MIT’s Lincoln Laboratory, under the direction of Jay W.Forrester. It was built
for the US Army Air Force to simulate flight, and was the first system to use
data communications and interactive graphics.
In 1946, Alan Turing joined the UK’s National Physical Laboratory (NPL),
and immediately set about designing a computer. He presented NPL’s
executive committee with what some claim as the first detailed design for a
stored-program electronic digital computer (others make this claim for von Neumann).
Turing’s proposal gave a cost estimate for construction of £11,200, and
envisaged a plan for national computer development. Approval was given and
funding obtained for the Automatic Computing Engine (ACE), but by 1948
not a single component had been assembled, and Turing resigned. Only a Pilot
ACE was built, but it had a complex, 32-bit instruction format; 800 vacuum
tubes; and mercury-delay-line storage of 128 32-bit words. A few years earlier,
M.V.Wilkes and his colleagues at Cambridge built the Electronic Delay
Storage Automatic Calculator (EDSAC), using ultrasonic storage; and F.C.Williams and
his group at the University of Manchester built their Mark 1 based on
Williams’s cathode-ray tube (CRT) storage.
In 1948, John Bardeen, Walter Brattain and William Shockley of the Bell
Telephone Laboratories published the results of their work on solid-state
electronic devices, which began just after the war in 1945. This paper
described the invention of the point-contact transistor, which has almost
completely displaced the valve as the active component in electronic circuits.
For this work, the three shared a Nobel Prize in 1956. Also in 1948, Claude
Shannon of Bell Labs developed his mathematical theory of communication
(also called information theory), the foundation of our understanding of digital
transmission. In 1949, Forrester invented magnetic-core storage for the
Whirlwind, which became the standard internal memory for large-scale digital
computers until semiconductor memory was introduced in the mid-1960s.
Computer operation and programming
During the 1950s, medium to large-scale systems, known as mainframes, were
introduced for business use by IBM, Univac and others; they cost from
hundreds of thousands to millions of dollars; required special housing and air-
conditioning; demanded a sizeable staff; and called for a high degree of
security. From the beginning they were operated as a closed shop; that is, users
never got near the actual hardware. Punched cards or paper tape were the
primary input media; users entered their programs and data by keypunching.
A deck of cards or roll of paper tape had to be physically delivered to the
mainframe location. Hours, sometimes days, later—if there were many big jobs
in the queue—users would get their output, usually in the form of fan-folded
paper printouts. Frequently, finding some slight mistake, they would have to
resubmit the whole job again.
The earliest computers had to be programmed in machine language, which
meant that both instructions and data had to be input in binary form, and the
address of each location in memory had to be specified (absolute addressing).
For humans this procedure is extremely tedious and prone to error. A big
improvement was assembly language, which assigned alphabetical codes to
each instruction, allowed the use of decimal numbers, and permitted relative
addressing. However, the relationship between the lines of code and the
machine’s memory was still one-to-one. Many users wanted to be able to
specify their instructions to the computer in a language closer to algebra, and
in 1954, John Backus of IBM published the first version of FORTRAN
(Formula Translator), the first ‘higher-level’ computer language. In 1959,
under the urging of the US Department of Defense, a standardized higher-level
language for business use was developed, known as COBOL (Common
Business-Oriented Language). By the mid-1960s, 90 per cent of all scientific
and engineering programs were being written in FORTRAN, and most
business data-processing in COBOL.
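The difference between these levels of programming can be illustrated with a small sketch, written here in Python for convenience; the three-instruction machine, its opcode values and its variable addresses are invented purely for illustration and do not correspond to any historical computer.

```python
# A minimal sketch of what an assembler does: mnemonics and symbolic names
# are translated into the numeric opcodes and absolute addresses that the
# machine actually executes.  The instruction set, opcode values and
# variable addresses below are hypothetical.

OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3}          # mnemonic -> opcode
SYMBOLS = {"PRICE": 100, "TAX": 101, "TOTAL": 102}   # name -> memory address

def assemble(source):
    """Turn 'LOAD PRICE' style lines into (opcode, address) machine words."""
    machine_code = []
    for line in source:
        mnemonic, operand = line.split()
        machine_code.append((OPCODES[mnemonic], SYMBOLS[operand]))
    return machine_code

program = ["LOAD PRICE", "ADD TAX", "STORE TOTAL"]
print(assemble(program))   # -> [(1, 100), (2, 101), (3, 102)]
```

In a higher-level language such as FORTRAN, the same computation would be written as a single algebraic statement, with the compiler generating the instructions and choosing the addresses itself.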
Even though personnel responsible for running mainframes believed that
they must maintain closed-shop service for security reasons, an important
advance in user convenience was made in the 1960s—remote job entry (RJE).
RJE was still batch-processing and users had to wait several hours for their
results. However, instead of having to carry decks of cards and pick up
printouts at the mainframe, they could go to a nearby RJE station where a
card reader and high-speed printer were connected by dedicated line to the
central computer.
Most mainframe computers sold to businesses were not intended to be used
interactively by anyone except their operators, who sat at a large console
composed of a video display unit (VDU) and keyboard. Operators monitored
the job-stream and, when the VDU alerted them, mounted the magnetic tapes
on which user programs and data were stored. Computing was still a scarce
resource, and users were charged for each task required to fulfil their job
requests. The introduction of interactive access was largely user-driven; users
put pressure on data processing departments to provide faster and easier access
than was possible under batch processing. Also, relatively low-cost terminals (a
few thousand dollars per station) had to become available to make this
economically feasible. Special applications requiring real-time processing
pioneered this development in the late 1950s and early 1960s—particularly air
defence (the US Air Force’s SAGE system) and airline reservations systems.
In 1959, Christopher Strachey in the UK proposed the idea of time-sharing
mainframe computers to make better use of their capabilities for small jobs;
each such job would be processed for only a second or two, but users, working
interactively, would not experience noticeable delays because their interaction
time is at least 1000 times slower than the computer’s. In 1961, MIT
demonstrated a model time-sharing system, and by the mid-1960s access to
large, central computers by means of low-cost terminals connected to the
public switched telephone service had become a commercial service. Terminals
were usually Teletypes which, although designed for message communications,
were mass-produced and inexpensive; so, rather than seeing results on a CRT
screen, users would get their output printed on a roll of paper.
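A minimal sketch of the time-sharing idea described above, assuming a simple round-robin scheduler with a one-second slice; the jobs and their running times are invented for illustration.

```python
from collections import deque

# Round-robin time-sharing sketch: each interactive job gets a short slice
# of the processor in turn.  Because a user's think time (many seconds)
# dwarfs the slice, every user sees the machine as effectively theirs.
# The job names and their remaining work are invented for illustration.

SLICE = 1.0                                    # seconds of CPU per turn
jobs = deque([("editor", 2.0), ("payroll", 3.0), ("plot", 1.0)])

clock = 0.0
while jobs:
    name, remaining = jobs.popleft()
    work = min(SLICE, remaining)
    clock += work
    remaining -= work
    print(f"t={clock:4.1f}s  {name} ran {work:.1f}s, {remaining:.1f}s left")
    if remaining > 0:
        jobs.append((name, remaining))         # back of the queue
```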
Within large organizations, interactive VDU terminals became available to
regular users. They resembled stripped-down operators’ consoles, and became
known as dumb terminals because they were completely dependent on the
mainframe for their operation, having no memory of their own.
Communication with the mainframe was on a polled basis via high-speed,
dedicated links, and transfer of information was in screen-sized blocks.
Paralleling the evolution of mainframes towards open access, rapid
technological developments were taking place to provide more computing
power in smaller packages for less money. In 1963, the Digital Equipment
Corporation (DEC) marketed the first minicomputer, the PDP-5, which sold
for $27,000; two years later the more powerful and smaller PDP-8 was made
available for only $18,000. This was the beginning of a new industry
oriented towards more sophisticated customers who did not need or want the
hand-holding service which was a hallmark of IBM. Such users, particularly
scientists and engineers, preferred an open system into which they could put
their own experimental devices, and develop their own applications
programs.
The micro breakthrough
This trend culminated in the development of the integrated circuit (IC),
popularly known as the microchip, or just ‘chip’, which enables digital-circuit
designers to put essential mainframe functions into one or more tiny chips of
silicon. The first patent for a microelectronic (or integrated) circuit was granted
to J.S.Kilby of the United States in 1959. By the early 1960s, monolithic
operational amplifiers were introduced by Texas Instruments and
Westinghouse. In 1962, Steven R.Hofstein and Frederick P.Heiman at the
RCA Electronic Research Laboratory made the first MOS (metal-oxide
semiconductor) chip. Then Fairchild Semiconductor came out with the 702
linear IC in 1964, the result of a collaboration between Robert Widlar and
David Talbert; soon the 709 was put on the market, one of the early success
stories in the greatest technological revolution of the twentieth century. Widlar
stressed active components in preference to passive ones. In 1964, Bryant
(Buck) Rogers at Fairchild invented the dual-in-line package (DIP), which has
come to be the universal packaging form for ICs.
In the software area, John G.Kemeny and Thomas E.Kurtz at Dartmouth
College developed a higher-level computer language for student use called
BASIC; from its beginning in 1964, BASIC was designed for interactive use
via ‘time-sharing’ on minicomputers. BASIC has become the standard
higher-level language for microcomputers, often being embodied in a ROM
(read-only memory) chip; software held permanently in ROM in this way is known as firmware.
As with the early development of the large-scale digital computer, invention
priorities are often disputed, but, according to one source, the first electronic
calculator was introduced in 1963 by the Bell Punch Company of the UK.
This contained discrete components rather than ICs, and was produced under
licence in the US and Japan. The first American company to make electronic
calculators was Universal Data Machines, which used MOS chips from Texas
Instruments and assembled the devices with cheap labour in Vietnam and
South America. In 1965, Texas Instruments began work on their own
experimental four-function pocket calculator to be based on a single IC, and
obtained a patent on it in 1967.
Westinghouse, GT&E Laboratories, RCA and Sylvania all announced
CMOS (complementary metal-oxide semiconductor chips) in 1968. They are
more expensive, but faster and draw less current than MOS technology;
CMOS chips have made portable, battery-operated microcomputers possible.
In 1969, Niklaus Wirth of Switzerland developed a higher-level language he
called Pascal, in honour of the inventor of the first mechanical calculator.
Pascal was originally intended as a teaching language to instil good
programming style, but has become increasingly popular for general
programming use, particularly on microcomputers where it is now the main
rival to BASIC. However, Pascal is usually compiled (the entire program is
checked before it can be run), whereas BASIC is interpretive (each line is
checked as it is input).
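The practical consequence of that distinction can be seen in a small sketch, using modern Python purely as a stand-in for Pascal and BASIC: compilation checks the whole text before anything runs, whereas line-by-line interpretation only trips over an error when the offending line is reached.

```python
# Compiled vs. interpreted behaviour, illustrated with Python as a stand-in.
# The second statement contains a deliberate syntax error.

source_lines = ["x = 1", "y = x +", "print('never reached')"]
program_text = "\n".join(source_lines)

# "Compiled" style: the whole program is checked before it can be run.
try:
    compile(program_text, "<program>", "exec")
except SyntaxError as err:
    print("compiler rejected the program before running it:", err.msg)

# "Interpretive" style: each line is taken as it comes; line 1 runs,
# and the error only appears when line 2 is reached.
namespace = {}
for number, line in enumerate(source_lines, start=1):
    try:
        exec(line, namespace)
        print(f"line {number} executed")
    except SyntaxError:
        print(f"line {number} rejected only when it was reached")
        break
```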
The idea of creating a simple, low-cost computer from MOS chips—a micro-
computer-on-a-chip—seems to have occurred first to enthusiastic amateurs. In
1973, Scelbi Computer Consulting offered the Scelbi-8H as a kit for $565. It
used the Intel 8008, the first 8-bit microprocessor, and had 1 kilobyte of
memory. In January 1975, Popular Electronics magazine featured on its cover the
Altair 8800, proclaimed as the ‘world’s first minicomputer [sic] kit to rival
commercial models’. Altair was the product of MITS, a tiny Albuquerque,
New Mexico, firm, which sold it in kit form with an Intel 8080 microprocessor
and a tiny 256 bytes of memory for $395. Users had to enter data and
instructions bit-by-bit via sixteen toggle switches, and read the results of their
computations in binary form on a corresponding array of sixteen pairs of panel
lamps, but these early machines were received enthusiastically by dedicated
‘hackers’. (To put this price in context, mainframes cost a million dollars or
more, and minicomputers tens to hundreds of thousands of dollars.)
The microcomputer market developed rapidly, even though no software
was available. Users had to develop their own applications using machine
language which was (and remains) binary. However, in June 1976, Southwest
Technical Products Company offered the SwTPC M6800 with an editor and
assembler, and shortly afterwards with a BASIC interpreter. By the end of
1976, the floodgates had opened; it was perhaps a fitting testament to this
seminal year that Keuffel and Esser stopped production of slide-rules—donating
the last one to the Smithsonian Institution in Washington.
Microcomputers remained largely an amateur market (‘hacker’ came to be
the preferred term because users were constantly tinkering with their innards)
until the Apple II and Commodore PET came on to the market in 1977. For a
couple of thousand dollars, customers were offered an alphanumeric keyboard
for data entry and control; a colour display (their television set); a beeper which
could be programmed to play simple tunes; a pair of paddles to play video
games; a high-level language (BASIC) in ROM, and an inexpensive means of
storing and retrieving data and programs (their audiocassette recorder).
However, Apple and its rivals did not have much effect outside home and
school environments until the floppy disk was miniaturized and reduced in
price to a few hundred dollars in the late 1970s. A floppy diskette could store
up to 100 kbytes and speeded access to data and programs by a factor of
twenty. Thus, the entire contents of RAM (random-access memory), which
disappeared when the power was turned off, could now be saved in a few
seconds and loaded back quickly when needed again. Even so, the
microcomputer remained a home or consumer product, largely for games,
until useful applications programs were developed. In 1979, the VisiCalc
program was developed by Dan Bricklin and Bob Frankston specifically for
the Apple II micro. VisiCalc enabled data to be entered on large (virtual)
spreadsheets; in that form, the data could be manipulated arithmetically en
masse, with the results printed out in tabular form automatically. The business
community now found that the technology of microcomputers had
something unique to offer, and bought Apples in increasing numbers simply
to use VisiCalc.
In 1981, Adam Osborne, an Englishman who had emigrated to the US,
introduced the Osborne 1, the first complete, portable microcomputer. Selling
for less than $2000, it included a full-sized keyboard, CRT display, dual floppy
drives, and was ‘bundled’ with some of the best software available at the time:
the CP/M operating system; WordStar for word processing; SuperCalc for
financial analysis; and interpretive and compiled BASIC systems.
During all these developments, owners and operators of mainframes had
taken little notice. Microcomputers were regarded as amazing but rather
expensive toys. Mainframe services had been organized for computing
specialists, and users had to employ these specialists—systems analysts and
programmers—as intermediaries. In contrast, with only a few hours’
instruction, a non-specialist could learn a user-friendly microcomputer software
package directly relevant to his work. It soon became apparent that, because
microcomputers could be purchased without being scrutinized by a committee
and sent out for competitive bids—the standard procedure for items costing
hundreds of thousands of dollars or more—an employee could, and would, get
one to put on his desk. At last the mainframe masters had to take notice.
Almost from the beginning of the stored-program digital computer, IBM
has dominated the market to a remarkable extent. However, in no sense did
they lead the introduction of micros; they entered the market only when it
showed signs of maturity. Nevertheless, the IBM PC quickly took a dominant
position in the market from its introduction in 1981.
Since this ‘legitimization’ of micros, software of great versatility and power
has been introduced for the IBM PC and ‘clones’, as its imitators are known.
Notable among these are integrated packages, such as Lotus 1–2–3, which was
marketed in 1982. Lotus combined database, spreadsheet and graphics in a
single program. As a result of their convenience, versatility and low cost,
micros are already far more numerous than mainframes and minis, and will
account for more than half the revenue of the entire computer market in the
last half of the 1980s.
Communications between mainframe and micro
Because IBM dominates the mainframe market, many of its standards have
become ad hoc world standards. However, most of these are inappropriate or
meaningless in the microcomputer world. Mainframes use large ‘word’ sizes;
that is, each such entity they handle is usually 32 bits (even 64 bits when high-
precision computation is required); in contrast, early micros used 8-bit words,
and more recent ones—led by IBM’s own PC—16-bit words. A few micros,
particularly those emphasizing graphics, such as Apple’s Macintosh, now use
32-bit microprocessors.
Also, IBM developed its own codes to interpret the bit patterns as numbers
or letters. Their Binary-Coded Decimal (BCD) system—developed in response
to the needs of business to encode each decimal digit separately—was extended
to handle larger character sets by EBCDIC (Extended Binary Coded Decimal
Interchange Code). With BCD, only four bits were used, providing a
maximum of sixteen codes; the six codes not required for the digits 0–9 were
used for the decimal point and plus and minus symbols. However, no letters
could be represented directly. A 6-bit code was also developed which provided
64 codes, enough to handle upper-case letters, decimal digits and some special
characters. However, as text processing developed, it became apparent that
more than 64 codes would be needed. EBCDIC, IBM’s 8-bit system, provides
256 codes, allowing not just upper- and lower-case letters, but many
special symbols and codes for control (nonprinting) purposes.
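A short sketch of the coding schemes just described, written in Python for illustration; the 4-bit BCD digit patterns are standard, and the final loop simply shows the code-space arithmetic behind the move from 4-bit to 6-bit to 8-bit codes.

```python
# Binary-coded decimal: each decimal digit occupies its own 4-bit group,
# so 1984 becomes four nibbles rather than one binary integer.

def to_bcd(number):
    """Return the BCD bit pattern of a non-negative integer as a string."""
    return " ".join(format(int(digit), "04b") for digit in str(number))

print(to_bcd(1984))               # 0001 1001 1000 0100

# Code-space arithmetic: 4 bits cannot hold letters, 6 bits can hold one
# case of letters plus digits, and 8 bits can hold both cases plus control codes.
for bits in (4, 6, 8):
    print(f"{bits}-bit code: {2 ** bits} possible characters")
```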
At about the same time, other manufacturers got together and under the
aegis of the American Standards Association developed ASCII (American
Standard Code for Information Interchange) which has since become a world
standard under the auspices of the International Standards Organization (ISO).
ASCII is a 7-bit code providing 128 characters (often extended to 256 in
8-bit form), but its assignments are entirely different from those of EBCDIC.
However, even IBM, when it entered the microcomputer market, chose ASCII.
Therefore, when any microcomputer—IBM or another make—communicates with an IBM mainframe, code
conversion is required at one end of the link or the other.
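A minimal sketch of such a conversion, using Python’s standard ‘cp037’ codec as a stand-in EBCDIC code page (the choice of code page 037 is an assumption; real installations used several EBCDIC variants).

```python
# EBCDIC <-> ASCII conversion sketch.  'cp037' is one common EBCDIC code
# page available in Python's standard codec library; a given mainframe
# installation may well use a different EBCDIC variant.

text = "HELLO, WORLD 123"

ebcdic_bytes = text.encode("cp037")       # as an IBM mainframe would store it
ascii_bytes = text.encode("ascii")        # as a PC would store it

print(ebcdic_bytes.hex(" "))              # e.g. c8 c5 d3 d3 d6 ...
print(ascii_bytes.hex(" "))               # 48 45 4c 4c 4f ...

# Converting mainframe data for the PC end of the link:
print(ebcdic_bytes.decode("cp037"))       # -> 'HELLO, WORLD 123'
```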
Another source of incompatibility is the type of communications used
between mainframes and their terminals. In the mid-1970s IBM developed a
system to handle remote dumb terminals based on using an ‘intelligent’ device
between communications lines and terminals: the cluster controller. A
mainframe could poll remote terminal clusters to see if they had a request, but
remote terminals could never initiate a dialogue. Also, communication links
were established to run at fairly high speeds—9600 bits per second is common—
because they had to poll every terminal on the system in turn (even if an
operator was absent). The system IBM developed for this is known as Binary
Synchronous Communications (BSC), and requires a synchronous modem
(modulator-demodulator) at each remote cluster, as well as at the mainframe.
However, when micros began to offer communications capabilities,
manufacturers chose much less expensive asynchronous modems and
protocols and, taking advantage of the built-in ‘intelligence’ of personal
computers (PCs) as terminals, let the micro initiate the conversation. Still
another source of incompatibility is that micros were first developed for home
and hobbyist markets. They provided graphics (often colour) and audio
capabilities, which were essential for games, the most popular application
during their period of greatest market growth. Most mainframes, on the
other hand, did not offer graphics and sound capability at remote terminals,
only text.
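A minimal sketch of the polled dialogue described above, with the controller sweeping its cluster of dumb terminals in turn; the terminal names and their pending requests are invented for illustration.

```python
# Polling sketch: the mainframe asks each terminal on the cluster in turn
# whether it has anything to send; terminals never speak first.  The
# terminals and their pending input are invented for illustration.

cluster = {
    "T01": None,                 # operator away from the desk
    "T02": "QUERY CUSTOMER 4711",
    "T03": None,
    "T04": "PRINT INVOICE 88",
}

def poll_cycle(terminals):
    """One sweep of the poll list, as a BSC-style controller would make."""
    for name, pending in terminals.items():
        if pending is None:
            print(f"{name}: poll -> nothing to send")
        else:
            print(f"{name}: poll -> received '{pending}'")
            terminals[name] = None          # request handed to the mainframe

poll_cycle(cluster)
```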
As more and more businesses encouraged the spread of micros into their
day-to-day operations, they found that their most valuable data were often
locked within mainframes. Therefore, unless someone laboriously re-entered
these data from existing printouts, another way to transfer (download) them from
mainframe to micro had to be found. Also, the rapid development of micro
technology led to smaller and smaller systems, and eventually battery-operated
laptop portables. The pioneering systems were the Epson HX-20 and the
Radio Shack TRS-80 Model 100, introduced in 1982 and 1983 respectively.
The laptops provided essential microcomputer applications, including word
processing and telecommunications, in a briefcase-size package for less than
$1000. Users could now upload data gathered in the field to mainframes for
processing.
Laptop micros can communicate with remote systems wherever there is a
telephone by means of a type of modem which does not have to be physically
connected to the public switched telephone network (PSTN)—the acoustic
coupler. Rubber cups which fit over the telephone handset provide a
temporary connection. The purpose of any modem is to convert pulses, which
cannot be passed through the PSTN, to audible tones of standard modem
frequencies, which can. However, this type of communication is slow—usually
300 bits per second (bps) for acoustic couplers, and 1200 or in some cases
2400 bps for directly connected modems—and sensitive to noise and other
types of interference which increase error rates. Even at 2400 bps, it would take
about twenty-five minutes to transfer all the data on a standard double-sided IBM
diskette (360 kilobytes—equivalent to about 100 typewritten pages); at 300 bps
the same transfer would take several hours.
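The arithmetic behind those figures, assuming the usual asynchronous framing of roughly ten bits on the line for every 8-bit byte (start and stop bits around each character); a worked sketch rather than a measured result.

```python
# Time to move a 360-kilobyte diskette image over an asynchronous link,
# assuming ten bits on the line per 8-bit byte (start bit, byte, stop bit).

DISKETTE_BYTES = 360 * 1024
BITS_PER_BYTE_ON_LINE = 10

for bps in (300, 1200, 2400):
    seconds = DISKETTE_BYTES * BITS_PER_BYTE_ON_LINE / bps
    print(f"{bps:5d} bps: {seconds / 60:6.0f} minutes")

# 300 bps -> roughly 205 minutes; 1200 bps -> about 51; 2400 bps -> about 26.
```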
The next step is to employ synchronous modems, which are used when
terminals are attached to mainframes. If an IBM PC or compatible
microcomputer is to be used, it must have a synchronous adapter card and
3270 emulation software. This hardware and software adds about $1000 to the
cost of each PC, and provides terminal emulation, but not compatible file
transfer between mainframe and micros. File transfer capability requires a
coaxial communications card in the PC, which is connected to a controller. No
modem is required at the PC, because the controller allows many devices to
share a high-speed modem. Usually there are from 8 to 32 ports on a
controller, which can support interactive terminals, remote-job-entry terminals,
remote printers—and now PCs. However, special programs are required in the
PCs, and often software must be added at the mainframe to support file
transfer.
Another problem in attaining true emulation is that the keyboard of an
IBM PC is quite different from IBM 3278 terminals, which have a large
number of special function keys. Operators who are familiar with terminal
keyboard layouts find it difficult to learn PC equivalents, and particularly
frustrating if they have to switch back and forth. With a PC emulating a 3270-
type terminal, for instance, two keys may have to be pressed simultaneously, or
even a sequence of keys struck, where one keystroke would do on the terminal
keyboard. Still another problem is that, as has been mentioned, IBM
mainframes and terminals communicate with a character code known as
EBCDIC, whereas PCs use ASCII. Therefore, EBCDIC-ASCII conversion is
required at one end or the other (it is usually done at the mainframe). The best
emulation systems permit not just whole files, but selected records or even
fields to be downloaded for processing by PC software.
THE TELEGRAPH
In the early 1600s, the Jesuit Famianus Strada, impressed with William
Gilbert’s research on magnetism, suggested that two men at a distance might
communicate by the use of magnetic needles pointing towards letters on a dial;
George Louis Lesage, a physicist of Geneva, did some experiments along these
lines in 1774, but electricity was still too poorly understood to provide a
practical means of communication.
Visual telegraphy
A French abbé, Claude Chappe, with his three brothers invented and put into
use the first practical system of visual (or aerial) telegraphy, the semaphore.
They had first tried, and failed, with an electrical solution. In 1790, they
succeeded in sending messages over half a kilometre, using pendulums
suspended from two posts. However, public demonstrations were greeted with
hostility by a revolutionary populace who thought the system would enable
royalists to communicate with their king. Chappe, who had spent 40,000 livres
of his own on the scheme, sought police protection for his towers.
Chappe presented his invention to the French Assembly on 22 May 1792, and
on 1 April 1793 the invention was favourably reported to the National
Convention as a war aid. However, some vital questions were raised: would it
work in fog? could secret codes be used? The Convention ordered an inquiry,
for which it appropriated 6000 livres, and after a favourable committee report,
Citizen Chappe was given the title of Telegraphy Engineer with the pay of a
lieutenant, and ordered to construct towers from Paris to Lille. The 14km (8.7
miles) distance between towers was too far for best results, and Chappe’s
business management was poor; however, the line was completed in August
1794 (Figure 15.2 (a)). Many messages were sent during construction, but the
first to be recorded after completion was on 15 August, relaying the capture of
Quesnoy from the Austrians; this was made known in Paris only one hour after
troops entered the town.
To operate a semaphore, the station agent manipulated levers from
ground level, whose movements were faithfully followed by wooden arms
placed three metres (10ft) above, on top of a stone tower. The central
member, the regulator, was 4m (13ft) long, and arms of 1.8m (6ft) were
pivoted from both ends. Regulator and arms could be placed in 196
recognizable positions, in addition to those where the regulator was vertical or
horizontal, because Chappe had decided that the only meaningful signals
were those made with the regulator in an inclined position. For communication at
night, a lantern was placed at each end of the arms and at each of the pivots.
There were 98 positions with the regulator inclined to the left for inter-
operator communications, and another 98 with the regulator inclined to the
right for dispatches. Chappe formulated a code of 9999 words with the
assistance of L.Delauney, and the perfected list was published with each
word represented by a number. These pairs were printed in a 92-page code
book, each page of which carried 92 entries, giving the somewhat reduced
total of 8464 word-number equivalents.
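The code-book arithmetic can be checked directly; a trivial sketch, with the page and entry counts taken from the text above.

```python
# Each word was addressed by a pair of signals: the first gave the page of
# the code book, the second the entry on that page.
pages = 92
entries_per_page = 92
print(pages * entries_per_page)   # 8464 word-number equivalents
```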
