
A Project on

Generations of Computers

Topic: Generations of Computers

Written & Compiled by: Bunty Da

An Introduction to Computers and their Generations:

A computer is an electronic device that stores, retrieves, and processes data and can be programmed with instructions. A computer is composed of hardware and software and can exist in a variety of sizes and configurations.
A generation refers to a stage of improvement in the development of a product. The term is also used for the successive advancements of computer technology. With each new generation, the circuitry has become smaller and more advanced than in the previous generation. As a result of this miniaturization, the speed, power, and memory of computers have increased proportionally. New discoveries are constantly being made that affect the way we live, work, and play.
The First Generation: 1940-1956 (The Vacuum Tube Years)

The first-generation computers were huge, slow, expensive, and often undependable. In 1946, two Americans, Presper Eckert and John Mauchly, built the ENIAC, an electronic computer that used vacuum tubes instead of the mechanical switches of the Mark I. The ENIAC used thousands of vacuum tubes, which took up a lot of space and gave off a great deal of heat, just like light bulbs do. The ENIAC led to other vacuum-tube computers such as the EDVAC (Electronic Discrete Variable Automatic Computer) and the UNIVAC I (Universal Automatic Computer).
The vacuum tube was an extremely important step in the advancement of computers. Vacuum tubes grew out of Thomas Edison's work on the light bulb and worked in a very similar way. Their purpose was to act as an amplifier and a switch. Without any moving parts, vacuum tubes could take very weak signals and make them stronger (amplify them). They could also stop and start the flow of electricity instantly (switch). These two properties made the ENIAC computer possible.
The first generation computers used vacuum tubes
for circuitry and magnetic drums for memory, and were often
enormous, taking up entire rooms. They were very expensive to
operate and in addition to using a great deal of electricity,
generated a lot of heat, which was often the cause of
malfunctions. First generation computers relied on machine
language to perform operations, and they could only solve one
problem at a time. Input was based on punched cards and paper
tape, and output was displayed on printouts.

The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer; it was delivered to its first client, the U.S. Census Bureau, in 1951.

The ENIAC gave off so much heat that it had to be cooled by gigantic air conditioners. However, even with these huge coolers, vacuum tubes still overheated regularly. It was time for something new.
The Second Generation: 1956-1963 (The Era of the Transistor)

The transistor computer did not last as long as the vacuum tube computer, but it was no less important in the advancement of computer technology. In 1947, three scientists, John Bardeen, William Shockley, and Walter Brattain, working at AT&T's Bell Labs, invented what would replace the vacuum tube forever. This invention was the transistor, which functions like a vacuum tube in that it can relay and switch electronic signals.
There were obvious differences between the transistor and the vacuum tube. The transistor was faster, more reliable, smaller, and much cheaper to build than a vacuum tube. One transistor replaced the equivalent of 40 vacuum tubes. Transistors were made of solid material, including silicon, an abundant element (second only to oxygen) found in beach sand and glass, so they were very cheap to produce. Transistors were also found to conduct electricity faster and better than vacuum tubes, and they were much smaller and gave off virtually no heat compared to vacuum tubes. Their use marked a new beginning for the computer.

Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient, and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary
machine language to symbolic, or assembly, languages, which
allowed programmers to specify instructions in words. High-
level programming languages were also being developed at this
time, such as early versions of COBOL and FORTRAN. These
were also the first computers that stored their instructions in
their memory, which moved from a magnetic drum to magnetic
core technology.
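As a rough illustration of this shift, the sketch below (in modern Python, not any language of the period) shows the same simple calculation at three levels; the machine-code and assembly forms in the comments are hypothetical illustrations, not the instruction set of any real 1950s machine.

    # A modern sketch of the jump from machine language to high-level code.
    # The lower-level forms in the comments are hypothetical illustrations,
    # not the actual binary or assembly of any real second-generation machine.
    #
    # Machine language (first generation): raw opcodes and addresses,
    #   e.g. 0001 0101 0110  ->  "load address 0101, add address 0110"
    # Assembly (second generation): the same instruction written symbolically,
    #   e.g. LOAD A / ADD B / STORE TOTAL
    # High-level language (COBOL, FORTRAN, and their successors): one readable statement.

    a = 25
    b = 17
    total = a + b          # the whole LOAD/ADD/STORE sequence in one line
    print("total =", total)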

The first computers of this generation were developed for the atomic energy industry. Without the transistor, space travel in the 1960s would not have been possible. However, a new invention would advance our ability to use computers even further.

The Third Generation: 1964-1971 (Integrated Circuits - Miniaturizing the Computer)

Transistors were a tremendous breakthrough in advancing the computer. However, no one could have predicted that thousands, and now millions, of transistors (circuits) could be compacted into such a small space. The integrated circuit, sometimes referred to as the semiconductor chip, packs a huge number of transistors onto a single wafer of silicon. Robert Noyce of Fairchild Semiconductor and Jack Kilby of Texas Instruments independently invented the integrated circuit. Placing such large numbers of transistors on a single chip vastly increased the power of a single computer and lowered its cost considerably.
Since the invention of integrated circuits, the number of transistors that can be placed on a single chip has doubled every two years, shrinking both the size and cost of computers even further and further enhancing their power. Most electronic devices today use some form of integrated circuits placed on printed circuit boards.
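To get a feel for what doubling every two years means, here is a small back-of-the-envelope sketch in Python; the starting figure of about 2,300 transistors is the commonly quoted count for the Intel 4004 of 1971 and is used purely as an illustration.

    # Rough Moore's-law arithmetic: transistor counts doubling every two years.
    # The 1971 starting point (about 2,300 transistors, the figure usually
    # quoted for the Intel 4004) is used only as an illustration.

    start_year = 1971
    start_transistors = 2300

    for year in range(start_year, 2012, 10):
        doublings = (year - start_year) / 2
        count = start_transistors * 2 ** doublings
        print(f"{year}: roughly {count:,.0f} transistors per chip")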
The development of the integrated circuit was the
hallmark of the third generation of computers. Transistors were
miniaturized and placed on silicon chips, called semiconductors,
which drastically increased the speed and efficiency of
computers. Instead of punched cards and printouts, users
interacted with third generation computers through keyboards
and monitors and interfaced with an operating system, which
allowed the device to run many different applications at one
time with a central program that monitored the memory.
Computers for the first time became accessible to a mass
audience because they were smaller and cheaper than their
predecessors.

These third-generation computers could carry out instructions in billionths of a second. Their components were mounted on circuit boards, thin pieces of Bakelite or fiberglass with electrical connections etched onto them, the main one of which is sometimes called a motherboard. Machines dropped to the size of small filing cabinets. Yet the single biggest advancement in the computer era was still to come.
The Fourth Generation: 1971-Today (The Microprocessor)

This generation can be characterized by both the jump to monolithic integrated circuits (millions of transistors put onto one integrated circuit chip) and the invention of the microprocessor (a single chip that could do all the processing of a full-scale computer). By putting millions of transistors onto a single chip, computers could perform more calculations at faster speeds. Because electricity travels about a foot in a billionth of a second, the shorter the distance a signal must travel, the faster the computer.
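A quick back-of-the-envelope calculation shows why distance matters; this is only a rough sketch, since the one-foot-per-nanosecond figure is the speed of light and signals in real wiring travel somewhat slower.

    # Why shrinking circuits makes computers faster: signal travel time.
    # Light covers roughly 30 cm (about one foot) per nanosecond; signals in
    # real wiring are slower, so these numbers are order-of-magnitude only.

    SPEED_CM_PER_NS = 30.0   # approximate speed of light in cm per nanosecond

    for distance_cm in (100.0, 30.0, 1.0, 0.1):   # room-scale wiring down to on-chip paths
        delay_ns = distance_cm / SPEED_CM_PER_NS
        print(f"{distance_cm:6.1f} cm  ->  about {delay_ns:.4f} ns per signal trip")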
However, what really triggered the tremendous growth of computers and their significant impact on our lives was the invention of the microprocessor. Ted Hoff, employed by Intel (Robert Noyce's new company), invented a chip the size of a pencil eraser that could do all the computing and logic work of a computer. The microprocessor was originally made for calculators, not computers; it led, however, to the invention of personal computers, or microcomputers.
The microprocessor brought the fourth generation
of computers, as thousands of integrated circuits were built onto
a single silicon chip. What in the first generation filled an entire
room could now fit in the palm of the hand. The Intel 4004 chip,
developed in 1971, located all the components of the computer -
from the central processing unit and memory to input/output
controls - on a single chip. In 1981 IBM introduced its first
computer for the home user, and in 1984 Apple introduced the
Macintosh. Microprocessors also moved out of the realm of
desktop computers and into many areas of life as more and more
everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.

It wasn't until the 1970s that people began buying computers for personal use. One of the earliest personal computers was the Altair 8800 computer kit. In 1975 you could purchase this kit and put it together to make your own personal computer. In 1977 the Apple II was sold to the public, and in 1981 IBM entered the PC (personal computer) market.

We have all heard of Intel and its Pentium® processors, but the beginning of the 21st century has marked a sudden leap forward in raw speed with extremely fast processors such as the Intel Core 2 Duo, Core 2 Quad, Core 2 Extreme, Core i7, and Core i5 series. Rapid developments on the software side have brought touch-screen, fingerprint, and face-detection technologies.

The Fifth Generation: Present and Beyond (Artificial Intelligence and Massively Parallel Computers)
From Wikipedia: "The Fifth Generation Computer Systems project (FGCS) was an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create a 'fifth generation computer', which was supposed to perform much calculation utilizing massive parallelism. It was to be the end result of a massive government/industry research project in Japan during the 1980s. It aimed to create an 'epoch-making computer' with supercomputer-like performance and usable artificial intelligence capabilities." Also from Wikipedia: "The term fifth generation was intended to convey the system as being a leap beyond existing machines... the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance." The project was, however, canceled in 1993 with little appreciable lasting impact.

Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today.
Artificial intelligence is the branch of computer science concerned with making computers behave like humans. The term was coined in 1956 by John McCarthy at the Massachusetts Institute of Technology. Artificial intelligence includes:

Games playing: programming computers to play games such as chess and checkers.

Expert Systems: programming computers to make decisions in real-life situations (for example, some expert systems help doctors diagnose diseases based on symptoms).

Natural Language: programming computers to understand natural human languages.

Neural Networks: systems that simulate intelligence by attempting to reproduce the types of physical connections that occur in animal brains.

Robotics: programming computers to see and hear and react to other sensory stimuli.
Currently, no computers exhibit full artificial intelligence (that is, are able to simulate human behavior). The greatest advances have occurred in the field of games playing. The best computer chess programs are now capable of beating humans. In May 1997, an IBM supercomputer called Deep Blue defeated world chess champion Garry Kasparov in a chess match.

In the area of robotics, computers are now widely used in assembly plants, but they are capable only of very limited tasks. Robots have great difficulty identifying objects based on appearance or feel, and they still move and handle objects clumsily.
Natural-language processing offers the greatest
potential rewards because it would allow people to interact with
computers without needing any specialized knowledge. You
could simply walk up to a computer and talk to it.
Unfortunately, programming computers to understand natural
languages has proved to be more difficult than originally
thought. Some rudimentary translation systems that translate
from one human language to another are in existence, but they
are not nearly as good as human translators.
There are also voice recognition systems that can convert spoken sounds into written words, but they do not understand what they are writing; they simply take dictation. Even these systems are quite limited; you must speak slowly and distinctly.

In the early 1980s, expert systems were believed to represent the future of artificial intelligence and of computers in general. To date, however, they have not lived up to expectations. Many expert systems help human experts in such fields as medicine and engineering, but they are very expensive to produce and are helpful only in special situations.

Today, the hottest area of artificial intelligence is neural networks, which are proving successful in a number of disciplines such as voice recognition and natural-language processing.

There are several programming languages that are known as AI languages because they are used almost exclusively for AI applications. The two most common are LISP and Prolog.

Voice Recognition: This field of computer science deals with designing computer systems that can recognize spoken words. The most powerful systems available can recognize thousands of words, but speakers need to speak slowly and distinctly, and because of their limitations and high cost, these systems are not often used.
The computers of the next generation will have millions upon millions of transistors on one chip and will perform over a billion calculations in a single second. With the addition of artificial intelligence, there is no end in sight for the computer movement.

--- Thank you ---
Date: 09.03.2010

-------X-------
