
History and generations of the computer

Name Rahmat Ali


Reg no 04021813019
Submitted to Dr Waqar Ali
What is a computer?
“A computer is a machine or device that performs processes, calculations and operations based on
instructions provided by a software or hardware program. It is designed to execute applications and
provides a variety of solutions by combining integrated hardware and software components”

A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations
automatically via computer programming. Modern computers have the ability to follow generalized sets
of operations, called programs. These programs enable computers to perform an extremely wide range of
tasks. A "complete" computer including the hardware, the operating system (main software), and
peripheral equipment required and used for "full" operation can be referred to as a computer system. This
term may also be used for a group of computers that are connected and work together, in particular a
computer network or computer cluster. Computers are used as control systems for a wide variety of
industrial and consumer devices. This includes simple special purpose devices like microwave ovens and
remote controls, factory devices such as industrial robots and computer-aided design, and also general
purpose devices like personal computers and mobile devices such as smartphones. The Internet is run on
computers and it connects hundreds of millions of other computers and their users.

History of computer

Abacus
The history of computers begins with the abacus, which is believed to be the first computing device. The
Chinese are said to have invented the abacus around 4,000 years ago. It was a wooden rack holding metal
rods with beads mounted on them. The operator moved the beads according to fixed rules to perform
arithmetic calculations. The abacus is still used in some countries, such as China, Russia, and Japan.

Napier's Bones
It was a manually operated calculating device invented by John Napier (1550-1617) of Merchiston. In
this calculating tool he used nine different ivory strips, or bones, marked with numbers to multiply and
divide, so the tool became known as "Napier's Bones". It is also credited as the first device to use the
decimal point.

Pascaline
Pascaline is also known as the Arithmetic Machine or Adding Machine. It was invented between 1642
and 1644 by the French mathematician and philosopher Blaise Pascal and is believed to be the first
mechanical and automatic calculator. Pascal invented the machine to help his father, a tax accountant. It
could perform only addition and subtraction. It was a wooden box with a series of gears and wheels;
when a wheel was rotated one revolution, it rotated the neighboring wheel, and a series of windows on
top of the wheels displayed the totals.
Stepped Reckoner or Leibniz wheel
It was developed by the German mathematician and philosopher Gottfried Wilhelm Leibniz in 1673. He
improved on Pascal's invention to develop this machine. It was a digital mechanical calculator, called the
stepped reckoner because it used fluted drums instead of gears.

Difference Engine
In the early 1820s, it was designed by Charles Babbage, who is known as the "Father of the Modern
Computer". It was a mechanical computer that could perform simple calculations: a steam-driven
calculating machine designed to compute tables of numbers, such as logarithm tables.

Analytical Engine
This calculating machine was also designed by Charles Babbage, around 1830. It was a mechanical
computer that used punch cards as input. It was designed to be capable of solving any mathematical
problem and of storing information in a permanent memory.

Tabulating Machine
It was invented in 1890 by Herman Hollerith, an American statistician. It was a mechanical tabulator
based on punch cards. It could tabulate statistics and record or sort data. This machine was used in the
1890 U.S. Census. Hollerith also founded the Tabulating Machine Company, which later became
International Business Machines (IBM) in 1924.

Differential Analyzer
It was introduced in the United States around 1930. It was an analog device invented by Vannevar
Bush, designed to solve differential equations. It could do about 25 calculations in a few minutes.

Mark I
The next major step in the history of computers came in 1937, when Howard Aiken planned to develop a
machine that could perform calculations involving large numbers. In 1944 the Mark I computer was built
as a partnership between IBM and Harvard. It was one of the first programmable digital computers.

Von Neumann architecture


The von Neumann architecture—also known as the von Neumann model or Princeton architecture—is a
computer architecture based on a 1945 description by the mathematician and physicist John von Neumann
and others in the First Draft of a Report on the EDVAC.[1] That document describes a design architecture
for an electronic digital computer with these components:
A processing unit that contains an arithmetic logic unit and processor registers

A control unit that contains an instruction register and program counter

Memory that stores data and instructions

External mass storage

Input and output mechanisms[1][2]

The term "von Neumann architecture" has evolved to mean any stored-program computer in which an
instruction fetch and a data operation cannot occur at the same time because they share a common bus.
This is referred to as the von Neumann bottleneck and often limits the performance of the system.[3]

The design of a von Neumann architecture machine is simpler than a Harvard architecture machine—
which is also a stored-program system but has one dedicated set of address and data buses for reading and
writing to memory, and another set of address and data buses to fetch instructions.

A stored-program digital computer keeps both program instructions and data in read-write, random-access
memory (RAM). Stored-program computers were an advancement over the program-controlled
computers of the 1940s, such as the Colossus and the ENIAC. Those were programmed by setting
switches and inserting patch cables to route data and control signals between various functional units. The
vast majority of modern computers use the same memory for both data and program instructions; in such
machines the von Neumann vs. Harvard distinction applies to the cache architecture rather than to main
memory (a split-cache architecture).
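The defining feature described above, one memory holding both instructions and data, with every fetch sharing the same path, can be illustrated with a toy simulation. This is only a sketch: the instruction set, opcodes, and memory layout below are invented for illustration, not taken from any real machine.

```python
# Toy von Neumann machine: instructions and data live in ONE shared memory.
# Every instruction fetch and every data access goes through the same
# memory, which is exactly the shared path behind the "von Neumann
# bottleneck". The opcodes (LOAD/ADD/STORE/HALT) are hypothetical.

def run(memory):
    """Fetch-decode-execute loop over a single shared memory list."""
    pc = 0    # program counter
    acc = 0   # accumulator register
    while True:
        opcode, operand = memory[pc]   # instruction fetch (shared memory)
        pc += 1
        if opcode == "LOAD":
            acc = memory[operand]      # data read (same memory)
        elif opcode == "ADD":
            acc += memory[operand]
        elif opcode == "STORE":
            memory[operand] = acc      # data write (same memory)
        elif opcode == "HALT":
            return memory

# Cells 0-3 hold the program; cells 4-6 hold data -- one address space.
mem = [
    ("LOAD", 4),   # acc = mem[4]
    ("ADD", 5),    # acc = acc + mem[5]
    ("STORE", 6),  # mem[6] = acc
    ("HALT", 0),
    2, 3, 0,       # data: two operands and a result cell
]
result = run(mem)
print(result[6])   # 5
```

Because the program is itself data in memory, it could in principle be modified like any other value, which is the essence of the stored-program concept; a Harvard machine would instead keep the instruction tuples and the numeric data in two separate memories with separate buses.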

Generations of the computer

Computer generations are based on when major technological changes in computers occurred, like the use
of vacuum tubes, transistors, and the microprocessor. As of 2018, there are five generations of the
computer.

Review each of the generations below for more information and examples of computers and technology
that fall into each generation.

Types of Generation
First generation

Second generation

Third generation

Fourth generation

Fifth generation

First generation (1940 - 1956)

The first generation of computers used vacuum tubes as a major piece of technology. Vacuum tubes
were widely used in computers from 1940 through 1956. Vacuum tubes were larger components and
resulted in first generation computers being quite large in size, taking up a lot of space in a room. Some of
the first generation computers took up an entire room.

The ENIAC is a great example of a first generation computer. It consisted of nearly 20,000 vacuum tubes,
as well as 10,000 capacitors and 70,000 resistors. It weighed over 30 tons and took up a lot of space,
requiring a large room to house it. Other examples of first generation computers include the EDSAC,
IBM 701, and Manchester Mark 1.

Second generation (1956 - 1963)


The second generation of computers saw the use of transistors instead of vacuum tubes. Transistors were
widely used in computers from 1956 to 1963. Transistors were smaller than vacuum tubes and allowed
computers to be smaller in size, faster in speed, and cheaper to build.

The first computer to use transistors was the TX-0 and was introduced in 1956. Other computers that used
transistors include the IBM 7070, Philco Transac S-1000, and RCA 501.

Third generation (1964 - 1971)


The third generation of computers introduced the use of ICs (integrated circuits). Using ICs helped
reduce the size of computers even further compared to second-generation computers, as well as make
them faster.
Nearly all computers since the mid-to-late 1960s have utilized ICs. While the third generation is
considered by many people to have spanned from 1964 to 1971, ICs are still used in computers today.
Over 45 years later, today's computers have deep roots going back to the third generation.

Fourth generation (1972 - 2010)


The fourth generation of computers took advantage of the invention of the microprocessor, more
commonly known as a CPU. Microprocessors, along with integrated circuits, helped make it possible for
computers to fit easily on a desk and for the introduction of the laptop.

Some of the earliest computers to use a microprocessor include the Altair 8800, IBM 5100, and Micral.
Today's computers still use a microprocessor, despite the fourth generation being considered to have
ended in 2010.

Fifth generation (2010 to present)


The fifth generation of computers is beginning to use AI (artificial intelligence), an exciting technology
that has many potential applications around the world. Leaps have been made in AI technology and
computers, but there is still much room for improvement.

One of the better-known examples of AI in computers is IBM's Watson, which competed as a contestant
on the TV show Jeopardy!. Other familiar examples include Apple's Siri on the iPhone and Microsoft's
Cortana on Windows 8 and Windows 10 computers. The Google search engine also uses AI to process
user searches.
