
Diocos, Vaneza A.

AC1-1 June 8, 2011

What is a Computer? A computer is a programmable machine designed to carry out sequences of arithmetic or logical operations automatically. The particular sequence of operations can be changed readily, allowing the computer to solve more than one kind of problem. It is an apparatus built to perform routine calculations with speed, reliability, and ease; it responds to a specific set of instructions in a well-defined manner and can execute a prerecorded list of instructions (a program). Modern computers are electronic and digital. The actual machinery (wires, transistors, and circuits) is called hardware; the instructions and data are called software.
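The phrase "a prerecorded list of instructions" becomes concrete with a tiny sketch. The example below is illustrative only; the values and variable names are invented, and Python is used simply as a convenient notation:

    # A program is a sequence of arithmetic and logical
    # instructions that the machine carries out in order.
    price = 250                # illustrative data
    quantity = 4
    total = price * quantity   # an arithmetic operation

    if total > 500:            # a logical operation
        print("Large order:", total)
    else:
        print("Small order:", total)

Change the list of instructions and the same hardware solves a different problem, which is exactly what makes the machine programmable.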

Advantages of Computer One can write effectively by means of a computer. A computer allows the user to create documents, then edit, print, and store them so that they can be retrieved later. Using a computer, one can remain connected to the world through the Internet. Computers are widely used for education and training purposes and have become a learning tool for children. Files can easily be shared between users. For students, computers help a lot in projects, research work, and assignments. Computers are useful in business, in education, and at home. They also help you automate various tasks that would be tedious to do manually.

Disadvantages of Computer The teenagers of today's society have changed dramatically due to the computer. The Internet, which is widely used to view pornographic material, corrupts the minds of teenagers. The Internet has also made the youth of today quite lazy, especially in terms of their education. Viruses can spread to other computers throughout a computer network. Kids with access to software that is not age-appropriate may be exposed to such negative influences as violence, strong language, and over-stimulation from fast-action graphics. Prolonged screen use may damage your eyesight. Computers can also bring violations of privacy, impacts on the labor force, health risks, environmental impacts, distraction from work, and possible antisocial influences. They can harm your social life and your interactions with other people if you do not maintain a balance.

How to take care of your Computer? Keep your work area clean, and don't bring food or drinks near your computer. Installing a reputable anti-virus program will help rid your computer of unknown viruses. Defragmenting regularly helps rearrange your files, optimizing them for faster access. Keep your PC up to date.
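As a small illustration of automating one of these care tips, the sketch below uses only Python's standard library to warn when the disk is getting full. The 90% threshold and the "/" path are assumptions chosen for the example, not values from this text:

    import shutil

    # Warn when the system disk is nearly full, one small piece
    # of routine computer care automated as a program.
    # (Threshold and path are illustrative assumptions.)
    total, used, free = shutil.disk_usage("/")
    percent_used = used / total * 100

    if percent_used > 90:
        print(f"Disk is {percent_used:.1f}% full; time to clean up.")
    else:
        print(f"Disk usage is {percent_used:.1f}%; looks healthy.")

Run on a schedule, a check like this turns a chore you might forget into something the computer does for you.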

History of Computer

The abacus. This simple device was invented in China around 500 B.C. and was also used by the ancient Japanese and the Aztecs. The abacus was an early aid for mathematical computation; its only value is that it aids the memory of the human performing the calculation. A skilled abacus operator can work on addition and subtraction problems at the speed of a person equipped with a hand calculator (multiplication and division are slower).

In 1614 an eccentric (some say mad) Scotsman named John Napier invented logarithms, a technology that allows multiplication to be performed via addition (this trick is illustrated in a short sketch below). Napier also invented an alternative to tables: numbered rods carved from ivory, now called Napier's Bones. Napier's invention led directly to the slide rule, first built in England in 1632 and still in use in the 1960s by the NASA engineers of the Mercury, Gemini, and Apollo programs, which landed men on the moon.

The first gear-driven calculating machine to actually be built was probably the calculating clock, so named by its inventor, the German professor Wilhelm Schickard, in 1623. This device got little publicity because Schickard died soon afterward in the bubonic plague.

In 1642 Blaise Pascal, at age 19, invented the Pascaline as an aid for his father, who was a tax collector. First called the Arithmetic Machine, then Pascal's Calculator, and later the Pascaline, it could add and subtract directly and multiply and divide by repetition.

Just a few years after Pascal, the German Gottfried Wilhelm Leibniz (co-inventor with Newton of calculus) managed to build a four-function (addition, subtraction, multiplication, and division) calculator that he called the stepped reckoner. Leibniz was also the first to advocate use of the binary number system, which is fundamental to the operation of modern computers.

In 1801 the Frenchman Joseph Marie Jacquard invented a power loom that could base its weave (and hence the design on the fabric) upon a pattern automatically read from punched wooden cards, held together in a long row by rope.

By 1822 the English mathematician Charles Babbage was proposing a steam-driven calculating machine the size of a room, which he called the Difference Engine. This machine would be able to compute tables of numbers, such as logarithm tables. Babbage then designed the Analytical Engine. This device, as large as a house and powered by six steam engines, would be more general purpose in nature because it would be programmable, thanks to the punched-card technology of Jacquard. It was Babbage who made the important intellectual leap of using punched cards to direct computation rather than weaving.

Herman Hollerith later proposed and successfully adopted Jacquard's punched cards for the purpose of computation. Hollerith's invention, known as the Hollerith desk, consisted of a card reader which sensed the holes in the cards, a gear-driven mechanism which could count (using Pascal's mechanism, which we still see in car odometers), and a large wall of dial indicators (a car speedometer is a dial indicator) to display the results of the count. Hollerith built a company, the Tabulating Machine Company, which, after a few buyouts, eventually became International Business Machines, known today as IBM.

The Harvard Mark I computer was built as a partnership between Harvard and IBM in 1944. This was the first programmable digital computer made in the U.S., but it was not a purely electronic computer.
Grace Hopper found the first computer "bug": a dead moth that had gotten into the Mark I and whose wings were blocking the reading of the holes in the paper tape. The word "bug" had been used to describe a defect since at least 1889, but Hopper is credited with coining the word "debugging" to describe the work of eliminating program faults.
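As an aside on Napier's logarithms mentioned above: the whole trick is the identity log(ab) = log(a) + log(b). To multiply, you look up two logarithms in a table, add them, then look the sum back up. A minimal sketch, with sample numbers chosen purely for illustration:

    import math

    # Multiplication via addition, Napier-style:
    # log(a*b) = log(a) + log(b), so adding logarithms and
    # exponentiating the sum recovers the product.
    a, b = 37.0, 52.0
    product = math.exp(math.log(a) + math.log(b))
    print(round(product))   # 1924, the same as 37 * 52

A slide rule performs the same addition mechanically, by sliding two logarithmic scales against each other.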

In 1953 Grace Hopper invented the first high-level language, "Flow-matic". This language eventually became COBOL, which was the language most affected by the infamous Y2K problem. A high-level language is designed to be more understandable by humans than the binary language understood by the computing machinery.

The microelectronics revolution is what allowed the enormous amount of hand-crafted wiring in early computers to be mass-produced as an integrated circuit, a small sliver of silicon the size of your thumbnail. By the early 1980s huge numbers of transistors could be simultaneously fabricated on an integrated circuit; today's Pentium 4 microprocessor contains 42,000,000 transistors in this same thumbnail-sized piece of silicon.

One of the earliest attempts to build an all-electronic (that is, no gears, cams, belts, or shafts) digital computer was begun in 1937 by J. V. Atanasoff, a professor of physics and mathematics at Iowa State University. By 1941 he and his graduate student, Clifford Berry, had a working machine. It was the first to store data as a charge on a capacitor, which is how today's computers store information in their main memory (DRAM, or dynamic RAM). Another contender, Colossus, was built during World War II by Britain for the purpose of breaking the cryptographic codes used by Germany.

The title of forefather of today's all-electronic digital computers is usually awarded to ENIAC, which stood for Electronic Numerical Integrator and Calculator. ENIAC was built at the University of Pennsylvania between 1943 and 1945 by two professors, John Mauchly and the 24-year-old J. Presper Eckert, who got funding from the war department after promising they could build a machine that would replace all the "computers", meaning the women who were employed calculating the firing tables for the army's artillery guns.

Eckert and Mauchly next teamed up with the mathematician John von Neumann to design EDVAC, which pioneered the stored-program concept. Because he was the first to publish a description of this new computer, von Neumann is often wrongly credited with the realization that the program (that is, the sequence of computation steps) could be represented electronically just as the data was.

By the end of the 1950s computers were no longer one-of-a-kind hand-built devices owned only by universities and government research labs. Eckert and Mauchly left the University of Pennsylvania over a dispute about who owned the patents for their invention and set up their own company. Their first product was the famous UNIVAC computer, the first commercial (that is, mass-produced) computer. In the 1950s, UNIVAC (a contraction of "Universal Automatic Computer") was the household word for "computer" just as "Kleenex" is for "tissue". The first UNIVAC was sold, appropriately enough, to the Census Bureau. UNIVAC was also the first computer to employ magnetic tape.

Later, IBM made its own decision to hire an unknown but aggressive firm called Microsoft to provide the software for their personal computer (PC). This lucrative contract allowed Microsoft to grow so dominant that by the year 2000 their market capitalization (the total value of their stock) was twice that of IBM, and they were convicted in Federal Court of running an illegal monopoly.

There were two ways to interact with a mainframe. The first was called time sharing, because the computer gave each user a tiny sliver of time in a round-robin fashion.
A teletype was a motorized typewriter that could transmit your keystrokes to the mainframe and then print the computer's response on its roll of paper. The alternative to time sharing was batch-mode processing, where the computer gives its full attention to your program. In exchange for getting the computer's full attention at run time, you had to agree to prepare your program off-line on a keypunch machine, which generated punch cards.

By the 1990s a university student would typically own his or her own computer and have exclusive use of it in a dorm room. This transformation was a result of the invention of the microprocessor. A microprocessor (uP) is a computer that is fabricated on an integrated circuit (IC). Computers had been around for 20 years before the first microprocessor was developed at Intel in 1971.

Intel didn't invent the electronic computer, but they were the first to succeed in cramming an entire computer onto a single chip (IC). Intel was started in 1968 and initially produced only semiconductor memory (Intel invented both the DRAM and the EPROM, two memory technologies that are still going strong today). In 1969 they were approached by Busicom, a Japanese manufacturer of high-performance calculators (these were typewriter-sized units; the first shirt-pocket-sized scientific calculator was the Hewlett-Packard HP-35, introduced in 1972). Busicom wanted Intel to produce 12 custom calculator chips: one chip dedicated to the keyboard, another dedicated to the display, another for the printer, and so on. But integrated circuits were (and are) expensive to design, and this approach would have required Busicom to bear the full expense of developing 12 new chips, since these 12 chips would only be of use to them.

A new Intel employee, Ted Hoff, convinced Busicom to instead accept a general-purpose computer chip which, like all computers, could be reprogrammed for many different tasks (like controlling a keyboard, a display, or a printer). Intel argued that since the chip could be reprogrammed for alternative purposes, the cost of developing it could be spread out over more users and hence would be less expensive to each user. The general-purpose computer is adapted to each new purpose by writing a program, which is a sequence of instructions stored in memory (which happened to be Intel's forte).

The result was the Intel 4004, the first microprocessor (uP). The 4004 consisted of 2,300 transistors and was clocked at 108 kHz (i.e., 108,000 cycles per second). Compare this to the 42 million transistors and the 2 GHz clock rate (i.e., 2,000,000,000 cycles per second) used in a Pentium 4. One of Intel's 4004 chips still functions aboard the Pioneer 10 spacecraft, which is now the man-made object farthest from the Earth. Curiously, Busicom went bankrupt and never ended up using the ground-breaking microprocessor.

Intel followed the 4004 with the 8008 and 8080. The 8080 was employed in the MITS Altair computer, the world's first personal computer (PC). A Harvard freshman by the name of Bill Gates decided to drop out of college so he could concentrate all his time on writing programs for this computer. This early experience put Bill Gates in the right place at the right time once IBM decided to standardize on the Intel microprocessors for their line of PCs in 1981. The Intel Pentium 4 used in today's PCs is still compatible with the Intel 8088 used in IBM's first PC. In 1982, Peter Norton created Norton Utilities.
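The scale of that 4004-to-Pentium comparison is easier to appreciate with a little arithmetic. The sketch below simply computes the ratios from the figures quoted above:

    # Compare the Intel 4004 (1971) with a Pentium 4,
    # using the figures quoted in the text above.
    transistors_4004 = 2_300
    transistors_p4 = 42_000_000
    clock_4004_hz = 108_000            # 108 kHz
    clock_p4_hz = 2_000_000_000        # 2 GHz

    print(f"Transistors: {transistors_p4 / transistors_4004:,.0f}x more")
    print(f"Clock rate:  {clock_p4_hz / clock_4004_hz:,.0f}x faster")

Both ratios land near 18,000x: three decades of exponential progress captured in two divisions.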

History of Computer

Abacus - Known as the first invented manual data-processing device, the abacus is a useful manual mathematical computer. A well-trained abacus operator can perform addition and subtraction problems faster than a person equipped with a hand calculator, but is slower in multiplication and division. The oldest surviving abacus was used in 300 B.C. by the Babylonians, though the device is often wrongly attributed solely to China.

John Napier - A Scottish mathematician known for his invention of logarithms in the early 1600s, a technology that allows multiplication to be computed through addition. This is possible by applying the logarithm of each operand, originally obtained from a printed table.

Napier's Bones - A device developed by John Napier. It consists of a set of eleven rods made of ivory with numbers carved on them. It can perform multiplication and division by simply placing the rods side by side.

William Oughtred - An English mathematician who developed the slide rule.

Oughtred's Slide Rule - First built in England, it consists of two movable rulers placed side by side; by sliding the rulers you can quickly obtain the product or quotient of numbers.

Blaise Pascal - A seventeenth-century French mathematician and scientist, one of the first modern scientists to develop and build a calculator. In the mid-1600s he invented the Pascaline as an aid for his father, who was a tax collector.

Pascaline, or Pascal's Calculator - A device that could perform addition and subtraction of numbers of up to eight digits. A Pascaline consisted of gears and cylinders which rotated to display the numerical result.

Gottfried Wilhelm von Leibniz - A German scientist, co-inventor of calculus with Newton, who managed to build a calculator that could perform the four basic functions: addition, subtraction, multiplication, and division. Leibniz was the first to advocate the use of the binary number system, which is fundamental to the operation of modern computers.

Leibniz's Calculator - Considered a modified version of Pascal's Calculator, it uses the same concept for adding and subtracting numbers. It can also perform multiplication and division and extract square roots of numbers.

Charles Babbage - An English mathematician of the nineteenth century who proposed a steam-driven calculating machine in the early 1800s. He is considered the Father of the Modern Computer.

Babbage's Difference Engine - This machine would be able to compute tables of numbers, such as logarithm tables, and was designed to automate a standard procedure for calculating the roots of polynomials. Unfortunately, the construction of Babbage's Difference Engine was very complicated and very expensive, making it the most expensive government-funded project of its time. After ten years the device remained incomplete; funding dried up and it was abandoned.

Babbage's Analytical Engine - After abandoning the Difference Engine, Babbage designed a more powerful mechanical computing device, the Analytical Engine. The device had two main parts, the Store and the Mill, as Babbage called them; both terms are borrowed from the weaving industry. Numbers were held in the Store, and the Mill was where they were woven into new results. These two main parts correspond in the modern computer to the memory unit and the central processing unit (CPU). Unfortunately, Babbage couldn't get funding to develop the precisely machined gears, wheels, and lever systems of the machine. Although he was never able to build the device, his ideas included many concepts and features that were later incorporated into present computers.

Augusta Ada Byron, Lady Lovelace - The daughter of the illustrious poet Lord Byron, born on December 10, 1815, Ada worked with Babbage. Writing a plan for how the Analytical Engine might calculate Bernoulli numbers, she produced a series of Notes in which she demonstrated a sequence of instructions she had prepared for the Analytical Engine. This plan is now regarded as the first computer program, which is why many refer to her as the First Programmer. In her honor, a software language developed by the U.S. Department of Defense in the late 1900s was named Ada.

Herman Hollerith - A statistician with the US Bureau of the Census. The Census Bureau offered a prize for an inventor to help process the results of the 1890 census; it was won by Herman Hollerith, who completed his machine and successfully adopted Jacquard's punched cards for the purpose of computation.

Hollerith's Punched-Card Machine - An electromechanical counting machine invented by Herman Hollerith. It used punch cards to sort and tabulate the data during the 1890 US census. It has a card reader which senses the holes in the cards, a gear-driven mechanism for counting, and a large wall of dial indicators to display the results.

Mark I - Developed by Howard Aiken.
- The official name is Automatic Sequence Controlled Calculator.
- Approximately 50 feet long and 8 feet high.
- Could perform the four basic arithmetic operations.
- Could locate information stored in tabular form.
- Processed numbers of up to 23 digits.
- Could multiply three eight-digit numbers in one second.

ENIAC (Electronic Numerical Integrator and Calculator)
- Developed by J. Presper Eckert Jr. and John Mauchly.
- The first large-scale vacuum-tube computer.
- The first general-purpose digital electronic computer that performed a variety of computational tasks.
- Originally built for the US military to calculate ballistic tables for aiming their big guns.
- Consisted of over 18,000 vacuum tubes, used 200 kilowatts of electrical power, and required manually set switches to achieve desired results.
- Could perform 300 multiplications per second.
- Was a thousand times faster than the best mechanical calculators.
- Its memory could only store 20 ten-digit numbers.

EDVAC (Electronic Discrete Variable Automatic Computer)
- Developed by John von Neumann.
- Employed binary arithmetic.
- Had stored-program capability.
- Had a permanent wiring of sets of instructions within the computer, with these operations placed under a central control.

EDSAC (Electronic Delay Storage Automatic Calculator)
- Built by Maurice V. Wilkes and his team at the University of Cambridge in England and completed in 1949.
- One of the first stored-program computers and one of the first to use binary digits.
- Consisted of 512 36-bit words of liquid-mercury delay-line memory; its input and output were provided by paper tape.
- Could do about 700 additions per second and 200 multiplications per second.
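Leibniz's advocacy of binary (above) and EDSAC's use of binary digits rest on the same idea: any number can be written using only 0s and 1s. A minimal illustration:

    # Binary representation: the idea Leibniz advocated and
    # machines like EDSAC were built on.
    n = 23
    print(bin(n))            # 0b10111 -> 16 + 4 + 2 + 1 = 23
    print(int("10111", 2))   # 23, converting back from binary

Inside a machine, each 0 or 1 maps onto a physical state, such as the presence or absence of a pulse in EDSAC's mercury delay lines.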

UNIVAC (Universal Automatic Computer)
- Developed at the Eckert-Mauchly Computer Corporation, which became part of the Remington Rand Corporation.
- Manufactured as the first commercially available first-generation computer worldwide.
- In 1947, John Mauchly chose the name UNIVAC for his company's product.
- UNIVAC was designed by J. Presper Eckert and John Mauchly (designers of the ENIAC).

IBM (International Business Machines)
- By 1960, IBM was the dominant force in the market for large mainframe computers.
- IBM sometimes refers to the IBM 650 as its first computer, although it is predated by at least the ASCC (1943) and SSEC (1947), which were not products, and the 701 (1952), which definitely was. It is more accurate to call the 650 IBM's first commercial business computer (since the 701 was intended for scientific use), and the first computer to make a meaningful profit. In any case, the IBM 650 was the first general-purpose computer to be installed and used at Columbia University.

First Generation Computers (1951-1959)
- Consisted of vacuum tubes for storing data in memory and used stored programs.
- Vacuum tubes consume lots of electrical power and are prone to burning out, which caused problems for early computers that used thousands of them.

Second Generation Computers (1959-1963)
- Consisted of solid-state transistors and diodes.
- Transistors replaced the vacuum tube as the electrical switching device in computers. The transistor (developed at Bell Labs by William Shockley and others in the late 1940s) was a solid-state semiconductor device typically made of silicon or germanium. It was much smaller, much more reliable, and consumed much less energy than a vacuum tube.

Third Generation Computers (1963-1975)
- Used integrated solid-state circuitry: the integrated circuit.
- The integrated circuit (IC) was invented by Jack Kilby and Robert Noyce.
- An integrated circuit incorporates many transistors and other electrical components, all formed into a miniature circuit on a single chip of silicon.
- The invention of the integrated circuit allowed computers to become even smaller, with the whole central processing unit (CPU) of the computer fitting onto one circuit board. These minicomputers were cheaper and smaller than a mainframe (roughly the size of a drawer in a large filing cabinet).

Fourth Generation Computers (1975-present)
- Microprocessors.
- Multiprocessing, multiprogramming, miniaturization, time-sharing, higher operating speeds, and virtual storage.

Fifth Generation (?)
- Artificial Intelligence (AI), Virtual Reality, Expert Systems.
- Deals with intelligent behavior, learning, and adaptation in machines.
- AI is concerned with producing machines that automate tasks requiring intelligent behavior.
