Evolution of Computing
HISTORY OF THE COMPUTER
The word "computer" is derived from "computing". The origins of the modern science we call "Computer Science" can be traced back to a distant age when man still dwelled in caves or in the forest, and lived in groups for protection and survival against the harsher elements of the Earth.
It was the shaman who decided when to hold both the secret and the public religious ceremonies, and who interceded with the spirits on behalf of the tribe. To hold the ceremonies correctly and ensure a good harvest in the fall and fertility in the spring, the shamans needed to be able to count the days and track the seasons. From this shamanistic tradition, man developed the first primitive counting mechanisms -- counting notches on sticks or marks on walls.
As computing became more and more complicated, the first computing device came into being: the abacus.
ABACUS
The first actual calculating mechanism known to us is the abacus, which is thought to have been invented by the Babylonians sometime between 1,000 BC and 500 BC, although some pundits are of the opinion that it was actually invented by the Chinese.
Irrespective of the source, the original concept referred to a flat stone covered with sand (or dust) into which numeric symbols were drawn. The first abacus was almost certainly based on such a stone, with pebbles being placed on lines drawn in the sand. Over time the stone was replaced by a wooden frame supporting thin sticks, braided hair, or leather thongs, onto which clay beads or pebbles with holes were threaded.
A variety of different types of abacus were developed, but the most popular became those based on the bi-quinary system, which utilizes a combination of two bases (base-2 and base-5) to represent decimal numbers. Although the abacus does not qualify as a mechanical calculator, it certainly stands proud as one of first mechanical aids to calculation.
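To make the bi-quinary idea concrete, here is a minimal sketch (in Python, purely for illustration and not tied to any particular historical abacus): each decimal digit is split into a count of fives and a count of ones, much like the two groups of beads on a bead-frame abacus. The names to_biquinary and from_biquinary are invented for this example.

```python
# Sketch: bi-quinary representation of decimal digits (base-5 plus base-2),
# the scheme used by many bead-frame abaci. Names are invented for this example.

def to_biquinary(digit: int) -> tuple[int, int]:
    """Split a decimal digit 0-9 into (fives, ones), where fives is 0 or 1 and ones is 0-4."""
    if not 0 <= digit <= 9:
        raise ValueError("digit must be between 0 and 9")
    return digit // 5, digit % 5

def from_biquinary(fives: int, ones: int) -> int:
    """Recombine a bi-quinary pair into a decimal digit."""
    return 5 * fives + ones

# The number 374, one abacus column per digit.
for d in (3, 7, 4):
    fives, ones = to_biquinary(d)
    print(f"digit {d}: {fives} five-bead(s) + {ones} one-bead(s)")
```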
JOHN NAPIER CALCULATING DEVICE (1550-1617)
John Napier, a Scottish mathematician, developed the rules of logarithms, which are very useful in mathematics and computer technology. Napier also designed the logarithm table, which brought a revolutionary change to mathematics and computing. Napier's invention led directly to the slide rule, first built in England in 1632 and still in use in the 1960s by the NASA engineers of the Mercury, Gemini, and Apollo programs which landed men on the moon. The slide rule was used for sine, cosine, tangent and other trigonometric and arithmetic calculations.
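The reason logarithms (and the slide rule built on them) speed up calculation is the identity log(a×b) = log(a) + log(b): multiplication is reduced to the addition of lengths on a scale. Below is a minimal sketch, using only Python's standard math module and assuming base-10 logarithms as on a typical slide rule.

```python
# Sketch: the logarithm identity behind the slide rule.
# A slide rule adds two lengths proportional to log(a) and log(b);
# reading the combined length off the scale recovers the product a * b.
import math

a, b = 37.0, 24.0

log_sum = math.log10(a) + math.log10(b)   # "sliding" one scale along the other
product_from_logs = 10 ** log_sum         # reading the answer off the scale

print(product_from_logs)   # ~888.0 (limited here only by floating-point rounding;
print(a * b)               # a real slide rule was limited by how finely it was engraved)
```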
BLAISE PASCAL CALCULATING DEVICE (1623-1662)
In 1642 Blaise Pascal, at age 19, invented the Pascaline as an aid for his father, who was a tax collector. Pascal built 50 of these gear-driven one-function calculators (they could only add) but couldn't sell many because of their exorbitant cost and because they really weren't that accurate (at that time it was not possible to fabricate gears with the required precision).
Up until the present age, when car dashboards went digital, the odometer portion of a car's speedometer used the very same mechanism as the Pascaline to increment the next wheel after each full revolution of the prior wheel. Pascal was a child prodigy: at the age of 12, he was discovered doing his own version of Euclid's thirty-second proposition on the kitchen floor. Pascal went on to invent probability theory, the hydraulic press, and the syringe. Shown below is an 8-digit version of the Pascaline, and two views of a 6-digit version.
Blaise Pascal Calculating Device
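The carrying principle described above, where each wheel advances the next after a full revolution, can be sketched in a few lines. This is only an illustration of the idea, not a model of Pascal's actual gearing, and the increment function is a name invented for the example.

```python
# Sketch: odometer/Pascaline-style carrying. Each "wheel" holds a digit 0-9;
# when a wheel rolls over from 9 to 0, it advances the next wheel by one.

def increment(wheels: list[int]) -> list[int]:
    """Add 1 to a number stored as digit wheels, least significant wheel first."""
    i = 0
    while i < len(wheels):
        wheels[i] += 1
        if wheels[i] < 10:       # no rollover, so no carry is needed
            break
        wheels[i] = 0            # full revolution: reset this wheel and carry
        i += 1
    return wheels

# Count from 198 upwards (stored as [8, 9, 1], least significant digit first).
wheels = [8, 9, 1]
for _ in range(3):
    increment(wheels)
    print("".join(str(d) for d in reversed(wheels)))   # prints 199, 200, 201
```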
POLYMATH GOTTFRIED LEIBNIZ CALCULATING DEVICE (1646-1716)
The great polymath Gottfried Leibniz was one of the first men who dreamed of a logical (thinking) device. Moreover, Leibniz tried to combine the principles of arithmetic with the principles of logic, and imagined the computer as something more than a calculator: a logical or thinking machine.
He also discovered that computing processes can be carried out much more easily with binary number coding. He even described a calculating machine which works via the binary system: a machine without wheels or cylinders, using only balls, holes, sticks, and canals for the transport of the balls.
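Leibniz's observation that binary coding simplifies computation is easy to demonstrate. The sketch below (illustrative only, and certainly not a model of his ball-and-canal machine) writes a number using only the symbols 0 and 1, and shows that addition then needs only a handful of rules.

```python
# Sketch: binary number coding in the spirit of Leibniz's binary arithmetic.

def to_binary(n: int) -> str:
    """Write a non-negative integer using only the symbols 0 and 1."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))   # the remainder gives the next binary digit
        n //= 2
    return "".join(reversed(bits))

print(to_binary(13))       # 1101
print(to_binary(6))        # 110

# Binary addition needs only a handful of rules (0+0=0, 0+1=1, 1+1=10 with a carry),
# which is exactly why it suits very simple mechanisms.
print(to_binary(13 + 6))   # 10011, i.e. 19
```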
JOSEPH MARIE JACQUARD CALCULATING DEVICE (1752-1834)
Joseph Marie Jacquard (1752-1834) was a French silk weaver and inventor who improved on the original punched-card design of Jacques de Vaucanson's loom of 1745 to invent the Jacquard loom mechanism in 1804-1805. Jacquard's loom mechanism is controlled by recorded patterns of holes in a string of cards, and allows what is now known as Jacquard weaving of intricate patterns.
CHARLES XAVIER CALCULATING DEVICE (1785-1870)
Charles Xavier Thomas de Colmar invented the first calculating machine to be produced in large numbers. This invention came about in France in 1820 as part of a national competition and the machine was called the Arithmometer.
CHARLES BABBAGE CALCULATING DEVICE (1791-1871)
The first glimmer of a "thinking machine" came in the 1830s, when British mathematician Charles Babbage envisioned what he called the Analytical Engine. Charles Babbage is considered the "Father of Computing". Babbage was a highly regarded professor of mathematics at Cambridge University when he resigned his position to devote all of his energies to his revolutionary idea.
In Babbage's time, the complex mathematical tables used by ship's captains to navigate the seas, as well as many other intricate computations, had to be calculated by teams of mathematicians who were called computers.
No matter how painstaking these human computers were, their tables were often full of errors. Babbage wanted to create a machine that could automatically calculate a mathematical chart or table in much less time and with more accuracy.
His mechanical computer, designed with cogs and gears and powered by steam, was capable of performing multiple tasks by simple reprogramming—or changing the instructions given to the computer.
LADY AUGUSTA ADA (1816-1852)
Lady Augusta Ada is mainly known for having written a description of Charles Babbage's early mechanical general-purpose computer, the Analytical Engine. The Ada programming language, developed for the US government, was named in her honour. The standard was originally known as Ada83, but this is now obsolete, as it was later "overhauled" and re-born as Ada95, which is now the preferred standard and implementation of the Ada programming language.
HERMAN HOLLERITH (1860-1929)
In 1890 Herman Hollerith developed the punched card system to store data. The punched card system was an important step in the development of the computer. His idea was quite different from the principles already known from Babbage or Colmar: he borrowed the working method of the ticket punch used by conductors on trains. His machine was so successful that he started his own business to sell the product; the company was later called International Business Machines (IBM). However, the original cards could not be used for complicated calculations.
Turing Machine – 1936
Introduced by Alan Turing in 1936, Turing machines are one of the key abstractions used in modern computability theory, the study of what computers can and cannot do. A Turing machine is a particularly simple kind of computer, one whose operations are limited to reading and writing symbols on a tape, or moving along the tape to the left or right. The tape is marked off into squares, each of which can be filled with at most one symbol. At any given point in its operation, the Turing machine can only read or write on one of these squares, the square located directly below its "read/write" head.
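To make the read/write/move description concrete, here is a minimal Turing machine simulator. It is an illustrative sketch rather than Turing's original formulation, and the example transition table simply flips 0s and 1s until it reaches a blank square.

```python
# Sketch: a minimal Turing machine. The machine reads the symbol under its head,
# consults a transition table, writes a symbol, moves left or right, and changes state.

def run_turing_machine(tape, transitions, state="start", blank="_", max_steps=1000):
    tape = dict(enumerate(tape))    # tape squares indexed by position
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Example machine: invert a string of 0s and 1s, halting at the first blank square.
transitions = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("10110", transitions))   # prints 01001_
```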
Turing Test – 1950
The Turing Test is a test proposed to determine whether a computer has the ability to think. In 1950, Alan Turing (Turing, 1950) proposed a method for determining whether machines can think; this method is known as the Turing Test.
The test is conducted with two people and a machine. One person plays the role of an interrogator and is in a separate room from the machine and the other person. The interrogator knows the person and the machine only as A and B, and does not know which is the person and which is the machine. Using a teletype, the interrogator can ask A and B any question he/she wishes. The aim of the interrogator is to determine which is the person and which is the machine; the aim of the machine is to fool the interrogator into thinking that it is a person. If the machine succeeds, then we can conclude that machines can think.
Vacuum Tube – 1904
A vacuum tube is just that: a glass tube surrounding a vacuum (an area from which all gases have been removed). What makes it interesting is that when electrical contacts are put on the ends, you can get a current to flow through that vacuum.
A British scientist named John A. Fleming made a vacuum tube known today as a diode; at the time, the diode was known as a "valve."
ABC – 1939
The Atanasoff-Berry Computer (ABC) was the world's first electronic digital computer. It was built by John Vincent Atanasoff and Clifford Berry at Iowa State University during 1937-42. It incorporated several major innovations in computing, including the use of binary arithmetic, regenerative memory, parallel processing, and separation of memory and computing functions.
Boolean Algebra
English mathematician George Boole set up a system called Boolean algebra, wherein logical problems are solved like algebraic problems. Boole's theories would go on to form the bedrock of computer science.
The creation of an algebra of symbolic logic was the work of another mathematical prodigy and British individualist. As Bertrand Russell remarked seventy years later, Boole invented pure mathematics. The design of circuits is arranged by logical statements, and these statements return zero (0) or one (1); this is called binary language.
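Because the text notes that circuit design rests on logical statements returning zero or one, a tiny example may help: the half-adder below adds two one-bit numbers using nothing but Boolean operations. This is a modern illustration of the idea, not a claim about Boole's own notation.

```python
# Sketch: Boolean algebra in action. A half-adder adds two single bits
# using only logical operations; every value is a 0 or a 1.

def AND(a, b): return a & b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Return (sum_bit, carry_bit) for two input bits."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} = carry {c}, sum {s}")
```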
Harvard Mark 1 – 1943
Howard Aiken and Grace Hopper designed the MARK series of computers at Harvard University. The MARK series of computers began with the Mark I in 1944. Imagine a giant roomful of noisy, clicking metal parts, 55 feet long and 8 feet high. The 5-ton device contained almost 760,000 separate pieces. Used by the US Navy for gunnery and ballistic calculations, the Mark I was in operation until 1959.
The computer, controlled by pre-punched paper tape, could carry out addition, subtraction, multiplication, division and reference to previous results. It had special subroutines for logarithms and trigonometric functions and used 23 decimal place numbers. Data was stored and counted mechanically using 3000 decimal storage wheels, 1400 rotary dial switches, and 500 miles of wire. Its electromagnetic relays classified the machine as a relay computer. All output was displayed on an electric typewriter. By today's standards, the Mark I was slow, requiring 3-5 seconds for a multiplication operation.
ENIAC – 1946
ENIAC (Electronic Numerical Integrator And Calculator). The U.S. military sponsored the research; they needed a calculating device for writing artillery-firing tables (the settings used for different weapons under varied conditions for target accuracy).
John Mauchly was the chief consultant and J Presper Eckert was the chief engineer. Eckert was a graduate student studying at the Moore School when he met John Mauchly in 1943. It took the team about one year to design the ENIAC and 18 months and 500,000 tax dollars to build it.
The ENIAC contained 17,468 vacuum tubes, along with 70,000 resistors and 10,000 capacitors.
EDVAC (1946-1952)
In 1944, while working as a research associate at the Moore School, Dr John Von Neumann worked on the EDVAC (Electronic Discrete Variable Automatic Computer), greatly advancing the functions of its predecessor. Completed in 1952, EDVAC had an internal memory for storing programs, used only 3,600 vacuum tubes, and took up a mere 490 square feet (45 sq. m).
He undertook a study of computation that demonstrated that a computer could have a simple, fixed structure, yet be able to execute any kind of computation given properly programmed control without the need for hardware modification.
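A rough sketch of that stored-program idea: the toy machine below has a fixed structure (a fetch-decode-execute loop that never changes), yet it performs different computations simply by loading different instructions into the same memory that holds the data. The four-instruction set (LOAD, ADD, STORE, HALT) is invented purely for illustration and is not the EDVAC's.

```python
# Sketch: a toy stored-program machine in the von Neumann spirit.
# Program and data live in the same memory; the hardware loop never changes,
# only the stored instructions do.

def run(memory):
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, arg = memory[pc]            # fetch and decode
        pc += 1
        if op == "LOAD":
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# A tiny program: memory[10] = memory[8] + memory[9].
memory = {
    0: ("LOAD", 8), 1: ("ADD", 9), 2: ("STORE", 10), 3: ("HALT", 0),
    8: 2, 9: 3, 10: 0,
}
print(run(memory)[10])   # prints 5; a different program in the same memory
                         # would make the same "hardware" compute something else
```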
EDSAC (1946-1952)
EDSAC, which stands for Electronic Delay Storage Automatic Calculator, was an early British computer. The machine, having been inspired by John von Neumann's seminal EDVAC report, was constructed by Professor Sir Maurice Wilkes and his team at the University of Cambridge Mathematical Laboratory in England. EDSAC was the world's first practical stored-program electronic computer, although not the first stored-program computer (that honor goes to the Small-Scale Experimental Machine). The project was supported by J. Lyons & Co. Ltd., a British firm, who were rewarded with the first commercially applied computer, LEO I, based on the EDSAC design. EDSAC ran its first programs on May 6, 1949, calculating a table of squares and a list of prime numbers.
Transistor – 1947
Compared with the vacuum tube, the transistor offered:
smaller size
better reliability
lower power consumption
lower cost
Floppy Disk – 1950
The floppy disk was invented at the Imperial University in Tokyo by Yoshiro Nakamats.
UNIVAC 1 – 1951
The first commercially successful electronic computer, UNIVAC I, was also the first general-purpose computer, designed to handle both numeric and textual information. It was designed by J. Presper Eckert and John Mauchly. The implementation of this machine marked the real beginning of the computer era.
Remington Rand delivered the first UNIVAC machine to the U.S. Bureau of Census in 1951. This machine used magnetic tape for input.
first successful commercial computer
design was derived from the ENIAC (same developers)
first client = U.S. Bureau of the Census
$1 million
48 systems built
GRACE MURRAY HOPPER
Grace Murray Hopper, an employee of Remington-Rand, worked on the UNIVAC. She took up the concept of reusable software in her 1952 paper entitled "The Education of a Computer" and developed the first software that could translate symbols of higher computer languages into machine language (a compiler).
ARPANET
The Advanced Research Projects Agency was formed with an emphasis towards research, and thus was not oriented only to a military product. The formation of this agency was part of the U.S. reaction to the then Soviet Union's launch of Sputnik in 1957 (ARPA draft, III-6). ARPA was assigned to research how to utilize their investment in computers via Command and Control Research (CCR). Dr. J.C.R. Licklider was chosen to head this effort.
Developed for the US DoD Advanced Research Projects Agency, ARPANET eventually connected some 60,000 computers for communication among research organizations and universities.
Intel 4004 – 1971
The 4004 was the world's first universal microprocessor. In the late 1960s, many scientists had discussed the possibility of a computer on a chip, but nearly everyone felt that integrated circuit technology was not yet ready to support such a chip. Intel's Ted Hoff felt differently; he was the first person to recognize that the new silicon-gated MOS technology might make a single-chip CPU (central processing unit) possible.
Hoff and the Intel team developed such an architecture with just over 2,300 transistors in an area of only 3 by 4 millimeters. With its 4-bit CPU, command register, decoder, decoding control, control monitoring of machine commands and interim register, the 4004 was one heck of a little invention. Today's 64-bit microprocessors are still based on similar designs, and the microprocessor is still the most complex mass-produced product ever with more than 5.5 million transistors performing hundreds of millions of calculations each second - numbers that are sure to be outdated fast.
Altair 8800 – 1975
By 1975 the market for the personal computer was demanding a product that did not require an electrical engineering background, and thus the first mass-produced and marketed personal computer (available both as a kit or assembled) was welcomed with open arms. Developers Edward Roberts, William Yates and Jim Bybee spent 1973-1974 developing the MITS (Micro Instrumentation and Telemetry Systems) Altair 8800. The price was $375; it contained 256 bytes of memory (not 256K), but had no keyboard, no display, and no auxiliary storage device. Later, Bill Gates and Paul Allen wrote their first product for the Altair: a BASIC interpreter. (The Altair itself was named after a planet in a Star Trek episode.)
Cray 1 – 1976
It looked like no other computer before, or for that matter, since. The Cray 1 was the world's first "supercomputer," a machine that leapfrogged existing technology when it was introduced in 1976.
And back then, you couldn't just order up fast processors from Intel. "There weren't any microprocessors," says Gwen Bell of The Computer Museum History Center. "These individual integrated circuits that are on the board performed different functions."
Each Cray 1, like this one at The Computer Museum History Center, took months to build. The hundreds of boards and thousands of wires had to fit just right. "It was really a hand-crafted machine," adds Bell. "You think of all these wires as a kind of mess, but each one has a precise length."
IBM PC – 1981
On August 12, 1981, IBM released their new computer, named the IBM PC. The "PC" stood for "personal computer", making IBM responsible for popularizing the term "PC".
Apple Macintosh – 1984
Apple introduced the Macintosh to the nation on January 22, 1984. The original Macintosh had 128 kilobytes of RAM, although this first model was simply called "Macintosh" until the 512K model came out in September 1984. The Macintosh retailed for $2495. It wasn't until the Macintosh that the general population really became aware of the mouse-driven graphical user interface.
Quantum Computing
By Neil Gershenfeld and Isaac L. Chuang. Factoring a number with 400 digits -- a numerical feat needed to break some security codes -- would take even the fastest supercomputer in existence billions of years. But a newly conceived type of computer, one that exploits quantum-mechanical interactions, might complete the task in a year or so, thereby defeating many of the most sophisticated encryption schemes in use. Sensitive data are safe for the time being, because no one has been able to build a practical quantum computer. But researchers have now demonstrated the feasibility of this approach. Such a computer would look nothing like the machine that sits on your desk; surprisingly, it might resemble the cup of coffee at its side.
Several research groups believe quantum computers based on the molecules in a liquid might one day overcome many of the limits facing conventional computers. Roadblocks to improving conventional computers will ultimately arise from the fundamental physical bounds to miniaturization (for example, because transistors and electrical wiring cannot be made slimmer than the width of an atom). Or they may come about for practical reasons--most likely because the facilities for fabricating still more powerful microchips will become prohibitively expensive. Yet the magic of quantum mechanics might solve both these problems.