Man first used his fingers, and then his toes, to count things. Gradually he needed more than his fingers and toes for the many things he had to count, and early man counted his cattle with pebbles. He graduated to the first calculating device with the development of the abacus by the Chinese. The abacus is a rectangular wooden frame with beads on horizontal rods, and it is still used today by nursery children plodding through elementary arithmetic lessons.
From the abacus, man moved on to the odometer, now called the speedometer, which led to the development of mechanical adders and multipliers. The next stage was John Napier's logarithm tables (1617), which simplified computing techniques and led to the development of the slide rule. The slide rule used Napier's logarithmic scales on two rules, one of which slides against the other in a groove, facilitating rapid calculation. In 1680, the slide rule was superseded by the mechanical calculator. The need to do a large amount of complex computing at increasingly faster rates was forcing man to develop machines to do the counting for him.
FIRST MACHINE
The first mechanical calculating machine was made by the French mathematician Blaise Pascal. The machine consisted of gears, wheels, and dials. Each wheel had 10 segments and worked on the principle that when one wheel completed a full rotation, the next wheel moved on by one segment. His calculator could perform addition and subtraction by dialling this series of wheels, which bore the numbers 0 to 9 around them. Pascal's calculator was later modified by a German, Gottfried Leibniz. In the early nineteenth century, Charles Babbage made a machine called the 'Difference Engine', which could evaluate algebraic expressions and mathematical tables accurately to 20 decimal places. Babbage's machine could be called a Neanderthal computer in comparison to today's supercomputers. Later, automatic computing machines were developed that could do additions at the rate of 80 per minute and also had a memory.
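The carrying principle behind these wheels can be sketched in a few lines of present-day code. This is only a hypothetical illustration of the idea described above, not a description of Pascal's actual mechanism: each wheel holds a digit, and a full rotation of one wheel advances the next wheel by one segment.

    # Hypothetical sketch of the wheel-and-carry principle described above.
    # wheels[0] is the units wheel; each wheel holds a digit from 0 to 9.
    def add_one(wheels):
        i = 0
        while i < len(wheels):
            wheels[i] = (wheels[i] + 1) % 10   # turn this wheel one segment
            if wheels[i] != 0:                 # no full rotation, so no carry
                break
            i += 1                             # full rotation: advance the next wheel
        return wheels

    print(add_one([9, 9, 0]))  # 99 + 1 -> [0, 0, 1], i.e. 100 (units digit first)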
Another genius in the field of computing was Dr. Herman Hollerith of the USA, who added to the arithmetic capability of the existing calculator the ability to store intermediate results for subsequent calculations, thus eliminating the copying and re-entering of data. Data were entered, together with the sequence of operations to be performed (which we now call a program), automatically from sets of cards compiled together. Cards are even today a common medium for feeding information into computers.
EARLY COMPUTERS
The late 1930s saw the development of various types of computers, but all were mechanical machines. In 1948 an electrically operated computer was set up at Harvard University. After World War II, significant improvements were made in computer making. The rapid development of computers after the war can be traced through five separate stages, which are normally referred to as generations.
FIRST GENERATION COMPUTER
These computers were made using valves like the ones in a radio set. The Electronic Numerical Integrator and Calculator, popularly known as ENIAC, was the first electronic calculator of this generation. It could perform 5,000 additions or 350 multiplications in a second. The first generation computers were voluminous and had a slow operating speed.
SECOND GENERATION
The invention of the transistor heralded the second generation computer. Transistors are small electronic devices made of semiconductors, often used in place of the thermionic valves found in radio sets. Only a very low voltage is required to operate these transistors, which are capable of amplifying small currents. The use of transistors instead of valves reduced the size and manufacturing cost of computers.
THIRD GENERATION
The third generation saw the birth of the integrated circuit, or chip, in which several transistors were integrated with other components and sealed into a small package. The use of integrated circuits (ICs) further reduced the size of computers. Computers of this type were called minicomputers.
FOURTH GENERATION
In third-generation computers, the chips or ICs used were large and expensive. Scientists were struck by the novel idea of placing all the components of an entire computer on a single piece of semiconductor. It was a daring conceptual move, and after much wrestling with the design the results bore fruit. A chip carrying the entire computer circuitry on a single piece of semiconductor is called a microprocessor, and a computer built around such chips is called a microcomputer. The Japanese succeeded in producing the pocket calculators we now use; present-day pocket calculators are the progeny of the Japanese microcomputer.
FIFTH GENERATION
The fifth-generation computer is a new super breed of computer. These machines will hold information and will be able to think and make decisions. The human brain is considered the most extraordinary computer, put together over thousands of years. Questions like "Can a machine think?" and "What will be the nature of its intelligence?" have always intrigued scientists.
MICROPROCESSOR
The invention of the microprocessor has taken man closer to producing artificial intelligence. Now an electronic chip can do everything from guiding a missile to roasting bread, yet most scientists regard chips as 'dumb brutes', as they do only what they are told to do. Without the kind of perception that lets them grasp the effectiveness of a poem or the incongruity of a response, one cannot say that intelligence is present.
More routine repetition of steps, as might be involved in adding numbers or solving equations, relies more on mechanics than on intellect. There is an ongoing race to develop the 'intelligent computer', popularly known as artificial intelligence. There are two areas of behavior that are reasonable grounds for classifying behavior as intelligent and which can be elicited from the computer: learning and reasoning. Scientists have been able to teach computers to play chess and checkers. They have been able to develop programs that can create in a computer the will to win and chalk out moves in advance. Thus a computer can learn to play.
A computer can learn by watching others, by reading, by being told, and by trial and error, but it also has some pre-programmed knowledge, which is gained by being told or fed data. A computer's thought or decision-making process takes place through a path of if-then-else constructions, i.e. if this is true, or that is false, or something else is true, then do this. To take an example from our day-to-day life: if a person is 25 years old and has done an M.A. in Political Science or an M.A. in Sociology, then the computer prints that he is eligible for such and such a job. This thinking or decision-making process is done with the help of Boolean Algebra, which is a numbering system having 0 and 1 as its digits.
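The if-then decision described above can be put in the form of a short program. This is a rough sketch only; the age, degrees, and job are simply the example given in the paragraph, not any real system.

    # Hypothetical sketch of the if-then eligibility check described above.
    def eligible(age, degree):
        if age == 25 and degree in ("M.A. Political Science", "M.A. Sociology"):
            return "Eligible for such and such a job"
        else:
            return "Not eligible"

    print(eligible(25, "M.A. Sociology"))  # prints: Eligible for such and such a job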
Claude Shannon established the relationship between Boolean Algebra and the flow of power and logic: "0" represents a switch turned off and "1" a switch turned on. This has been effectively put to use to make computers reach decisions. Present-day computers are serial processors, in that they proceed from point to point, one step at a time, with the next step determined by the result of the previous one. Human beings, in contrast, use not only serial processing but also parallel processing, in which several trains of thought, some conscious and others not, are under way together.
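Shannon's observation can be illustrated with a small sketch (an assumed modern rendering, not taken from the source): switches connected in series behave like the Boolean AND, and switches connected in parallel like the Boolean OR.

    # Illustrative sketch: 0 is a switch turned off, 1 a switch turned on.
    def series(a, b):      # current flows only if both switches are on -> AND
        return a & b

    def parallel(a, b):    # current flows if either switch is on -> OR
        return a | b

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "series:", series(a, b), "parallel:", parallel(a, b))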
COMPUTER INTELLIGENCE
The capabilities of computers are increasing day by day at a fantastic rate, while raw human intelligence is changing slowly, if at all. Artificial intelligence has only recently developed a sufficient track record of accomplishment to attract industrial interest. The most notable accomplishment is in carrying out complex functions normally associated with human expertise. Progress in artificial intelligence has also been noteworthy in fields such as natural language understanding and vision. A machine that can recognize and react to a human voice has long been a dream of scientists.
Scientists working on a speech recognition research programme in Britain are making that dream seem possible. Simple systems exist, but they are not true voice recognition. They range from toys that simply respond to sound (a handclap is as good as a command) to machines that recognize only a limited vocabulary under ideal conditions. Several factors make it hard to improve matters; accents differ between individuals, and even for the same person. The IBM computer system has demonstrated an accuracy of 95 percent in comprehending a 5,000-word vocabulary. The commercial sector, the military, and the major fields of health and education are now exploring techniques developed by artificial intelligence researchers.
Trials are under way of a new technique called FACES (facial analysis, comparison, and elimination system) for locating photographs of offenders by computer using a witness's description. By 1990, ultra-intelligent machines will be working in partnership with our best minds. The product of man's brain, they will become his salvation in a world of crushing complexities.
A computerized library in Britain is attracting interest in several countries. It can order new books and calculate costs from many foreign currency rates. It even chases up an order if it becomes overdue, and keeps track of the movement of books on and off the shelves, into a student's briefcase or back to a bookbinder.
Source: Press Release
Press Information Bureau (PIB), Government of India
Date: February 1, 1988