Computer Trivia Free Test Quiz Online

The history of computing is longer than the history of computing hardware and modern computing technology; it includes the history of methods intended for pen and paper or for chalk and slate, with or without the aid of tables. The timeline of computing presents a summary list of major developments in computing by date.
Computing is intimately tied to the representation of numbers. But long before abstractions like number arose, there were mathematical concepts to serve the purposes of civilization. These concepts are implicit in concrete practices such as:

- one-to-one correspondence, a rule for counting how many items there are, say on a tally stick, which was eventually abstracted into number;
- comparison to a standard, a method for assuring reproducibility in a measurement, the number of coins, for example;
- the 3-4-5 right triangle, a device for assuring a right angle, using ropes with twelve evenly spaced knots, for example (see the check below).
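The knotted rope works because of the converse of the Pythagorean theorem: a triangle whose sides span 3, 4, and 5 equal knot-intervals satisfies 3² + 4² = 9 + 16 = 25 = 5², so the angle between the two shorter sides must be a right angle.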
Eventually, the concept of numbers became concrete and familiar enough for counting to arise, at times with sing-song mnemonics to teach sequences to others. All known languages have words for at least "one" and "two", and even some animals, such as the blackbird, can distinguish a surprising number of items.
Advances in the numeral system and mathematical notation eventually led to the discovery of mathematical operations such as addition, subtraction, multiplication, division, squaring, square root, and so forth. In time the operations were formalized, and concepts about them became understood well enough to be stated formally, and even proven. See, for example, Euclid's algorithm for finding the greatest common divisor of two numbers.
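A minimal sketch of Euclid's algorithm in Python may make the idea concrete; the function name and the sample numbers here are illustrative choices, not part of the original text:

    def gcd(a, b):
        # Euclid's algorithm: repeatedly replace the pair (a, b)
        # with (b, a mod b); when the remainder reaches zero,
        # the other value is the greatest common divisor.
        while b != 0:
            a, b = b, a % b
        return a

    print(gcd(48, 36))  # prints 12, the greatest common divisor of 48 and 36

The loop terminates because the remainder strictly decreases at every step, and each step preserves the set of common divisors of the pair.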
By the High Middle Ages, the positional Hindu-Arabic numeral system had reached Europe, allowing the systematic computation of numbers. During this period, writing a calculation out on paper made it possible to evaluate mathematical expressions and to tabulate mathematical functions such as the square root, the common logarithm (for use in multiplication and division), and the trigonometric functions. By the time of Isaac Newton's research, paper or vellum was an important computing resource, and even in the modern era researchers such as Enrico Fermi would cover random scraps of paper with calculations to satisfy their curiosity about an equation. Even into the period of programmable calculators, Richard Feynman would unhesitatingly compute by hand any steps that overflowed the memory of the calculators, just to learn the answer.
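As a concrete illustration of how a table of common logarithms turns multiplication into addition (the numbers are chosen purely for example): to multiply 345 by 678, look up log₁₀ 345 ≈ 2.5378 and log₁₀ 678 ≈ 2.8312, add them to obtain 5.3690, and look up the antilogarithm, 10^5.3690 ≈ 233,900. This agrees with the exact product, 233,910, to about four significant figures, the precision a four-figure table can deliver.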
Computing hardware has been an important component of the process of calculation and data storage since it became useful for numerical values to be processed and shared. The earliest computing hardware was probably some form of tally stick; later record-keeping aids include Phoenician clay shapes, which represented counts of items, probably livestock or grains, in containers. Something similar is found in early Minoan excavations. These seem to have been used by the merchants, accountants, and government officials of the time.
Devices to aid computation have changed from simple recording and counting devices to the abacus, the slide rule, analog computers, and more recent electronic computers. Even today, an experienced abacus user with a device hundreds of years old can sometimes complete basic calculations more quickly than an unskilled person using an electronic calculator, though for more complex calculations computers outperform even the most skilled human.
This article covers major developments in the history of computing hardware, and attempts to put them in context. For a detailed timeline of events, see the computing timeline article. The history of computing article is a related overview and treats methods intended for pen and paper, with or without the aid of tables.

