Only once in a lifetime does a new invention come along that touches every aspect of our lives. A device that changes the way we work, live, and play is a special one indeed. A machine that has done all this and more now exists in nearly every business in the US and in one out of every two households. This incredible invention is the computer. The electronic computer has been around for over half a century, but its ancestors have been around for 2,000 years. Since the beginning of time, people have looked for ways to make life easier, and the ideas and inventions that led toward the computer accumulated over many centuries; the machines of the early 1600s were not "up-to-date" by modern standards, but they were a start.

For over a thousand years after the Chinese invented the abacus, little progress was made in automating counting and mathematics.(Strathern, 13-14) The Greeks produced numerous mathematical formulae and theorems, but all of the newly discovered math had to be worked out by hand. A mathematician was often one of several people sitting in the back room of an establishment, all working on the same problem; the redundant personnel were there to ensure the correctness of the answer. It could take weeks or months of laborious hand work to verify a proposed theorem. Most of the tables of integrals, logarithms, and trigonometric values were worked out this way, their accuracy unchecked until machines could generate the tables in far less time, and with more accuracy, than a team of humans could ever hope to achieve.(Asimov, 3-4) As early as the 1640s, mechanical calculators were manufactured for sale. Records exist of earlier machines, but Blaise Pascal built the first commercial calculator, a hand-powered adding machine. Although Gottfried Leibniz attempted mechanical multiplication in the 1670s, the first true multiplying calculator appeared in Germany shortly before the American Revolution.(Asimov, 9-10)

In 1801 a Frenchman, Joseph-Marie Jacquard, built a loom that wove patterns by reading holes punched in small plates of hardwood. The plates were fed into the loom, which read the pattern and created the weave. Powered by water, this "machine" came 140 years before the development of the modern computer.(Palfreman and Swade, 21) Shortly after the first mass-produced calculator appeared in 1820, Charles Babbage began his lifelong quest for a programmable machine. Babbage was a poor communicator and record-keeper, but by the early 1840s his designs were sufficiently developed that Ada Lovelace, who translated and annotated an 1843 paper on his Analytical Engine, could describe how the machine might be programmed; she is generally regarded as the first programmer. In 1854 George Boole, then professor of mathematics at Cork University, wrote An Investigation of the Laws of Thought, and he is generally recognized as a father of computer science. The 1890 census was tabulated on punched cards similar to the ones Jacquard had used ninety years earlier to create weaves. Developed by Herman Hollerith of MIT, the system was powered electrically rather than mechanically. Hollerith's tabulating company was a forerunner of today's IBM.(Campbell and Aspray, 103-107)

Computing has grown faster than almost any technology to date. The ENIAC, for example, cost over $1,000,000 around 1950; today you can buy as much computing power for about $10. In some sense, that is like being able to buy a jet airliner for the price of a child's tricycle, and even that analogy is rough.(Slater, 96) Few of the original predictions of the impact of computing accommodated the implications of this enormous growth. It's not clear many of us can really conceive of the growth, or the possibilities such growth implies. A quick scan of the many early predictions about computing and related issues shows claims that we now consider gross misconceptions, such as: "there will never be a need for more than ten computers nationwide"; "no one ever needs more than two copies of a document"; "computers can only process numbers"; "only specialists need computers"; "only specialists can use computers".(Palfreman and Swade, 9) The growth has also led many computer programmers to be wasteful. In 1984, a Macintosh had only 128 kilobytes of memory and used 400K disks, and programmers had to make every byte count.
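To make the scale of that price comparison concrete, here is a minimal sketch in Python. The ENIAC and modern dollar figures come from the text above; the jet and tricycle prices are illustrative assumptions chosen only to show the order of magnitude, not figures from the cited sources.

```python
# Rough cost ratios behind the "jet for the price of a tricycle" analogy.
eniac_cost = 1_000_000      # dollars, circa 1950 (figure from the text)
modern_cost = 10            # dollars for comparable computing power today

compute_ratio = eniac_cost / modern_cost
print(f"Computing became roughly {compute_ratio:,.0f}x cheaper")

jet_price = 10_000_000      # ASSUMED price of a small jet, dollars
tricycle_price = 100        # ASSUMED price of a child's tricycle, dollars

analogy_ratio = jet_price / tricycle_price
print(f"Jet-to-tricycle price ratio: roughly {analogy_ratio:,.0f}x")
```

Both ratios land near 100,000 to 1, which is why the analogy, loose as it is, captures the scale of the change.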