History of the Computer
The volume and use of computers in the world are so great that they have become difficult to ignore. Computers appear to us in so many guises that we often fail to see them for what they actually are. People interact with a computer when they buy their morning coffee at a vending machine, and as they drive to work, the traffic lights that so often hamper them are controlled by computers in an attempt to speed the journey.
Accept it or not, the computer has invaded our lives.
The origins of the computer are like those of many other inventions and technologies: it evolved from a relatively simple idea or plan designed to help perform tasks more easily and quickly.
The first basic computers were designed to do just that: compute! They performed basic math functions such as multiplication and division and displayed the results in a variety of ways. Some displayed the results as a binary pattern on electronic lamps; binary uses only ones and zeros, so lit lamps represented ones and unlit lamps represented zeros. The irony was that the user then had to perform yet another mathematical conversion to translate the binary result into readable decimal.
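To make that translation concrete, here is a minimal Python sketch of the conversion those early operators did in their heads: reading a row of lamps as a binary number and producing its decimal value. The lamp pattern is an invented example, not one taken from any particular machine.

```python
# Read a row of lamps as a binary number, most significant lamp first.
# lit = 1, unlit = 0 -- the pattern below is illustrative only.
lamps = [True, False, True, True]  # binary 1011

value = 0
for lit in lamps:
    value = value * 2 + (1 if lit else 0)  # shift left, add the new bit

print(value)  # 11
```

This doubling-and-adding is essentially the hand calculation an operator had to perform for every result the machine displayed in lamps.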
One of the first computers was called ENIAC. It was a monstrous machine, nearly the size of a standard railroad car, containing electronic tubes, heavy-gauge wiring, angle iron, and knife switches, to name just a few of its components. It is hard to believe that computers have since evolved into the suitcase-sized microcomputers of the 1990's.
Computers eventually evolved into less archaic-looking devices near the end of the 1960's. Their size had been reduced to that of a small automobile, and they processed segments of information at faster rates than older models. Most computers at this time were termed "mainframes," because many computers were linked together to perform a given function. The primary users of these computers were military agencies and large corporations such as Bell, AT&T, General Electric, and Boeing, organizations that had the funds to afford such technologies. However, operating these computers required extensive intelligence and manpower; the average person could not have fathomed trying to operate and use these million-dollar processors.
The United States is credited with pioneering the computer. It was not until the early 1970's that nations such as Japan and the United Kingdom began developing computer technology of their own, which resulted in newer components and smaller computers. The use and operation of computers had by then developed into a form that people of average intelligence could handle without too much ado, and when the economies of other nations started to compete with the United States, the computer industry expanded at a great rate. Prices dropped dramatically, and computers became affordable to the average household.
Like the invention of the wheel, the computer is here to stay. The operation and use of computers in our present era of the 1990's has become so easy and simple that perhaps we take too much for granted. Almost everything of use in society requires some form of training or education. Many people say that the predecessor of the computer was the typewriter, which certainly required training and experience to operate at a usable and efficient level. Children are now taught basic computer skills in the classroom to prepare them for the future evolution of the computer age.
The history of computing begins about 2,000 years ago with the birth of the abacus, a wooden rack holding two horizontal wires with beads strung on them. When these beads are moved around according to rules memorized by the user, all regular arithmetic problems can be done. Another important invention from around the same time was the astrolabe, used for navigation.
Blaise Pascal is usually credited with building the first digital computer, in 1642. It added numbers entered with dials and was made to help his father, a tax collector. In 1671 Gottfried Wilhelm von Leibniz designed a computer that was built in 1694; it could add and, after some reconfiguration, multiply. Leibniz invented a special stepped-gear mechanism for introducing the addend digits, a mechanism that is still in use.
The prototypes made by Pascal and Leibniz were little used and were considered curiosities until, a little more than a century later, Thomas of Colmar (Charles Xavier Thomas) created the first successful mechanical calculator, one that could add, subtract, multiply, and divide. Many inventors followed with improved desktop calculators, so that by about 1890 the improvements included accumulation of partial results, storage and automatic re-entry of past results (a memory function), and printing of the results, though each of these still had to be initiated manually. These improvements were made mainly for commercial users, not for the needs of science.
While Thomas of Colmar was developing the desktop calculator, a series of very interesting developments in computing began in Cambridge, England, led by Charles Babbage (after whom the computer store "Babbages" was named), a mathematics professor. In 1812 Babbage realized that many long calculations, especially those needed to make mathematical tables, were really a series of predictable actions that were constantly repeated, and he suspected that it should be possible to perform them automatically. He began to design an automatic mechanical calculating machine, which he called a difference engine, and by 1822 he had a working model to demonstrate. With financial help from the British Government, Babbage started fabrication of a difference engine in 1823. It was intended to be steam-powered and fully automatic, including the printing of the resulting tables, and commanded by a fixed instruction program.
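The repetition Babbage noticed is what the method of finite differences exploits: for any polynomial, the differences of successive values eventually become constant, so each new table entry can be produced by additions alone, the only operation a difference engine had to mechanize. Below is a minimal Python sketch of the idea; the polynomial x^2 + x + 41 is the one often cited in accounts of Babbage's demonstrations, though any polynomial works the same way.

```python
# Tabulate a polynomial the way a difference engine would: seed the
# difference columns once, then generate every further value by addition.

def difference_table(f, degree, count):
    """Return f(0), f(1), ..., f(count-1) using only repeated addition."""
    # Turn the first degree+1 values into f(0) and its successive differences.
    values = [f(x) for x in range(degree + 1)]
    diffs = []
    while values:
        diffs.append(values[0])
        values = [b - a for a, b in zip(values, values[1:])]
    # diffs now holds f(0), its first difference, second difference, ...
    table = []
    for _ in range(count):
        table.append(diffs[0])
        for i in range(len(diffs) - 1):   # ripple the additions down
            diffs[i] += diffs[i + 1]
    return table

f = lambda x: x * x + x + 41
print(difference_table(f, degree=2, count=5))  # [41, 43, 47, 53, 61]
```

Once the columns are seeded, no multiplication ever occurs, which is why the whole process could be driven by gears and, as Babbage intended, by steam.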
The difference engine, although of limited adaptability and applicability, was a great advance. Babbage continued to work on it for the next ten years, but in 1833 he lost interest because he had what he thought was a better idea: the construction of what would now be called a general-purpose, fully program-controlled, automatic mechanical digital computer. Babbage called this machine an Analytical Engine. The design showed remarkable foresight, although that could not be fully appreciated until a full century later.
The plans for this engine called for a decimal computer operating on numbers of 50 decimal digits (or words) and having a storage capacity (memory) of 1,000 such numbers. The built-in operations were to include everything a modern general-purpose computer would need, even the all-important conditional control transfer capability, which would allow commands to be executed in any order, not just the order in which they were programmed.
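To see what conditional control transfer adds, consider a toy machine whose only escape from straight-line execution is a conditional jump. The instruction set below is invented purely for illustration and has nothing to do with Babbage's actual notation; the point is that a single JNZ instruction turns a fixed list of commands into a loop.

```python
# A toy program: multiply 6 * 7 by repeated addition.
# Without the JNZ (jump if not zero), execution could only run top to bottom.
program = [
    ("SET", "acc", 0),      # 0: acc = 0
    ("SET", "n", 6),        # 1: loop counter
    ("ADD", "acc", 7),      # 2: acc += 7
    ("SUB", "n", 1),        # 3: n -= 1
    ("JNZ", "n", 2),        # 4: if n != 0, jump back to step 2
    ("PRINT", "acc", None), # 5: show the result
]

registers = {}
pc = 0  # program counter
while pc < len(program):
    op, reg, arg = program[pc]
    if op == "SET":
        registers[reg] = arg
    elif op == "ADD":
        registers[reg] += arg
    elif op == "SUB":
        registers[reg] -= arg
    elif op == "JNZ" and registers[reg] != 0:
        pc = arg        # the conditional control transfer
        continue
    elif op == "PRINT":
        print(registers[reg])  # 42
    pc += 1
```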
As people can see, it took a great deal of intelligence and fortitude to arrive at the 1990's style and use of computers. People have come to assume that computers are a natural development in society and take them for granted, but just as learning to drive an automobile takes skill and practice, so does using a computer.
Computers in society have become difficult to pin down. Exactly what they consist of and what actions they perform depend highly on the type of computer, and to say a person has a typical computer doesn't necessarily narrow down what that computer's capabilities are. Computer styles and types cover so many different functions and actions that it is difficult to name them all. The original computers of the 1940's were easy to define when they were first invented: they performed mathematical functions many times faster than any person could calculate. The evolution of the computer, however, has created many styles and types, each greatly dependent on a well-defined purpose.
The computers of the 1990's fall roughly into three groups: mainframes, networking units, and personal computers. Mainframe computers are extremely large modules with the capacity to process and store massive amounts of data in the form of numbers and words; mainframes were the first type of computer, developed in the 1940's. Their users range from banking firms to large corporations and government agencies. They are usually very expensive but are designed to last at least five to ten years, and they require well-educated, experienced staff to operate and maintain them. Harry Wulforst, in his book Breakthrough to the Computer Age, captures the distance between the mainframes of the 1940's and the computers of the 1990's by likening it to "...the contrast to the sound of the sputtering motor powering the first flights of the Wright Brothers at Kitty Hawk and the roar of the mighty engines on a Cape Canaveral launching pad." End of part one.