Since computer terminology can often be one of the biggest stumbling blocks to understanding the world of personal computers, I've tried to make things a bit easier by defining new terms at the beginning of the chapter in which they first appear.

Friday, 27 January 2012


                               Clearly, the machine no longer belonged to its makers.
  Tracy Kidder, The Soul of a New Machine

Computers have been around a lot longer than most of us would like to believe. As a matter of fact, the computer’s lineage can be traced back to 1642 when Blaise Pascal, a French mathematical genius, invented the first real calculating machine. Pascal’s machine used a combination of rotating wheels and gears to perform simple problems of addition and subtraction.
In 1833 Charles Babbage, an English inventor, designed the great-grandfather of modern computers with the introduction of his analytical engine, a forerunner of which is pictured below in Figure 3. The engine was composed of five parts:
(1) A calculating unit (the mill),
(2) The store (memory),
(3) An input device,
(4) A control section, and
(5) A printer.
The system was driven by punched cards that fed the basic information into the engine, where it could be processed. Babbage also fathered the basic principles on which the first adding machine was constructed.
In 1842, Lady Augusta Ada Lovelace, a friend of Babbage, wrote the first computer documentation in her paper "Observations of Mr. Babbage's Analytical Machine." A mathematical prodigy,

FIGURE 3. Babbage's Difference Engine, a forerunner of his analytical engine, marked a major step toward the future development of computers. Smithsonian Institution photo number 53190.
Ada established herself as the world's first computer programmer and provided the software for Babbage's engine. In recognition of her contributions, the U.S. Department of Defense named its so-called super language after her, and Ada became a registered trademark of the U.S. government.
The 1840s saw the publication of several papers and theses by the English mathematician George Boole. Boole's theories detailed how logical problems can be solved like algebraic equations. Boolean logic set the stage for the advent of computer science.
In 1890, the first electric calculating machine was invented. Known as the Hollerith tabulator, it used punched cards for the first time. The United States used the Hollerith tabulator (Figure 4) to compute the census and completed the job in a mere six weeks. Up to that time, it had taken as long as 10 years to prepare the census calculations.
The era of modern computing began in 1925 at the Massachusetts Institute of Technology. There, a team of engineers led by Vannevar Bush developed a large-scale analog calculator. Because it was capable of storing number values electronically, it is considered the forerunner of all that was to follow.
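Boole's idea, that logical statements can be manipulated like algebraic equations, is easy to demonstrate with a short example (my own illustration, not part of the original chapter). Here a small Python script verifies one of the classic identities of Boolean algebra, De Morgan's law, by checking every possible combination of truth values:

```python
from itertools import product

# Boole's insight: truth values can be treated algebraically.
# De Morgan's law says: not (A and B) is the same as (not A) or (not B).
# With only two values (False, True), we can prove it by exhaustion.
for a, b in product([False, True], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))

print("De Morgan's law holds for every combination of truth values")
```

Every digital circuit inside a modern computer is, at bottom, an arrangement of exactly these kinds of Boolean operations.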

FIGURE 4. Hollerith's tabulator provided a taste of future computing power when first used in figuring the results of the 1890 United States census. Smithsonian Institution photo number 64563.
The 1930s and 1940s saw a number of advances in computer development, including two of the more famous systems: ENIAC (Electronic Numerical Integrator and Computer) in the United States, and Colossus, the world's first electronic computer, in England. Colossus was placed into operation to decipher the signals of Enigma, the German code machine. Colossus was credited with breaking Enigma's code, which provided the information the Allies needed to win the war. Colossus was so secret that it was dismantled at the end of the war, and only one piece is known to survive today. At the end of 1945 ENIAC arrived on the scene and solved its first problem in December of that year. The problem dealt with the hydrogen bomb and is still considered a classified secret. The ENIAC, a portion of which is shown in Figure 5, was composed of 40 panels, each two feet wide and four feet deep, and housed some 18,000 vacuum tubes. It was capable of handling more than

FIGURE 5. ENIAC, one of the world’s first computers. Courtesy of International Business Machines.
one problem, although it had to be programmed manually by resetting switches, a process that could take up to two days.
Perhaps as a harbinger of things to come, ENIAC was obsolete almost as soon as it was running. A newer generation of stored program computers, which could be programmed electronically (instead of by recabling everything by hand), arrived in 1946 and quickly replaced ENIAC. For all its importance as one of the world’s first electronic computers, ENIAC had neither the power nor the speed of many of today’s hand-held calculators.
At that time, however, the sheer number of vacuum tubes needed to operate these early computers limited their use. Vacuum tubes were always burning out, so only short programs could be run. These machines literally filled entire rooms and were programmed at very low levels, often by a person setting and resetting row after row of switches and by recabling the system. Little wonder that a post-war government report saw little use for such machines and predicted that there might be a need for no more than three or four in the entire country.
That might have been true had the vacuum tube remained the standard electronic core of a computer. The invention of the transistor in 1947 by Bell Laboratories scientists superseded the vacuum tube. The transistor was compact, used low voltages, and consumed small amounts of power. It freed computers from the need for vacuum tubes and revolutionized the computer industry, setting the stage for today's smaller computer systems.
In 1951, the world's first commercial computer, UNIVAC (Figure 6), was delivered to the Census Bureau. The UNIVAC set the trends for years to come and laid down standards that are followed even today. The original UNIVAC still blinks away at the Smithsonian Institution.
Throughout the 1950s, 1960s, and 1970s, improvements in the construction of transistors opened new doors for computer manufacturing. The first transistors gave way to the integrated circuit, in which a number of transistors and the wiring that connects them were constructed in a single piece. Integrated circuits, in turn, led to the development of wafer-thin silicon chips on which thousands of transistors can be packed into an area about one quarter of an inch square, as in Figure 7.

FIGURE 6. UNIVAC, the world’s first commercial computer. Smithsonian Institution photo number 72-2616.
The development of transistors and microchips led to the creation of bigger and more powerful computers. It also allowed smaller and cheaper machines to come into existence. In short, these developments led to the evolution of several distinct families of computers, as well as to a continuing decrease in the cost of computing power. In fact, since the mid-1970s, the cost of computing power has dropped by an average of 50 percent per year. A comparison of computing power then and now can be seen in Figure 8.

FIGURE 7. Line drawing of a microchip. Illustration by Gina Bean.





Hello guys, we are working very hard to help you learn computer basics, and we are sharing techniques that will help you understand computer components in depth.
