COMPUTER BASICS FOR HUMAN RESOURCES PROFESSIONALS

Since computer terminology can often be one of the biggest stumbling blocks to understanding the world of personal computers, I've tried to make things a bit easier by defining new terms at the beginning of the chapter in which they first appear.


Friday, 27 January 2012

Supercomputers


Specially designed systems (usually several computers tied together). Used primarily in government or research. These are the most expensive and largest systems available, and they possess tremendous computing power. The cost of operating and maintaining them makes them prohibitive for most organizations. One example of a supercomputer is the one operated by the National Security Agency.
 
 FIGURE 9. Cray Supercomputer shown here with designer Seymour Cray of Cray Research, Inc.
Built at a cost of approximately $15 million, it is rumored to be capable of 150 to 200 million calculations per second, and it has a memory capable of transferring 320 million words per second. This system is reported to be so powerful that the heat it generates would melt it down were it not for a specially designed cooling system.*
At present, there are some 150 supercomputers (similar to that in Figure 9) in operation around the world, with most located in the United States. The latest models have a memory capacity of some two billion bytes and processing speeds 40,000 to 50,000 times faster than a personal computer. Tasks that once took a year to accomplish on a second-generation computer can be done in about a second with a supercomputer.*
Mainframes. These are the large machines that come to mind when most people think of computers. Costing hundreds of thousands of dollars, and requiring specially built facilities and large supporting staffs of operators, programmers, and analysts, they are designed to handle large volumes of work or complex calculations.
Minicomputer. Smaller than a mainframe and generally costing under $200,000, these systems are ideally suited to a medium-sized organization. They require smaller facilities and less staff than the mainframes, but have enough power to process a wide range of commercial or scientific jobs.
Microcomputer. This is where the personal computer fits in. Designed to sit on top of a desk, and within the financial reach of most organizations and many individuals, these systems represent the latest evolutionary stage. While not yet in the same league as their larger cousins, they can easily match or outperform the computing power of their first- and second-generation ancestors.
Lap-Top Computers. An offshoot of the personal computer, these small systems (many are complete with printer and liquid crystal display) can offer the same type of power and functionality as a desktop model. Designed for portability, they can travel inside an attaché case, as seen in Figure 10, and can be used just about anywhere.
While personal computers can trace their lineage back several centuries (see Exhibit 5), they are a relatively new phenomenon. The personal computer revolution really got underway in 1969 with the invention of the Intel 4004 microprocessor, which contained 2,250 transistors on a single microchip. At first, these were available only to large manufacturers, but in 1971 Intel decided to clear out its stocks by offering the 4004 microprocessor for sale.
*Philip Elmer-DeWitt, "A Sleek, Superpowered Machine," Time (June 17, 1985), p. 53.


Comparison of Computers Then and Now

FIGURE 8. Comparison of computers then and now. Courtesy of International Business Machines.
This process has spanned three generations of growth:
The First Generation: The 1950s. Marked by the arrival of the UNIVAC, first-generation machines are identified by their use of electronic tubes. They were generally capable of executing about 1000 instructions per second and could store no more than 20,000 characters of information. It was during this time that Admiral Grace Hopper, a pioneer of the modern computer age, began what is generally considered the first career in computer programming. Hopper also pioneered the development of COBOL, perhaps the most common of all computer languages.
The Second Generation: 1960 to 1965. First-generation computers were considered obsolete by 1960, as transistors replaced tubes. The second-generation computers were considerably smaller than their predecessors and handled in the range of one million instructions per second. The solid-state technology of these systems increased their storage capabilities and reliability, making them more attractive to business and industry. Computer concepts such as operating systems, time sharing, and data communications were refined and gained greater use.
The Third Generation: 1965 to the Present. Advances in integrated and printed circuits have spawned the current generation of computers, which are smaller, faster, have more storage capacity, and are more affordable than ever before. There are, of course, many different types of computers available for modern use.


THE WORLD OF PERSONAL COMPUTERS



Clearly, the machine no longer belonged to its makers.

TRACY KIDDER,
The Soul of a New Machine

Computers have been around a lot longer than most of us would like to believe. As a matter of fact, the computer’s lineage can be traced back to 1642 when Blaise Pascal, a French mathematical genius, invented the first real calculating machine. Pascal’s machine used a combination of rotating wheels and gears to perform simple problems of addition and subtraction.
In 1833 Charles Babbage, an English inventor, designed the great-grandfather of modern computers with the introduction of his analytical engine, a forerunner of which is pictured below in Figure 3. The engine was composed of five parts:
(1) A calculating unit (the mill),
(2) The store (memory),
(3) An input device,
(4) A control section, and
(5) A printer.
The system was driven by punched cards that fed the basic information into the engine, where it could be processed. Babbage also fathered the basic principles on which the first adding machine was constructed.
In 1842, Lady Augusta Ada Lovelace, a friend of Babbage, wrote the first computer documentation in her paper "Observations of Mr. Babbage's Analytical Machine." A mathematical prodigy,

FIGURE 3. Babbage's difference engine, a forerunner of his analytical engine, marked a major step towards the future development of computers. Smithsonian Institution photo number 53190.
Ada established herself as the world's first computer programmer and provided the software for Babbage's engine. In recognition of her contributions, the U.S. Department of Defense named its so-called super language after her, and Ada became a registered trademark of the U.S. government.
The 1840s saw the publication of several papers and theses by the English mathematician George Boole. Boole's theories detailed how logical problems can be solved like algebraic equations. Boolean logic set the stage for the advent of computer science.
In 1890, the first electric calculating machine was invented. Known as the Hollerith tabulator, it used punched cards for the first time. The United States used the Hollerith tabulator (Figure 4) to compute the census, and completed the job in a mere six weeks. Up to that time, it had taken as long as 10 years to prepare the census calculations.
The era of modern computing began in 1925 at the Massachusetts Institute of Technology. There, a team of engineers led by Vannevar Bush developed a large-scale analog calculator. Since it was capable of storing number values electronically, it is considered the advent of all that was to follow.
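Boole's idea of treating logic algebraically, mentioned above, can be made concrete with a small sketch. The following Python snippet is purely illustrative (no code appears in the original text): it treats true and false as values that can be combined with and, or, and not, and verifies one of De Morgan's laws by substituting every combination of truth values, much as one would check an algebraic identity.

```python
# Illustrative sketch: Boolean logic treats truth values like algebraic quantities.
# Here we verify one of De Morgan's laws, not (A and B) == (not A) or (not B),
# by substituting every possible combination of truth values.

from itertools import product

def de_morgan_holds(a: bool, b: bool) -> bool:
    """Check De Morgan's law for one pair of truth values."""
    return (not (a and b)) == ((not a) or (not b))

if __name__ == "__main__":
    for a, b in product([True, False], repeat=2):
        print(f"A={a!s:5} B={b!s:5} -> law holds: {de_morgan_holds(a, b)}")
    # Every row prints True: the identity holds for all inputs,
    # which is exactly the algebraic treatment of logic Boole proposed.
```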

FIGURE 4. Hollerith's tabulator provided a taste of future computing power when first used in figuring the results of the 1890 United States Census. Smithsonian Institution photo number 64563.
The 1930s and 1940s saw a number of advances in computer development, including two of the more famous systems: ENIAC (electronic numerical integrator and computer) in the United States, and Colossus, the world's first electronic computer, in England. Colossus was placed into operation to decipher the signals of Enigma, the German code machine. Colossus was credited with breaking Enigma's code, which provided the necessary information to help the Allies win the war. Colossus was so secret that it was dismantled at the end of the war, and only one piece is known to survive today. At the end of 1945 ENIAC arrived on the scene and solved its first problem in December of that year. The problem dealt with the hydrogen bomb, and is still considered a classified secret. The ENIAC, a portion of which is shown in Figure 5, was composed of 40 panels, each two feet wide and four feet deep, and housed some 18,000 vacuum tubes.

FIGURE 5. ENIAC, one of the world’s first computers. Courtesy of International Business Machines.
It was capable of handling more than one problem, although it had to be manually programmed by resetting switches, a process that could take up to two days.
Perhaps as a harbinger of things to come, ENIAC was obsolete almost as soon as it was running. A newer generation of stored-program computers, which could be programmed electronically (instead of by recabling everything by hand), arrived in 1946 and quickly replaced ENIAC. For all its importance as one of the world's first electronic computers, ENIAC had neither the power nor the speed of many of today's hand-held calculators.
At that time, however, the sheer number of vacuum tubes needed to operate these early computers limited their use. Vacuum tubes were always burning out, so only short programs could be run. These machines literally filled entire rooms and were programmed at very low levels, often by a person setting and resetting row after row of switches and by recabling the system. Little wonder that a post-war government report saw little use for such machines and predicted that there might be a need for no more than three or four in the entire country. That might have been true, if the vacuum tube had remained the standard electronic core of a computer. The invention of the transistor in 1947 by Bell Laboratories scientists superseded the vacuum tube. The transistor was compact, used low voltages, and consumed only small amounts of power. It freed computers from the need for vacuum tubes and revolutionized the computer industry, setting the stage for today's smaller computer systems.
In 1951, the world's first commercial computer, UNIVAC (Figure 6), was delivered to the Census Bureau. The UNIVAC set the trends for years to come and laid down standards that are followed even today. The original UNIVAC still blinks away at the Smithsonian Institution.
Throughout the 1950s, 1960s, and 1970s, improvements in the construction of transistors opened new doors for computer manufacturing. The first transistors gave way to the integrated circuit, in which a number of transistors and the wiring that connects them were constructed in a single piece. Integrated circuits, in turn, led to the development of wafer-thin silicon chips on which thousands of transistors can be packed into an area about one quarter of an inch square, as in Figure 7.


FIGURE 6. UNIVAC, the world’s first commercial computer. Smithsonian Institution photo number 72-2616.
The development of transistors and microchips led to the creation of bigger and more powerful computers. It also allowed smaller and cheaper machines to come into existence. In short, these developments led to the evolution of several distinct families of computers, as well as to a continuing decrease in the cost of computing power. In fact, since the mid-1970s, the cost of computing power has dropped by an average of 50 percent per year. A comparison of computing power then and now can be seen in Figure 8.
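To put that 50-percent-per-year figure in perspective, here is a quick back-of-the-envelope sketch in Python (illustrative only, and assuming the decline compounds once per year, which the text does not spell out):

```python
# Back-of-the-envelope sketch: if the cost of a unit of computing power
# falls by 50 percent each year, the cost remaining after n years is
# (0.5 ** n) of the original. (Assumes simple annual compounding.)

def relative_cost(years: int, annual_drop: float = 0.5) -> float:
    """Fraction of the original cost remaining after the given number of years."""
    return (1.0 - annual_drop) ** years

if __name__ == "__main__":
    for years in (1, 5, 10):
        print(f"After {years:2d} year(s): {relative_cost(years):.4%} of the original cost")
    # After 10 years of 50% annual declines, computing power costs
    # roughly 1/1000th of what it did at the start.
```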



FIGURE 7. Line drawing of a microchip. Illustration by Gina Bean.





Thursday, 26 January 2012

THE SPREAD OF APPLICATIONS DEVELOPMENT

One chief advantage of personal computers is the improved productivity they bring to the applications development process. As anyone who has worked in, or with, data processing knows, the shortage of experienced programmers has led to large backlogs of computer applications. Simply put, people are thinking up more things for the computer to do than there are people to write the programs. The arrival of the personal computer puts the computing power necessary to achieve many of these applications directly in the hands of those who are generating the requests. What these people are looking for is instant productivity as a way to get around all those data processing backlogs. What some of them and their organizations are discovering is that there's no such thing as a free lunch.
The problems fall into several categories:
(1) People are creating programs without really thinking them through, and without giving consideration to how their actions may affect others.
(2) There are a lot of duplicate programs being created, and an explosion in the number of private files and databases.
(3) People are not documenting their programs so that others can use them.
(4) Few pay attention to the need for backup and security.
(5) There is not much concern over maintaining programs once they have been created.
(6) Few people double-check their work to make sure they are doing the right things.
Many people believe that they can handle all their data processing needs simply by plugging some easy-to-use software into a personal computer and having at it. They tend to see personal computers as a way to avoid the long delays and other headaches of getting what they want from "those people up in data processing." It is an unfortunate point of view to take, because it almost guarantees that they will have to learn the lessons of computers the same way all those folks in data processing did: by getting burned a few times.
For example, a spreadsheet can be used to develop a budget or financial forecast with a fair degree of certainty that the columns and rows will be added correctly. But who checks to make sure that the right numbers were used, or the right formulas were applied? In data processing, people are taught to test their programs before they trust the results. A lot of computer users go with the first thing produced, or make last-minute adjustments just to see what effect they might have. The more complex the database becomes (calculate the commissions of all salespeople in the state, except those in . . .), the greater the probability of a mistake. This is particularly true of spreadsheets, which are almost seductive in nature. Information can look so good on a spreadsheet, and so authoritative, that people tend not to question it. After all, computers rarely miscalculate anything.
This same thought process contributes to other problems, such as not taking the time to prepare the documentation that tells others what the program is and how it can be used. Employees who are trying to do something similar are left in the position of having to write another program from scratch. It also means that when the author leaves the organization, there's nothing to explain the program to his or her replacement. The problem is compounded as Joe creates something he thinks is great and shares it with Jane, who adds something and shares it with Pete, who modifies it for use with something developed by Pat. If all of this occurs without any controls or written guidelines or procedures, a major business failure could occur because of uncontrolled application development.
This potential for disaster is enhanced because all too often people are not thinking about such issues as creating backups and security. In fact, most probably never will until they suffer a disaster. What data processing has learned over the past 20 years is about to be relearned by whole new groups of people. For some, learning is going to get expensive. The message won't really be driven home, however, until someone spills coffee on the diskette containing the budget and discovers there isn't another usable copy to be found.
Maintaining programs once they have been created may also prove to be a bone of contention.
A lot of people believe that what they are doing is unique, so they don't give too much thought to what they are creating beyond producing one or two reports. A lot of people are writing what they believe are one-shot programs, and their organizations will end up living with them for years to come. Professional programmers and systems analysts learned long ago that the one-shot program that will only be used once and then forgotten doesn't really exist. Once created, such things usually take on a life of their own. Someone has to maintain the program and the information it contains.
Perhaps what data processing fears the most, and with some justification, is the creation of hundreds of new data centers throughout an organization. Since it took the data processing community some 20 years to realize the high costs and dangers of having duplicate files, duplicate data centers will remain a potential powder keg for personal computer users for some time to come. Private files and databases are being created every place a personal computer is available. Often the programs and files created exist only on a floppy disk lying on the user's desk. No one else may know about it because there is no documentation, or they might not have access to it. If something happens to either the person or the diskette, well.
One important point that everyone working with a personal computer will have to learn is that none of their actions occur in a vacuum. Everything they do has a potential consequence for someone else. In this regard, the lessons already learned by data processing are worth taking to heart.
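The "test the program before you trust the results" habit described above is easy to illustrate. The following Python sketch is purely hypothetical (the departments, figures, and the check_totals helper are invented for illustration, not taken from the text); it shows the kind of simple cross-check a data processing professional would insist on before believing a spreadsheet-style total.

```python
# Hypothetical example: a tiny "spreadsheet" of departmental budgets.
# Before trusting the grand total, we recompute it a second, independent way
# and compare -- the personal-computer equivalent of testing a program
# before believing its output.

budgets = {
    "Recruiting": [12_000, 9_500, 11_250],   # quarterly figures (invented)
    "Training":   [8_000, 8_000, 10_500],
    "Benefits":   [22_300, 21_900, 23_100],
}

def department_totals(data):
    """Sum each department's quarterly figures."""
    return {dept: sum(quarters) for dept, quarters in data.items()}

def check_totals(data):
    """Cross-check: the total by department must equal the total by quarter."""
    by_department = sum(department_totals(data).values())
    by_quarter = sum(sum(quarter) for quarter in zip(*data.values()))
    assert by_department == by_quarter, "Totals disagree -- check the formulas!"
    return by_department

if __name__ == "__main__":
    print(department_totals(budgets))
    print("Grand total:", check_totals(budgets))
```

The cross-check is deliberately redundant: if a formula or a figure were entered incorrectly in one direction but not the other, the assertion would fail instead of quietly producing an authoritative-looking wrong answer.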


