January 2012 | COMPUTER BASICS FOR HUMAN RESOURCES PROFESSIONALS

Since computer terminology can often be one of the biggest stumbling blocks to understanding the world of personal computers, I've tried to make things a bit easier by defining new terms at the beginning of the chapter in which they first appear.

Monday, 30 January 2012


All the activities a computer performs are guided by a program. A program is simply a set of instructions or commands that tells

 FIGURE 17. A personal computer with a 10 megabyte hard disk can store the equivalent of 3333 pages of text.

FIGURE 18. Three types of computer memory and storage devices: a microchip (64K), diskette (320K), and hard disk (10 megabytes +).

the computer how to perform a specific job. You may write your own programs using a programming language, or purchase a preprogrammed software package. Such packages will do spreadsheets, word processing, database management, communications, tax preparation, inventory, project management, games, household budgets, and hundreds of other tasks. While a person working with a personal computer might only make a single entry, like entering an amount into a checkbook register, the program that pulls everything together into something meaningful might actually have several steps. From the computer's point of view, the program of instructions might look something like this:

1. Take the contents of memory location 5.
2. Add this to the contents of memory location 15.
3. Compare this to the contents of memory location 17.
4. If the contents are the same, store the results in memory location 21.
5. If the contents are not the same, output a warning to the system monitor.
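To make this concrete, here is a minimal sketch of those five steps in Python, with memory modeled as a simple list and the stored values chosen purely for illustration:

```python
# Model the computer's memory as a list of numbered locations.
memory = [0] * 32                  # a hypothetical 32-location memory
memory[5] = 10                     # example values, chosen so steps 1-4 succeed
memory[15] = 7
memory[17] = 17

result = memory[5] + memory[15]    # steps 1 and 2: fetch and add
if result == memory[17]:           # step 3: compare
    memory[21] = result            # step 4: store the result if they match
else:
    print("Warning: values do not match")  # step 5: warn otherwise
```

With the values above, 10 + 7 matches the 17 stored at location 17, so the result lands in location 21 and no warning is printed.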
Programming languages
As noted earlier, computers work only with binary information. This machine language is really nothing more than a series of 0s and 1s strung together. Machine language, however, is a difficult way for most people to communicate with a computer. To simplify things, a number of high-level programming languages were created. These are much simpler to use and understand because of their similarity to the everyday language of human communication. These programming languages are translated into a code the computer understands, using the firmware installed in it by the manufacturer.
Among the most widely used languages are:
FORTRAN: (Formula Translator). It is used primarily in mathematical, scientific, and technical applications.
COBOL: (Common Business Oriented Language). It is one of the most popular business and accounting languages.
BASIC: (Beginner's All-Purpose Symbolic Instruction Code). This is as close to English as current programming languages come. BASIC is supplied with many personal computers by the manufacturers.
PASCAL: (Named for the seventeenth-century French mathematician Blaise Pascal). This is a popular programming language with many microcomputer users, who consider it easy to learn.
No matter which language a program is written in, the languages generally share four fundamental activities. They:
1. Provide a method for getting something into the computer, and getting it out again
2. Make comparisons
3. Decide what activities need to be performed
4. Repeat tasks until a particular job is completed
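A short Python sketch (not from the original text) shows all four of these activities working together to find the largest of a few numbers:

```python
numbers = [3, 8, 1, 9]       # activity 1: getting something into the computer
largest = numbers[0]
for n in numbers[1:]:        # activity 4: repeat the task until the job is done
    if n > largest:          # activities 2 and 3: compare, then decide what to do
        largest = n
print(largest)               # activity 1 again: getting the answer back out
```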
With the wide variety of software packages on the market that meet the needs of most users, it is probably not as important for the average user to learn a programming language as it once was.

Before we finish: if you need more help, or have an opinion or suggestion, please leave a comment below. This is a do-follow blog, so leaving a comment will also help your blog's Google rank.

Sunday, 29 January 2012


One very simple way of thinking about how computers work is to make a comparison between a personal computer’s components and corresponding human functions:
  PC Components                Human Functions
  CPU                          Brain
  Memory                       Ability to recall things
  Software                     Intelligence
  Input (e.g., keyboard)       Five senses
  Output (e.g., printer)       Speech, writing

Unlike humans, computers need to be told exactly what to do before they will perform any task. Computers also work only with binary (numeric) information. In order to perform, a computer first converts all programs and data into this numeric form that it can understand. Once this is done, it will perform any task assigned to it. All of these activities are accomplished by the operating system. Essentially, the operating system is a group of programs that have the responsibility of telling the computer what to do, and the most efficient way of doing it. The operating system for IBM personal computers and IBM-compatible systems is the Disk Operating System (DOS), which performs a number of jobs that can be broken down into three main functions:
1. Handles and manages files
2. Oversees the use of operating hardware
3. Interprets and executes commands (see Figure 15)
Memory and storage
While humans think of memory in terms of words or amounts of information that can be remembered, computers think of memory in terms of bits, bytes, K, and megabytes. The smallest unit of memory is the bit, a binary digit of information. Eight bits equal one byte, which is considered the standard unit for computing information. While the bit is too small a unit of information to spend much time worrying about, a byte can be thought of as one character, such as a letter, number, symbol, or space. For example, the phrase "Tom, come here" contains 14 characters (11 letters, 1 comma, and 2 spaces). This means the phrase also takes up 14 bytes. The symbol K represents 1000 bytes (1024 to be precise). A megabyte (Meg) contains one million bytes. One way to think about this is to compare it to a standard page of typed text, which contains around 3000 characters, or 3K of information. Using this analogy, one megabyte would be roughly equal to 333 pages.
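The arithmetic above is easy to check in Python (the page and megabyte figures are the rough, round numbers used in this chapter, not exact disk capacities):

```python
phrase = "Tom, come here"
print(len(phrase))                  # 14 characters, and therefore 14 bytes

K = 1024                            # one K is 1024 bytes
MEGABYTE = 1_000_000                # a megabyte, as used here: one million bytes
CHARS_PER_PAGE = 3000               # a typed page holds about 3000 characters (3K)
print(MEGABYTE // CHARS_PER_PAGE)   # roughly 333 pages per megabyte
```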

FIGURE 15. The disk operating system is an indispensable part of a personal computer, governing everything from file management to the hardware.
Obviously, the more memory capacity a personal computer has, the more it can accomplish. A 5.25-inch double-sided diskette, for example, can provide over 320K of long-term memory (or about 107 pages of standard text). (See Figure 16.) Many personal computers have hard disk memories, designed to hold 10 megabytes (see Figure 17) (about 3333 pages of equivalent text) or more. Diskettes and hard disks (sometimes called Winchester disks) allow for the long-term storage of programs or data. Unlike the

FIGURE 16. A diskette can hold the equivalent of 107 pages of text.
internal working memory, the information stored in these outside sources isn't lost when a computer is shut off. Both sources also offer mass storage capabilities that the computer's internal memory isn't large enough to deal with. Other types of storage devices include cassette tapes (which are rarely found in business systems), cartridge tape systems, which can be plugged directly into the personal computer or be kept as a separate piece of equipment, external hard disks (sometimes called expansion units) connected to the system but physically separate from it, mass storage devices designed to support several systems, and hard disks that can be plugged directly into a personal computer (Figure 18) and then removed to be stored elsewhere.



There are two types of memory stored within a system:
1. Read Only Memory (ROM)
2. Random Access Memory (RAM)
Read Only Memory is permanently imprinted in your computer when you get it. It is called ROM because it is meant to be read only by the computer itself. The person operating the computer has no control over it. The computer uses this memory to tell itself how to start up when you turn it on. Read Only Memory processes the electrical data flow from the keyboard to the CPU and from the CPU to the video display screen or to any other peripheral equipment you have attached, such as a printer. The decoders that translate numbers and characters into binary information that the computer understands are found in ROM. These programs are called firmware, or nonvolatile, because they are always there, and are not erased or destroyed when the power is turned off.
The other memory a system works with is called Random Access Memory (RAM). This is what some people also call Read/Write Memory, and it is considered volatile because the information stored here is lost whenever the computer is shut off, or when power is lost. Random Access Memory is controlled by the person working with a system, and is where the instructions and information needed to get a job done are temporarily stored. Personal computer systems are often described by the amount of short-term (RAM) memory available (i.e., 64K, 256K, 640K). Most software programs also list the amount of memory they require to be stored in RAM.

Most systems allow the user to increase the amount of memory available by installing additional single memory chips or expansion boards containing multiple chips. Even with these, however, there are finite limits to the amount of memory a personal computer has to work with. This is one reason why it is important to decide what kinds of work you would like to perform on a personal computer, and the software most appropriate for accomplishing those tasks, before you purchase the system. This allows a computer to be configured to meet anticipated uses, and eliminates the problem many people encounter of trying to install a software program that requires more memory than their system has available. In these situations the only recourse is to purchase and install more memory.

A person using a computer comes in contact with all of its working parts. When a system is first turned on, the operating system takes over and makes sure everything under its control is functioning properly. Information is put into the system, where it is stored in memory according to an address code assigned by the computer, as seen in Figure 14. When the program (or list of instructions) is loaded, data are taken from the memory.

FIGURE 14. The flow of information processing. Illustration by Gina Bean.

Following the program instructions, the data are then worked on in the Arithmetic Logic Unit (ALU). When all the instructions have been carried out, the information is returned to memory or to an output device.
Output can be delivered through any of several sources:
• Video monitor (CRT)
• Voice synthesizer
• Modem, for transferring information over telephone lines
• Storage device (diskette, hard disk, etc.)




What goes on inside a computer is a mystery to most people, and even a little frightening to some. Actually, it is a fairly simple process. Whether we are talking about a supercomputer or one that can sit on top of a desk, all computers function in the same general way.
To understand how computers operate, we first have to break them down into several basic elements:
1. Input
2. Central processing unit (CPU)
3. Memory or storage
4. Output
Information is put into the computer through any of several different sources:
1. Keyboard
2. Data diskette (which can contain the program of instructions you will be working with, or the information you will be working on)
3. Cassette tape
4. Graphic tablets and electronic pens
5. Light pens, which can be used by directly touching the screen of a monitor

What Are The Components Of A Personal Computer
The hardware components (those things you can see and touch) for a personal computer include:

• A microprocessor (or system unit). This is the Central Processing Unit (CPU) for a personal computer. While it looks like a box, it contains the memory systems (RAM and ROM) and is really the heart of the system. This unit also contains the disk drives. Since the memory is wiped clean each time you turn the machine off, you need a more permanent storage system. This is provided by keeping separate memory diskettes.

• A keyboard that lets you communicate with the system.

• A video display monitor (like a TV screen) that lets the system communicate with you.

• A printer that can produce a paper copy of whatever you are working on.

• Diskettes. These hold the software programs you wish to run, or give you a place to save your work. A diskette is a small magnetic disk that provides storage space for your data. When a diskette is inserted into a disk drive, it is spun much like a record on a turntable, and "read" electronically. A single-sided diskette can hold the same amount of information as 110 pages of single-spaced text.

One key point to remember is that a computer will do exactly as it is told, and only what it is told. This can lead to what programmers call GIGO, or Garbage In, Garbage Out. If the user doesn't give the computer the correct information to work with, and precise instructions on what to do with that information, he or she will get back incorrect or meaningless answers and results.
Data is entered into the CPU. The CPU is where all the logical and control functions of a computer are carried out. The CPU is actually divided into two areas: a control unit, and the arithmetic and logic unit (ALU). The control unit spends its time figuring out what the computer is supposed to do next, and the ALU actually does it.
Memory is where information and instructions are stored. How does a computer memorize? A series of on/off switches lead an electrical current to a particular location, or address. Information is moved between the CPU and its memory banks by electronic pathways or conduits, called registers.
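As a loose illustration (in Python rather than in hardware), an address is just a numbered slot where a value can be stored and later retrieved:

```python
# Picture memory as a row of numbered locations; an address is an index.
memory = [0] * 8
address = 3
memory[address] = 42        # store a value at address 3
print(memory[address])      # fetch it back by the same address: 42
```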


Saturday, 28 January 2012


Thanks to Madison Avenue, Hollywood, and a horde of science fiction writers, some people have come to believe that computers have minds of their own and are capable of thinking for themselves. While there is some interest and developmental research in the field of artificial intelligence (the so-called fifth generation of computers, which would be able to learn from experience and improve their own performance on any given task), computers as we know them today are basically stupid.
This is an important point to remember, particularly when dealing with someone who has a fear of computers. Essentially, a computer can be thought of as a very fast, very large calculator that can manipulate or process a lot of information under its own control. It will execute any command it is given with precision and speed, but won't go beyond that point. In short, it will do exactly what it is told, and no more. It has no way of telling whether the information it is working on is good or bad, unless it receives further instructions and is given some basis for comparison. The intelligence and control belong entirely to the person who is working with it. Turn it off and its memory can be wiped clean.

It is important to think about computers as one of many tools (such as telephones, calculators, electric typewriters, and pencil sharpeners) that people have at their disposal to help make life a little easier. Like a calculator, a computer can add and subtract quickly and with a high degree of accuracy. When a person uses a calculator, however, a button has to be pushed for each function to be performed. A computer has the capacity to store a series of instructions that, in effect, tell it what buttons to push, and in what order to push them. Most of what computers can do is based on their ability to:
Add two numbers together
Subtract one number from another
Compare numbers or symbols to see if they are the same
The power computers possess comes from their ability to perform multiple functions simultaneously and process tremendous amounts of information in what amounts to the blink of an eye. They are at their best when used for large volume, highly defined tasks.
In order to function effectively, a computer requires:
An input device, so that information can be given to it
Information (or data)
A program to tell it what to do, or how to work with that data
An output device, so that it can display or print out whatever is requested of it
These computer concepts can be found at work in any number of things with which most of us have daily contact. For example:
Scanners such as that pictured in Figure 11, used in the checkout stands at the supermarket (including some that have voice synthesizers)

FIGURE 11. Computerized scanner at a grocery store. Photo by author.
Cash registers at fast-food and other restaurants that not only keep track of cash and sales but also tie into inventory control and reordering
Automatic tellers programmed to transfer money from your account on demand, or perform other services (see Figure 12)
Household appliances, such as microwave ovens and televisions
Automatic gasoline pumps (pictured in Figure 13) that record a purchase, turn on the pump, and keep track of how many total gallons a station is using
Automobile systems that calculate miles per gallon, trip times, and distance

Computers are able to do all these things because they make no distinction between numbers and symbols. Rather, they translate everything into electrical impulses, which form patterns that have meaning for the computer. These patterns form the basis of the computer's numbering system by converting the electrical pulses to a binary system. Binary consists of exactly two digits: 1 (a pulse of electricity) and 0 (no pulse). By stringing 1s and 0s together, the computer converts whatever data it is given into terms it can understand. For example, the binary equivalent of the number 10 is 1010. Binary codes are also assigned to the characters on the keyboard, so that letters, symbols, and spaces are treated the same way numbers are. This is accomplished through an international conversion code called the American Standard Code for Information Interchange (ASCII). Under this code, for example, the letter "B" on a keyboard is given the numeric value 66, which the computer can convert to its binary equivalent of 01000010. When the computer is finished processing the information it is given, it translates everything back into numbers and symbols that we understand. For most of us, there is no reason to ever use binary in communicating with a computer, because this is already handled by the software. The decoding instruction the computer needs to interpret everything is programmed into it by the manufacturer.
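Python can demonstrate exactly these conversions: the built-in ord() gives a character's ASCII value, and format() shows the binary pattern behind it.

```python
value = ord("B")                 # the ASCII value of the letter "B"
print(value)                     # 66
print(format(value, "08b"))      # its 8-bit binary equivalent: 01000010

print(format(10, "b"))           # the number 10 in binary: 1010
```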

FIGURE 12. Computers help make banking more convenient through automated tellers.

FIGURE 13. Computerized gas pumps calculate sales and charge them to the customer's account.


Friday, 27 January 2012


Specially designed systems (usually several computers tied together), used primarily in government or research. These are the most expensive and largest systems available, and they possess tremendous computing power. The cost of operating and maintaining them makes them prohibitive for most organizations. One example of a supercomputer is the one operated by the National Security Agency. Built at a cost
 FIGURE 9. Cray Supercomputer shown here with designer Seymour Cray of Cray Research, Inc.
of approximately $15 million, it is rumored to be capable of 150 to 200 million calculations per second, and has a memory capable of transferring 320 million words per second. This system is reported to be so powerful that the heat it generates would melt it down were it not for a specially designed cooling system.*
At present, there are some 150 supercomputers (similar to that in Figure 9) in operation around the world, with most located in the United States. The latest models have a memory capacity of some two billion bytes and processing speeds 40,000 to 50,000 times faster than a personal computer. Tasks that once took a year to accomplish on a second-generation computer can be done in about a second with a supercomputer.*

Mainframes. These are the large machines that come to mind when most people think of computers. Costing hundreds of thousands of dollars, and requiring specially built facilities and large supporting staffs of operators, programmers, and analysts, they are designed to handle large volumes of work or complex calculations.
Minicomputer. Smaller than a mainframe and generally costing under $200,000, these systems are ideally suited for a medium-sized organization. They require smaller facilities and less staff than the mainframes, but have enough power to process a wide range of commercial or scientific jobs.
Microcomputer. This is where the personal computer fits in. Designed to sit on the top of a desk, and within the financial reach of most organizations and many individuals, these systems represent the latest evolutionary stage. While not yet in the same league as their larger cousins, they can easily match or outperform the computing power of their first- and second- generation ancestors.
Lap-Top Computers. An offshoot of the personal computer, these small systems (many are complete with printer and liquid crystal displays) can offer the same type of power and functionality as a desktop model. Designed for portability, they can travel inside an attaché case, as seen in Figure 10, and can be used just about anywhere.
While personal computers can trace their lineage back several centuries (see Exhibit 5), they are a relatively new phenomenon. The personal computer revolution really got underway in 1969 with the invention of the Intel 4004 microprocessor, which contained 2250 transistors on a single microchip. At first, these were available only to large manufacturers, but in 1971 Intel decided to clear out its stocks by offering the 4004 microprocessor for
*Philip Elmer-DeWitt, "A Sleek, Superpowered Machine," Time (June 17, 1985), p. 53.


Comparison of Computers Then and Now

Figure 8. Comparison of computers then and now. Courtesy of International Business Machines
This process has spanned three generations of growth:
The First Generation: The 1950s. Marked by the arrival of the UNIVAC, first-generation machines are identified by their use of electronic tubes. They were generally capable of executing about 1000 instructions per second and could store no more than 20,000 characters of information. It was during this time that Admiral Grace Hopper, a pioneer of the modern computer age, began what is generally considered the first career as a computer programmer. Hopper also pioneered the development of COBOL, perhaps the most common of all computer languages.

The Second Generation: 1960 to 1965. First-generation computers were considered obsolete by 1960, as transistors replaced tubes. The second-generation computers were considerably smaller than their predecessors and handled in the range of one million instructions per second. The solid-state technology of these systems increased their storage capabilities and reliability, making them more attractive to business and industry. Computer concepts, such as operating systems, time sharing, and data communications, were refined and gained greater use.

The Third Generation: 1965 to the Present. Advances in integrated and printed circuits have spawned the current generation of computers, which are smaller, faster, have more storage capacity, and are more affordable than ever before. There are, of course, many different types of computers available for modern use.



                               Clearly, the machine no longer belonged to its makers.
  The Soul of a New Machine

Computers have been around a lot longer than most of us would like to believe. As a matter of fact, the computer’s lineage can be traced back to 1642 when Blaise Pascal, a French mathematical genius, invented the first real calculating machine. Pascal’s machine used a combination of rotating wheels and gears to perform simple problems of addition and subtraction.
In 1833 Charles Babbage, an English inventor, designed the great-grandfather of modern computers with the introduction of his analytical engine, a forerunner of which is pictured below in Figure 3. The engine was composed of five parts:
(1) A calculating unit (the mill),
(2) The store (memory),
(3) An input device,
(4) A control section, and
(5) A printer. The system was driven by punched cards that fed the basic information into the engine, where it could be processed. Babbage also fathered the basic principles on which the first adding machine was constructed.
In 1842, Lady Augusta Ada Lovelace, a friend of Babbage, wrote the first computer documentation in her paper "Observations of Mr. Babbage's Analytical Engine." A mathematical prodigy,

FIGURE 3.  Babbage’s differential machine, a forerunner of his analytical engine, marked a major step towards the future development of computers. Smithsonian Institution photo number 53190.
Ada established herself as the world's first computer programmer and provided the software for Babbage's engine. In recognition of her contributions, the U.S. Department of Defense named its so-called super language after her, and Ada became a registered trademark of the U.S. government.

The 1840s saw the publication of several papers and theses by the English mathematician George Boole. Boole's theories detailed how logical problems can be solved like algebraic equations. Boolean logic set the stage for the advent of computer science.

In 1890, the first electric calculating machine was invented. Known as the Hollerith tabulator, it used punched cards for the first time. The United States used the Hollerith tabulator (Figure 4) to compute the census, and completed the job in a mere six weeks. Up to that time, it had taken as long as 10 years to prepare the census calculations.

The era of modern computing began in 1925 at the Massachusetts Institute of Technology. There, a team of engineers led by Vannevar Bush developed a large-scale analog calculator. Since it was capable of storing number values electronically, it is considered the forerunner of all that was to follow.

FIGURE 4. Hollerith's tabulator provided a taste of future computing power when first used in figuring the results of the 1890 United States Census. Smithsonian Institution photo number 64563.
The 1930s and 1940s saw a number of advances in computer development, with the development of two of the more famous systems: ENIAC (Electronic Numerical Integrator and Computer) in the United States, and Colossus, the world's first electronic computer, in England. Colossus was placed into operation to decipher the signals of Enigma, the German code machine. Colossus was credited with breaking Enigma's code, which provided the necessary information to help the Allies win the war. Colossus was so secret that it was dismantled at the end of the war, and only one piece is known to survive today. At the end of 1945 ENIAC arrived on the scene and solved its first problem in December of that year. The problem dealt with the hydrogen bomb, and is still considered a classified secret. The ENIAC, a portion of which is shown in Figure 5, was composed of 40 panels, each two feet wide and four feet deep, and housed some 18,000 vacuum tubes. It was capable of handling more than

FIGURE 5. ENIAC, one of the world’s first computers. Courtesy of International Business Machines.
one problem, although it had to be manually programmed by resetting switches, a process that could take up to two days.
Perhaps as a harbinger of things to come, ENIAC was obsolete almost as soon as it was running. A newer generation of stored program computers, which could be programmed electronically (instead of by recabling everything by hand), arrived in 1946 and quickly replaced ENIAC. For all its importance as one of the world’s first electronic computers, ENIAC had neither the power nor the speed of many of today’s hand-held calculators.
At that time, however, the sheer number of vacuum tubes needed to operate these early computers limited their use. Vacuum tubes were always burning out, so only short programs could be run. These machines literally filled entire rooms and were programmed at very low levels, often by a person setting and resetting row after row of switches and by recabling the system. Little wonder that a post-war government report saw little use for such machines and predicted that there might be a need for no more than three or four in the entire country.

That might have been true, had the vacuum tube remained the standard electronic core of a computer. The invention of the transistor in 1947 by Bell Laboratories scientists superseded the vacuum tube. The transistor was compact, used low voltages, and consumed small amounts of power. It freed computers from the need for vacuum tubes and revolutionized the computer industry, setting the stage for today's smaller computer systems.

In 1951, the world's first commercial computer, UNIVAC (Figure 6), was delivered to the Census Bureau. The UNIVAC set the trends for years to come and laid down standards that are followed even today. The original UNIVAC still blinks away at the Smithsonian Institution.

Throughout the 1950s, 1960s, and 1970s, improvements in the construction of transistors opened new doors for computer manufacturing. The first transistors gave way to the integrated circuit, in which a number of transistors and the wiring that connects them were constructed in a single piece. Integrated circuits, in turn, led to the development of wafer-thin silicon chips on which thousands of transistors can be packed into an area about one quarter of an inch square, as in Figure 7.

FIGURE 6. UNIVAC, the world’s first commercial computer. Smithsonian Institution photo number 72-2616.
The development of transistors and microchips led to the creation of bigger and more powerful computers. It also allowed smaller and cheaper machines to come into existence. In short, these developments led to the evolution of several distinct families of computers, as well as to a continuing decrease in the cost of computing power. In fact, since the mid-1970s, the cost of computing power has dropped by an average of 50 percent per year. A comparison of computing power then and now can be seen in Figure 8.

FIGURE 7. Line drawing of a microchip. Illustration by Gina Bean.


Thursday, 26 January 2012


As we saw in the section on fear of computers, technology can have a profound effect on people. Fear, however, is not the only element that should be considered. People can also be affected by:
The physical demands placed on them by the technology.
The way that personal computers can impact job performance.
As more people spend increasing amounts of their day in front of video display terminals (VDT), more concern is being expressed for their physical well-being. This has increased interest in ergonomics, the study of the relationship between the human body and the machines we use.
One reason for the heightened interest is the growing concern over issues of health, comfort, and stress associated with the growing use of personal computers. Studies by several groups have concluded that there are a number of comfort issues related to the use of personal computers and computer terminals. Among the most cited problems are poor lighting and inadequate furniture that was often designed before anyone ever heard of office automation. According to a 1984 study released by the Administrative Management Society Foundation, the most commonly reported problems associated with VDTs include pain in the shoulders, neck, back, arms, and hands. Visual problems include burning eyes, headaches, focusing problems, and stress. These are brought about in part by poor lighting, problems of brightness contrast between characters and background, and flickering.

Stress is also a factor that needs to be considered. Research findings indicate that some users show anxiety, depression, irritability, anger, confusion, and fatigue when working with terminals. Job content can also play a part. According to the Administrative Management Society study, clerical users complain more about discomfort than professional users. This seems to be a result of how personal computers fit into various jobs. For professional workers, they are often problem-solving tools, while data entry is left to the clerks. In short, professional workers often use computers to perform tasks that result in something they can take pride in. Clerks, on the other hand, end up doing the simple, repetitive jobs in which the end results are not as visible or meaningful.

These are all issues that need to be conveyed to managers and owners. Training programs can certainly help make the various levels of an organization aware of the complexities of office automation tools, such as the personal computer, and can help educate people on the human needs in working with such a technology.


Fear of Damaging the Computer

When they first begin working with a personal computer, many people hardly touch the keyboard for fear of doing something that would damage the computer. People who don't think twice about slamming down a telephone receiver, or driving their car to its very limits, suddenly grow passive when confronted with a keyboard. From secretaries to corporate executives to the folks down on the loading dock, there is something about a computer keyboard that can turn each into a shrinking violet.
There are two basic reasons for this initial passivity:
1. The perception that despite their power, computers are extremely fragile devices.
2. The feeling that computers are on a somewhat different plane than other types of tools, the "gift of the gods" syndrome.
Let's take a closer look at each of these observations. The first is a result of the day-to-day contacts that most people have with computers and the limited knowledge they possess about them. From the average person's vantage point, computers are like newborn infants that need special handling and protection. Most people never see the large mainframe systems where they work. These computers are typically locked away behind security doors, in guarded environments with their own air conditioning, heating, electrical, and humidity control systems. From the outside looking in, it would appear that computers require a lot of care and attention. This particular point of view is often strengthened by day-to-day experiences that may be punctuated by periods when the computer isn't available. In the jargon of data processing, people are told of down time, system failures, or crashes, without any idea what those things might really mean. Small wonder that when they suddenly come face to face with a personal computer, some people are reluctant to touch it. After all, if the big ones come tumbling down from time to time, despite the care and attention of experts, what will happen when they start touching one?

The second observation is that some people view computers in a somewhat different light than they do other office tools, maybe because of the sheltered environment that most people associate with the large systems. Computers operate in an almost mystical realm. Movies and popular works of fiction have pictured them as extending human powers beyond those of the body and mind. We think computers can solve complex problems almost in the blink of an eye. What could take a human hours, days, years, or even decades to work through might be processed in a matter of seconds or minutes by a computer. A mystique has grown up around not only the systems themselves, but also around the people who work with them.
In a society that is growing increasingly dependent on technology, many who lack education or insight into computers look on those who can make them work in much the same way that ancient cultures viewed their high priests. As computers have become increasingly insulated, their operations cloaked in jargon and acronyms foreign to most people, many ascribe a certain reverence and awe to everything associated with them. Computers, and those who run them, have come to occupy a special niche beyond the province of the average person.

With the arrival of the personal computer, all this is suddenly changing. Now individual workers are being given access to the same power and magic previously associated with the large systems. For some, this sharing of the technology can be likened to the Greek gods descending from the mountaintop to share their secrets with their mortal followers. Against this background, it's easy to understand why many men and women are apprehensive when it comes to touching a personal computer for the first time.

As computers are extended through organizations, it is important for people to see them in the same light as they do other fixtures of the office, such as telephones and copiers. Some of this will certainly occur over time and with increased usage, and can be facilitated through introductory training programs that emphasize or demonstrate the difficulty of actually damaging a system. The message that should come across is that while a lot of things can happen to the information they are working with, simply banging away on the keyboard won't do much to actually harm the computer itself.



One chief advantage of personal computers is the improved productivity they bring to the applications development process. As anyone who has worked in, or with, data processing knows, the shortage of experienced programmers has led to large backlogs of computer applications. Simply put, people are thinking up more things for the computer to do than there are people to write the programs. The arrival of the personal computer extends the computing power necessary to achieve a lot of these applications and places it directly in the hands of those who are generating the requests. What these people are looking for is instant productivity as a way to get around all those data processing backlogs. What some of them and their organizations are discovering is that there's no such thing as a free lunch.
The problems fall into several categories:
People are creating programs without really thinking them through, and without giving consideration to how their actions may affect others. There are a lot of duplicate programs being created, and an explosion in the number of private files and databases. People are not documenting their programs so that others can use them. Few pay attention to the need for backup and security. There is not much concern over maintaining programs once they have been created. Few people double-check their work to make sure they are doing the right things.

Many people believe that they can handle all their data processing needs simply by plugging some easy-to-use software into a personal computer and having at it. They tend to see personal computers as a way to avoid the long delays and other headaches of getting what they want from "those people up in data processing." It is an unfortunate point of view to take, because it almost guarantees that they will have to learn the lessons of computers the same way all those folks in data processing did: by getting burned a few times. For example, a spreadsheet can be used to develop a budget or financial forecast with a fair degree of certainty that the columns and rows will be added correctly. But who checks to make sure that the right numbers were used, or the right formulas were applied? In data processing, people are taught to test their programs before they trust the results. A lot of computer users go with the first thing produced, or make last-minute adjustments just to see what effect they might have. The more complex the database becomes (calculate the commissions of all salespeople in the state, except those in . . .), the greater the probability of a mistake. This is particularly true of spreadsheets, which are almost seductive in nature. Information can look so good on a spreadsheet, and so authoritative, that people tend not to question it. After all, computers rarely miscalculate anything.
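One simple discipline from that data processing tradition, cross-checking totals before trusting them, can be sketched in a few lines of Python. The figures here are invented for illustration; the point is the habit of verifying a result two independent ways.

```python
# Cross-footing check: the grand total computed from row sums
# must equal the grand total computed from column sums.
# (The numbers are made up, standing in for a small spreadsheet.)
rows = [
    [120.0,  45.5,  80.0],   # one salesperson's monthly figures
    [ 95.0,  60.0, 110.5],
    [200.0,  30.0,  75.0],
]

row_totals = [sum(r) for r in rows]
col_totals = [sum(c) for c in zip(*rows)]

# If these two grand totals disagree, something was entered
# or summed incorrectly, and the output shouldn't be trusted.
assert abs(sum(row_totals) - sum(col_totals)) < 1e-9, "cross-foot mismatch"
print(f"Grand total checks out: {sum(row_totals):.2f}")
```

A check this cheap catches exactly the kind of silent error that an authoritative-looking printout would otherwise hide.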
This same thought process contributes to other problems, such as not taking the time to prepare the documentation that tells others what the program is and how it can be used. Employees who are trying to do something similar are left in the position of having to write another program from scratch. It also means that when the author leaves the organization, there's nothing to explain the program to his or her replacement. The problem is compounded as Joe creates something he thinks is great and shares it with Jane, who adds something and shares it with Pete, who modifies it for use with something developed by Pat. If all of this occurs without any controls or written guidelines or procedures, a major business failure could occur because of uncontrolled application development.

This potential for disaster is enhanced because all too often people are not thinking about such issues as creating backups and security. In fact, most probably never will until they suffer a disaster. What data processing has learned over the past 20 years is about to be relearned by whole new groups of people. For some, learning is going to get expensive. The message won't really be driven home, however, until someone spills coffee on the diskette containing the budget and discovers there isn't another usable copy to be found.

Maintaining programs once they have been created may also prove to be a bone of contention. A lot of people believe that what they are doing is unique, so they don't give too much thought to what they are creating beyond producing one or two reports. A lot of people are writing what they believe are one-shot programs, and their organizations will end up living with them for years to come. Professional programmers and systems analysts learned long ago that the one-shot program that will only be used once and then forgotten doesn't really exist. Once created, such things usually take on a life of their own.
Someone has to maintain the program and the information it contains. Perhaps what data processing fears the most, and with some justification, is the creation of hundreds of new data centers throughout an organization. Since it took the data processing community some 20 years to realize the high costs and dangers of having duplicate files, duplicate data centers will remain a potential powder keg for personal computer users for some time to come. Private files and databases are being created every place a personal computer is available. Often the programs and files created exist only on a floppy disk lying on the user's desk. No one else may know about them because there is no documentation, or they might not have access to them. If something happens to either the person or the diskette, well . . .

One important point that everyone working with a personal computer will have to learn is that none of their actions occur in a vacuum. Everything they do has a potential consequence for someone else. In this regard, the lessons already learned by data processing are worth heeding.
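The backup lesson can be made concrete with a short Python sketch: after copying a file, verify that the copy actually matches the original before relying on it. The file names and contents here are hypothetical, chosen only to illustrate the habit.

```python
import hashlib
import os
import shutil
import tempfile

def file_digest(path):
    """Return the SHA-256 digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical example: back up a "budget" file and verify the copy.
workdir = tempfile.mkdtemp()
original = os.path.join(workdir, "budget.dat")
backup = os.path.join(workdir, "budget.bak")

with open(original, "wb") as f:
    f.write(b"FY budget figures\n")

shutil.copyfile(original, backup)

# A backup you haven't verified is a backup you don't really have.
assert file_digest(original) == file_digest(backup), "backup does not match"
print("backup verified")
```

The same idea applies whatever the storage medium: the coffee-on-the-diskette disaster only hurts when the second copy was never made, or never checked.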



Many managers and executives who are working with personal computers are doing more of the work they used to delegate to their subordinates or secretaries. Higher-ranking managers and executives are doing their own word processing, creating and maintaining mailing lists, assembling organization charts, developing budgets for their operating areas, and working on any number of similar application programs and databases. Access to personal computers can actually create more work at higher levels in some organizations, among a group of people who traditionally complain about the constraints they feel on the use of their time.

There doesn't seem to be much doubt that personal computers can be powerful tools to help someone better manage their resources. It seems equally true, however, that they can also take over a person's schedule and rob them of time. For example, the manager or executive who used to draft memorandums or letters in longhand, or by dictation, and then let a secretary set them up, now does all this work on a personal computer. Some say that it saves time, because they can redo or rearrange written information much more efficiently. Caught up in the magic of word processing, however, a few of these people are spending more time than ever setting up the perfect memorandum. One manager has admitted spending over an hour setting up a memorandum on his personal computer which previously would have taken only 15 or 20 minutes to write and proof. Or take the manager who spent several hours each day creating and updating her status files, and formatting (or reformatting) various types of reports, sometimes just to see how differently they could look. It didn't take long for her staff to begin grumbling about her inaccessibility, and wondering what was going on behind her closed door.
Then there was the executive who became so smitten by the possibilities of what a personal computer might do that she began ordering and working on new software packages at the rate of one every two or three weeks, to judge "its usefulness for my staff." This was at a time when hers was the only personal computer in any of her work areas.

In many cases, learning also takes place behind a closed door, as the manager or executive tries to learn how to use the technology on his or her own. This can be a frustratingly slow hunt-and-peck process for those who never learned to type, and who don't want to be seen as not knowing something as rudimentary as a keyboard. Many of these people are investing large blocks of time in teaching themselves or trying to develop programs, time taken away from the management process. There are those who would argue that this isn't necessarily bad, because managers need to know how to work personal computers in order to understand how they can be used to get work done. While that is certainly true, there are other ways to accomplish this without the managers investing their own time in actually performing the work itself.

This is where training and management education can play an important role. The key question that needs to be addressed for management ranks and above is whether they should be trained on personal computers, or educated as to how they can be used. Managers and executives should be treated as a separate group. Any personal computer curriculum developed for business, industry, or government should include a seminar that addresses the issues of personal computers from the manager's point of view, and that gives managers the facts they need to make informed choices.
There are five primary areas of concern that any such seminar should address:
1. What are the uses of a personal computer?
2. What does management really need to know about computers?
3. How do managers learn what they need to know?
4. How does the use of a personal computer fit into an individual's management or operating style?
5. What are the organizational issues and human ramifications of placing personal computers in the manager's work areas?
Personal computers have the potential for handling a great variety of tasks quickly and efficiently. They are emerging as a new force in the workplace, and their arrival foreshadows changes in the way information is shared, decisions are made, business is conducted, politics are played, and organizations are managed. Yet they also need to be kept in their proper perspective. For this group of individuals, that means that personal computers should be viewed as extensions of the management process that need the same careful time and attention as any other resource under their control.



No issue is as confusing, perplexing, or as potentially explosive as copying software. Since the inception of the Xerox machine, our society has grown increasingly copy oriented. From making photocopies of books and magazines to taping record albums, live broadcasts, and movies, people have come to believe they have a right to reproduce things whenever they choose to. This same belief carries over to computer software, and personal computers make copying such materials an easy task.

The copying of software is a gray area for many people and organizations, particularly for those just getting started. People sometimes confuse legal rulings that uphold the right to copy things in the public domain for private use with the right to copy and use software. The software industry itself adds to this uncertainty by pushing products that may or may not be copy-protected, and by the marketing of site licenses. The copyright law governing software is specific. Unless otherwise specified or agreed to by the developer or manufacturer, the purchaser is entitled to make one backup copy, or to copy the program onto a hard disk. While that might seem straightforward enough, illegal copying of software has developed into a problem of major proportions that is costing software manufacturers millions of dollars in lost revenues. The biggest culprits? Individual users in business, industry, and government.

Copying software is a fairly simple thing to do, particularly when using a dual disk-drive system. Just turn the system on, put the diskette containing the software to be copied in one drive and a blank formatted diskette in the other, and when the prompt appears (A>) type "diskcopy a: b:" and press the Enter key. A few moments later you have an exact duplicate of the original. This is such an easy process that in half an hour one employee can probably make enough copies of a software package to meet the needs of 20 or 30 other people.
In fact, this is exactly what is happening in a lot of organizations. In many cases, employees are also making copies for their own private use. A study conducted by Future Computing and reported in the August 1985 Information Center Magazine suggests that there is one pirated copy of business software in use for every one authorized by the software developer. The study estimates that this cost manufacturers $1.3 billion in lost sales between 1981 and 1984. Other industry analysts believe this to be a conservative estimate, and set the rate of piracy considerably higher.

As might be expected, software companies are reacting strongly to this illegal use of their products, and rhetoric is giving way to action, both in the courts and sometimes through the merchandise itself. In the latter case, some manufacturers are threatening to program "worms" into their software that would be activated if the original program diskette is copied more than once. When transferred to a pirated version, these worms randomly destroy whatever data they come into contact with. This is a very controversial step, and has drawn fire from many business and government quarters. These groups point out that a lot of things can happen to affect the original copy. It can, for example, be erased from a hard disk, copied over if stored on a diskette, or destroyed if the diskette isn't properly handled. The prospect of not having a ready and reliable backup source doesn't appeal to many of them. This leaves litigation as the most viable course of action, and many software companies are taking full advantage of the legal options afforded them.
For example:
Lotus Development Corporation sued the Rixon Corporation for $10 million in damages, charging Rixon with making at least 13 copies of its popular spreadsheet package and distributing them to branch offices. The case was settled out of court. Since then, Lotus has brought suit against numerous other organizations, with several additional settlements. The Association of Data Processing Service Organizations (ADAPSO) brought suit on behalf of several software manufacturers against American Brands and Wilson Jones Company for unauthorized copying.

The tough stance taken by Lotus and ADAPSO can be expected to spread throughout the industry, and could cost offending companies a lot of money if their employees get caught making illegal copies. Ignorance of such activities is not holding up well as a defense either, as several courts have held management responsible for illegal copies made by employees, even though the companies had no knowledge of their employees' activities. A number of civil penalties can be imposed in these cases, including judgments for lost sales, royalties, or profits. Additional damages, as well as court costs and attorney's fees, can be added on top of the original judgment, and some states have enacted fines and penalties that can also be imposed. Criminal penalties may also be imposed on those making illegal copies for profit.
 Essentially, the courts are being asked to answer two questions:
1. What rights and responsibilities do users have?
2. To what extent do users have to follow the terms of the licensing agreements that come with most software packages?
As software developers push their cases in the courts, the general trend seems to be holding for allowing only one backup copy. The landmark Supreme Court case of Sony Corporation of America v. Universal City Studios, 464 U.S. 417 (1984), is often held up as a defense against litigation brought by developers. In that ruling, the Court held that copying television programs for private home use was legal because it made "fair use" of copyrights. Lower courts have tended to discount this in cases involving computer software, because companies that make copies for their own internal use are generally involved in profit-making activities, which could have an adverse effect on the overall market for the software. In other words, it wouldn't be a fair use of the software program.

This places companies in a rather awkward situation as more and more personal computers are brought into the American workplace. Increased numbers mean a greater risk of increased copying. One Houston-based company discovered that 80 of the 120 systems it owned had software installed that wasn't authorized for use. Its internal audit uncovered some 18 different software packages that had been purchased, installed, and copied by employees. By and large, however, most of these suits and other actions have done little to curb software piracy. This has led to another approach by some developers: the site license. Essentially, a site license authorizes a user to make as many copies of a particular software product as are needed in return for one large copyright payment.
Employee education about the copyright protections extended to computer software is generally conceded to be an important starting place in the fight against piracy, and should certainly be included in training programs for every level of employee.
Many companies also require employees to sign statements that they are aware of the copyright provisions for software, and promise not to violate them. While these statements have yet to be tested in the courts, their use has been cited as evidence that companies are becoming more responsive to the problem.

