COMPUTER BASICS FOR HUMAN RESOURCES PROFESSIONALS

Since computer terminology can often be one of the biggest stumbling blocks to understanding the world of personal computers, I've tried to make things a bit easier by defining new terms at the beginning of the chapter in which they first appear.


Friday, 10 February 2012

Software


As noted earlier, there are software products for just about anything and everything imaginable. From games to programs designed to enhance professional development or monitor diet and nutrition, the list of possibilities seems almost limitless. Business software falls into the following categories:
Database management
Decision support
Word processing
Communications
Specialized use programs
Integrated software programs
DATABASE MANAGEMENT
Database management software allows for recording, modifying, and retrieving information without writing customized programs. Most database programs are designed to help users decide what type of information they want to record. They provide the means to enter, manipulate, and integrate the information to produce summaries, reports, or specific displays of items of interest. Most are set up to handle spur-of-the-moment inquiries, and allow records to be sorted and selected according to specific criteria. Examples of database software include PFS:File, PFS:Report, and dBASE III.
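To make the idea concrete, here is a minimal sketch in Python (not any of the products named above) of what database software does at its core: store records, then sort and select them by chosen criteria. The field names and sample data are invented for illustration.

# A minimal sketch of what database management software does:
# store records, then sort and select them by chosen criteria.
employees = [
    {"name": "Adams", "department": "Payroll", "hire_year": 1983},
    {"name": "Baker", "department": "Benefits", "hire_year": 1979},
    {"name": "Chavez", "department": "Payroll", "hire_year": 1981},
]

# Sort records by a field, like asking for a report ordered by hire date.
by_hire_year = sorted(employees, key=lambda rec: rec["hire_year"])
print("Earliest hire:", by_hire_year[0]["name"])

# Select records that meet a specific criterion (a spur-of-the-moment inquiry).
payroll_staff = [rec for rec in employees if rec["department"] == "Payroll"]

# Produce a simple summary report.
for rec in payroll_staff:
    print(f"{rec['name']:10} hired {rec['hire_year']}")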
DECISION SUPPORT
The basic function of all decision support software, also known as financial modeling software, is to provide:
A quick and easy way to create mathematical models
Ways to enter information into the models
Reports of the results
These products can be broken down into three major categories: spreadsheets (VisiCalc), financial modeling (IFPS), and integrated systems, which contain a database and graphing system (Lotus 1-2-3). All these products are designed to operate like an electronic multicolumnar accounting worksheet, except that these worksheets have variable column widths with built-in math and financial models. Decision support software can be used to prepare budgets, analyze sales figures, calculate cash flow, or handle any other application where the information can be presented in rows and columns.
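To illustrate the row-and-column idea, here is a minimal Python sketch of a spreadsheet-style budget calculation. It is only an analogy for how such products behave; the categories and figures are invented.

# A rough sketch of the electronic-worksheet idea: rows of figures,
# columns of periods, and built-in math applied across them.
budget = {
    "Salaries": [12000, 12000, 12500],   # three monthly columns
    "Training": [800, 650, 900],
    "Supplies": [300, 275, 310],
}

# A column total sums every row in that column.
monthly_totals = [sum(row[month] for row in budget.values()) for month in range(3)]

# A row total sums one category across all periods.
for category, amounts in budget.items():
    print(f"{category:10} {sum(amounts):>8}")

print("Monthly totals:", monthly_totals)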
WORD PROCESSING
Word processing packages allow for the creation, editing, and printing of documents, including correspondence and reports, form letters, legal papers, mailing labels, bills, and even book-length manuscripts. This book, for example, was written on a personal computer using one of the more popular word processing programs. The combination of word processing and personal computers offers more sophistication than can be found on a memory typewriter, and costs a fraction of the amount of a dedicated word processing system. The most fundamental component of any word processing package is its text editor. These editors are designed to edit a screenful of text at a time, rather than line by line. They allow the user to scroll backward and forward through the text, rearrange it, copy it, delete words and phrases, add to an existing document, and, on some, check for spelling. Among the more popular programs are MultiMate, WordStar, and the Volkswriter series.
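To show what a text editor is doing behind the scenes, here is a minimal Python sketch that treats a document as a list of text lines and performs the kinds of operations described above. The sample letter is invented, and real word processors are far more elaborate.

# A minimal sketch of basic editing operations on a document held
# as a list of lines.
document = [
    "Dear Mr. Jones,",
    "Thank you for your recent order.",
    "Sincerely, The Personnel Office",
]

document.insert(2, "We expect to ship it within ten days.")   # add a line
document[0] = "Dear Ms. Jones,"                               # change a line
del document[1]                                               # delete a line

# "Scrolling" amounts to viewing a window of lines at a time.
for line in document[:3]:
    print(line)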
COMMUNICATIONS
Personal computers that are linked together directly, or tied into a host system, require the use of protocol software designed to allow two machines to talk with each other. A protocol is a standard that has been agreed to by hardware and software manufacturers so that different devices can transfer data between them. Without such a standard, two machines might try to send or receive information at the same time, or perform either function out of synchronization. A protocol provides a way for one machine to recognize that a line is tied up to the host, and that it must wait its turn. Protocols also provide the various sets of rules for controlling the transmission of information over any communications channel or cable.
Among the many functions protocol software programs perform are:
Establish and terminate the connection between two systems
Maintain the integrity of the transmission through error detection procedures and requests for retransmission
Identify the sending and receiving machines
Handle a variety of special control functions, such as status checks, to make sure everything is working properly
Some software programs also provide a means to scramble and unscramble data communicated over telephone lines in order to protect the security of the information.
Among the more popular of these products are PC-Talk, Smartcom II, and Crosstalk.
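As a hedged illustration of one protocol duty named above, error detection, here is a toy Python sketch: the sender attaches a checksum to each block, and the receiver recomputes it and asks for retransmission on a mismatch. This is not how PC-Talk, Smartcom II, or Crosstalk actually work internally; real protocols use far stronger checks.

# A toy illustration of protocol-style error detection.
def checksum(data: bytes) -> int:
    # Simple additive checksum; real protocols use stronger methods.
    return sum(data) % 256

def send_block(data: bytes):
    return data, checksum(data)

def receive_block(data: bytes, received_sum: int) -> str:
    if checksum(data) == received_sum:
        return "ACK"   # block accepted
    return "NAK"       # ask the sender to retransmit

block, total = send_block(b"QUARTERLY HEADCOUNT REPORT")
print(receive_block(block, total))                            # ACK
print(receive_block(b"QUARTERLY HEADC0UNT REPORT", total))    # NAK (corrupted)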
SPECIALIZED USE PROGRAMS
Specialized use packages fall into several groups:
Programs that support other software packages or enhance their use
Software that provides a specialized service, sometimes running concurrently with another program
Graphics programs
Among the various packages available for specialized use are those that:
Run desktop organizers, which feature calculator, message board, telephone dialing, and appointment calendar functions that appear as windows, similar to those shown in Figure 28, overlaying whatever software you may be working on.
Print spreadsheet applications sideways so that they fit into reports more naturally.
Increase the computer's processing speed.
Enhance keyboard operations by memorizing keystroke sequences and consolidating commands. With a program such as this, commands that normally take six or seven keystrokes can be reduced to one (a short sketch of this idea appears after this list).

FIGURE 28. Monitor showing window display.
Produce presentation quality charts and graphs as shown in Figure 29.
Prepare line drawings, schematics, blueprints, and high-resolution reproductions.
The list of these specialized programs could go on. Most are available at a reasonable cost, usually for under $100, and many are offered free of charge through local personal computer user groups or electronic bulletin boards.
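The keystroke-consolidation idea mentioned in the list above can be pictured with a small Python sketch: one key stands in for a stored sequence of commands. The key name and command strings here are invented for illustration.

# A minimal sketch of the keystroke-macro idea.
macros = {
    "ALT-P": ["OPEN REPORT", "SET MARGINS 1", "PRINT", "CLOSE"],
}

def press(key):
    # Expand a macro key into its stored sequence, or pass the key through.
    for command in macros.get(key, [key]):
        print("command:", command)

press("ALT-P")   # one keystroke, four commands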


INTEGRATED SOFTWARE
Integrated software programs offer several major functions in one package. Some of these offerings, for example, combine word processing, database management, and spreadsheets along with some form of communications. This is a one-stop-shopping approach, where you get everything in one place at one price. These are large programs with substantial memory requirements, on the order of 640K. Since they offer so much, they also have to scrimp in places to get everything in. This generally means that they can't offer all the sophisticated features available in programs that specialize in just word processing or spreadsheets. Two examples of integrated software are Symphony and Framework.








Wednesday, 1 February 2012

Records and Files of Programs


Computers store and process information in records and files (see Figure 19). A record is a collection of related items that are stored in memory. A file is a collection of related records that are treated as a single unit. For example, suppose you have sent a group of letters to a company called Jim's Shoes and PC Emporium. Within the computer, the file becomes "Jim's Shoes." Each letter sent (and stored) represents one record of that file, as can be seen in Figure 20.
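Here is a minimal Python sketch of that relationship, using the Jim's Shoes example; the dates and subjects are invented. The point is only that a file is one unit holding several related records.

# A file is a collection of related records treated as a single unit.
jims_shoes_file = [                                            # the file
    {"date": "1985-03-01", "subject": "Price quotation"},      # record 1
    {"date": "1985-03-15", "subject": "Delivery schedule"},    # record 2
    {"date": "1985-04-02", "subject": "Invoice follow-up"},    # record 3
]

# Each stored letter is one record of the file.
for number, letter in enumerate(jims_shoes_file, start=1):
    print(f"Record {number}: {letter['date']} - {letter['subject']}")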
When you create a file, you may call it almost anything you like, as long as it does not exceed eight characters in length.


FIGURE 19. Computers manage data much the same as you handle files. Drawing courtesy of International Business Machines.






FIGURE 20. Drawing by Gina Bean.
There are, however, certain names that cannot be used as file names because the operating system reserves them to refer to system components. These include:
CON
AUX
COM1
LPT1
PRN
NUL


A complete list of all file names, whether in the program you are using or in the files you create, is available in a directory. A directory shows the file name, how much space (in bytes) it takes up, and the date it was created. Diskettes are capable of holding up to 112 files each.
When setting up a file, try to make the file name as descriptive of the file's contents as the eight-character limitation will allow (see Figure 21).
EXAMPLES OF FILE NAMES
Good: Table (eight characters or fewer, with no spaces)
Bad: Inventory (has more than eight characters)
Bad: any name that contains a space (spaces are not allowed)


FIGURE 21. Courtesy of International Business Machines.
File name extensions
A file name extension is a brief three-character addition to a file name used to help identify or categorize the type of file it is. Extensions appear after the file name and are separated from it by a period (.). One example might be book.fic, which would identify a series of subfiles within the main file. In this case, the primary file identifies books, and the file name extension further breaks that down to works of fiction.
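As a rough illustration, the rules just described (eight characters or fewer, no spaces, no reserved device names, and an optional extension of up to three characters) can be checked with a short Python sketch. It is only an approximation of the operating system's actual rules.

# A rough check of a file name against the rules described above.
RESERVED = {"CON", "AUX", "COM1", "LPT1", "PRN", "NUL"}

def is_valid_filename(filename):
    name, _, extension = filename.partition(".")
    if not name or len(name) > 8 or " " in name:
        return False
    if name.upper() in RESERVED:
        return False
    if extension and (len(extension) > 3 or " " in extension):
        return False
    return True

print(is_valid_filename("book.fic"))    # True
print(is_valid_filename("Inventory"))   # False: more than eight characters
print(is_valid_filename("CON"))         # False: reserved name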








Programming Languages


As noted earlier, computers work only with binary information. This machine language is really nothing more than a series of 0s and 1s strung together. Machine language, however, is a difficult way for most people to communicate with a computer. To simplify things, a number of high-level programming languages were created. These are much simpler to use and understand because of their similarity to the everyday language of human communication. These programming languages are translated into a code the computer understands, using the firmware installed in it by the manufacturer.
Among the most widely used languages are:
FORTRAN: (Formula Translator). It is used primarily in medical, scientific, and technical applications.
COBOL: (Common Business Oriented Language). It is one of the most popular business and accounting languages.
BASIC: (Beginner's All-Purpose Symbolic Instruction Code). This is as close to English as current programming languages come. BASIC is supplied with many personal computers by their manufacturers.
PASCAL: (Named for the seventeenth-century French mathematician Blaise Pascal). This is a popular programming language with many microcomputer users, who consider it easy to learn.
No matter which language a program is written in, the languages generally share four fundamental activities (a short sketch follows the list). They:
1. Provide a method for getting something into the computer, and getting it out again
2. Make comparisons
3. Decide what activities need to be performed
4. Repeat tasks until a particular job is completed
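A minimal Python sketch of those four activities, written here only for concreteness; the numbers are invented.

values = [42, 17, 58, 9]                 # 1. get information into the program

total = 0
for value in values:                     # 4. repeat a task until the job is done
    if value > 20:                       # 2. make comparisons
        total += value                   # 3. decide what activity to perform
    else:
        print("Skipping", value)

print("Total of large values:", total)   # 1. ...and get the result back out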
With the wide variety of software packages on the market that meet the needs of most users, learning a programming language is probably not as important for the average user as it once was.





Monday, 30 January 2012

PROGRAMS


All the activities a computer performs are guided by a program. A program is simply a set of instructions or commands that tells



 FIGURE 17. A personal computer with a 10 megabyte hard disk can store the equivalent of 3333 pages of text.








FIGURE 18. Three types of computer memory and storage devices: a microchip (64K), a diskette (320K), and a hard disk (10 megabytes +).







the computer how to perform a specific job. You may write your own programs using a programming language, or you may purchase a preprogrammed software package. Such packages will do spreadsheets, word processing, database management, communications, tax preparation, inventory, project management, games, household budgets, and hundreds of other tasks. While a person working with a personal computer might only make a single entry, like entering an amount into a checkbook register, the program that pulls everything together into something meaningful might actually have several steps. From the computer's point of view, the program of instructions might look something like this (a rough version in a modern language follows the list):

1. Take the contents of memory location 5.
2. Add this to the contents of memory location 15.
3. Compare this to the contents of memory location 17.
4. If the contents are the same, store the result in memory location 21.
5. If the contents are not the same, send a warning to the system monitor.
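The same five steps can be sketched in a modern high-level language. Treating numbered memory locations as positions in a Python list is purely an illustrative assumption.

# The five instructions above, sketched in Python.
memory = [0] * 32                        # a small bank of pretend memory locations
memory[5], memory[15], memory[17] = 100, 25, 125

result = memory[5]                       # 1. take the contents of location 5
result += memory[15]                     # 2. add the contents of location 15
if result == memory[17]:                 # 3. compare with location 17
    memory[21] = result                  # 4. if the same, store in location 21
else:
    print("Warning: values do not match")   # 5. otherwise warn the operator

print("Location 21 now holds:", memory[21])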




Sunday, 29 January 2012

TYPES OF MEMORY STORED WITHIN A SYSTEM


There are two types of memory stored within a system:
1. Read Only Memory (ROM)
2. Random Access Memory (RAM)
Read Only Memory is permanently imprinted in your computer when you get it. It is called ROM because it is meant to be read only by the computer itself. The person operating the computer has no control over it. The computer uses this memory to tell itself how to start up when you turn it on. Read Only Memory processes the electrical data flow from the keyboard to the CPU and from the CPU to the video display screen or to any other peripheral equipment you have attached, such as a printer. The decoders that translate numbers and characters into binary information that the computer understands are found in ROM. These programs are called firmware, or nonvolatile, because they are always there, and are not erased or destroyed when the power is turned off.
The other memory a system works with is called Random Access Memory (RAM). This is what some people also call Read/Write Memory, and it is considered volatile because the information stored here is lost whenever the computer is shut off or power is lost. Random Access Memory is controlled by the person working with a system, and it is where the instructions and information needed to get a job done are temporarily stored. Personal computer systems are often described by the amount of short-term (RAM) memory available (i.e., 64K, 256K, 640K). Most software programs also list the amount of memory they require to be stored in RAM.

Most systems allow the user to increase the amount of memory available by installing additional single memory chips or expansion boards containing multiple chips. Even with these, however, there are finite limits to the amount of memory a personal computer has to work with. This is one reason why it is important to decide what kinds of work you would like to perform on a personal computer, and the software most appropriate for accomplishing those tasks, before you purchase the system. This allows the computer to be configured to meet anticipated uses and avoids a problem a lot of people encounter: trying to install a software program that requires more memory than their system has available. In these situations the only recourse is to purchase and install more memory.

A person using a computer comes in contact with all of its working parts. When a system is first turned on, the operating system takes over and makes sure everything under its control is functioning properly. Information is put into the system, where it is stored in memory according to an address code assigned by the computer, as seen in Figure 14. When the program (or list of instructions) is loaded, data are taken from memory.

FIGURE 14. The flow of information processing. Illustration by Gina Bean.

Following the program instructions, the data are then worked on in the Arithmetic Logic Unit (ALU). When all the instructions have been carried out, the information is returned to memory or to an output device.
Output can be delivered through any of several sources:
Video monitor (CRT)
Printer
Voice synthesizer
Modem for transferring information over telephone lines
Storage device (diskette, hard disk, etc.)
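Returning to the memory discussion above, the purchase-planning arithmetic it recommends is simple enough to sketch in Python; the program names and figures are invented for illustration.

# Compare what each program says it needs against the memory installed.
installed_ram_kb = 256

software_requirements_kb = {
    "Word processor": 128,
    "Spreadsheet": 192,
    "Integrated package": 640,
}

for program, needed in software_requirements_kb.items():
    if needed <= installed_ram_kb:
        print(f"{program}: fits in {installed_ram_kb}K")
    else:
        print(f"{program}: needs {needed}K -- add memory before buying")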





 

Friday, 27 January 2012

THE WORLD OF PERSONAL COMPUTERS



Clearly, the machine no longer belonged to its makers.

TRACY KIDDER, The Soul of a New Machine

Computers have been around a lot longer than most of us would like to believe. As a matter of fact, the computer’s lineage can be traced back to 1642 when Blaise Pascal, a French mathematical genius, invented the first real calculating machine. Pascal’s machine used a combination of rotating wheels and gears to perform simple problems of addition and subtraction.
In 1833 Charles Babbage, an English inventor, designed the great-grandfather of modern computers with the introduction of his analytical engine, a forerunner of which is pictured below in Figure 3. The engine was composed of five parts:
(1) A calculating unit (the mill),
(2) The store (memory),
(3) An input device,
(4) A control section, and
(5) A printer.
The system was driven by punched cards that fed the basic information into the engine, where it could be processed. Babbage also fathered the basic principles on which the first adding machine was constructed.
In 1842, Lady Augusta Ada Lovelace, a friend of Babbage, wrote the first computer documentation in her paper "Observations of Mr. Babbage's Analytical Engine."

FIGURE 3. Babbage's difference engine, a forerunner of his analytical engine, marked a major step towards the future development of computers. Smithsonian Institution photo number 53190.
A mathematical prodigy, Ada established herself as the world's first computer programmer and provided the software for Babbage's engine. In recognition of her contributions, the U.S. Department of Defense named its so-called super language after her, and Ada became a registered trademark of the U.S. government.

The 1840s saw the publication of several papers and theses by the English mathematician George Boole. Boole's theories detailed how logical problems can be solved like algebraic equations. Boolean logic set the stage for the advent of computer science.

In 1890, the first electric calculating machine was invented. Known as the Hollerith tabulator, it used punched cards for the first time. The United States used the Hollerith tabulator (Figure 4) to compute the census, and completed the job in a mere six weeks. Up to that time, it had taken as long as 10 years to prepare the census calculations.

The era of modern computing began in 1925 at the Massachusetts Institute of Technology. There, a team of engineers led by Vannevar Bush developed a large-scale analog calculator. Since it was capable of storing number values electronically, it is considered the advent of all that was to follow.

FIGURE 4. Hollerith's tabulator provided a taste of future computing power when it was first used to figure the results of the 1890 United States census. Smithsonian Institution photo number 64563.
The 1930s and 1940s saw a number of advances in computer development, with the development of two of the more famous systems: ENIAC (electronic numerical integrator and computer) in the United States, and Colossus, the world's first electronic computer, in England.

Colossus was placed into operation to decipher the signals of Enigma, the German code machine. Colossus was credited with breaking Enigma's code, which provided the necessary information to help the Allies win the war. Colossus was so secret that it was dismantled at the end of the war, and only one piece is known to survive today.

At the end of 1945 ENIAC arrived on the scene and solved its first problem in December of that year. The problem dealt with the hydrogen bomb, and it is still considered a classified secret. The ENIAC, a portion of which is shown in Figure 5, was composed of 40 panels, each two feet wide and four feet deep, and housed some 18,000 vacuum tubes.

FIGURE 5. ENIAC, one of the world’s first computers. Courtesy of International Business Machines.
It was capable of handling more than one problem, although it had to be manually programmed by resetting switches, a process that could take up to two days.
Perhaps as a harbinger of things to come, ENIAC was obsolete almost as soon as it was running. A newer generation of stored program computers, which could be programmed electronically (instead of by recabling everything by hand), arrived in 1946 and quickly replaced ENIAC. For all its importance as one of the world’s first electronic computers, ENIAC had neither the power nor the speed of many of today’s hand-held calculators.
At that time, however, the sheer number of vacuum tubes needed to operate these early computers limited their use. Vacuum tubes were always burning out, so only short programs could be run. These machines literally filled entire rooms and were programmed at very low levels, often by a person setting and resetting row after row of switches and by recabling the system. Little wonder that a post-war government report saw little use for such machines and predicted that there might be a need for no more than three or four in the entire country.

That might have been true if the vacuum tube had remained the standard electronic core of a computer. The invention of the transistor in 1947 by Bell Laboratories scientists superseded the vacuum tube. The transistor was compact and used low voltages and small amounts of power. It freed computers from the need for vacuum tubes and revolutionized the computer industry, setting the stage for today's smaller computer systems.

In 1951, the world's first commercial computer, UNIVAC (Figure 6), was delivered to the Census Bureau. The UNIVAC set the trends for years to come and laid down standards that are followed even today. The original UNIVAC still blinks away at the Smithsonian Institution.

Throughout the 1950s, 1960s, and 1970s, improvements in the construction of transistors opened new doors for computer manufacturing. The first transistors gave way to the integrated circuit, in which a number of transistors and the wiring that connects them were constructed in a single piece. Integrated circuits, in turn, led to the development of wafer-thin silicon chips on which thousands of transistors can be packed into an area about one quarter of an inch square, as in Figure 7.


FIGURE 6. UNIVAC, the world’s first commercial computer. Smithsonian Institution photo number 72-2616.
The development of transistors and microchips led to the creation of bigger and more powerful computers. It also allowed smaller and cheaper machines to come into existence. In short, these developments led to the evolution of several distinct families of computers, as well as to a continuing decrease in the cost of computing power. In fact, since the mid-1970s, the cost of computing power has dropped by an average of 50 percent per year. A comparison of computing power then and now can be seen in Figure 8.



FIGURE 7. Line drawing of a microchip. Illustration by Gina Bean.





Thursday, 26 January 2012

THE SPREAD OF APPLICATIONS DEVELOPMENT

One chief advantage of personal computers is the improved productivity they bring to the applications development process. As anyone who has worked in, or with, data processing knows, the shortage of experienced programmers has led to large backlogs of computer applications. Simply put, people are thinking up more things for the computer to do than there are people to write the programs. The arrival of the personal computer extends the computing power necessary to achieve a lot of these applications and places it directly in the hands of those who are generating the requests. What these people are looking for is instant productivity as a way to get around all those data processing backlogs. What some of them and their organizations are discovering is that there's no such thing as a free lunch.
The problems fall into several categories:
People are creating programs without really thinking them through, and without considering how their actions may affect others.
A lot of duplicate programs are being created, along with an explosion in the number of private files and databases.
People are not documenting their programs so that others can use them.
Few pay attention to the need for backup and security.
There is not much concern over maintaining programs once they have been created.
Few people double-check their work to make sure they are doing the right things.
Many people believe that they can handle all their data processing needs simply by plugging some easy-to-use software into a personal computer and having at it. They tend to see personal computers as a way to avoid the long delays and other headaches of getting what they want from "those people up in data processing." It is an unfortunate point of view to take, because it almost guarantees that they will have to learn the lessons of computers the same way all those folks in data processing did: by getting burned a few times.

For example, a spreadsheet can be used to develop a budget or financial forecast with a fair degree of certainty that the columns and rows will be added correctly. But who checks to make sure that the right numbers were used, or the right formulas were applied? In data processing, people are taught to test programs before they trust the results. A lot of computer users go with the first thing produced, or make last-minute adjustments just to see what effect they might have. The more complex the database becomes (calculate the commissions of all salespeople in the state, except those in . . .), the greater the probability of a mistake. This is particularly true of spreadsheets, which are almost seductive in nature. Information can look so good on a spreadsheet, and so authoritative, that people tend not to question it. After all, computers rarely miscalculate anything.

This same thought process contributes to other problems, such as not taking the time to prepare the documentation that tells others what the program is and how it can be used. Employees who are trying to do something similar are left in the position of having to write another program from scratch. It also means that when the author leaves the organization, there is nothing to explain the program to his or her replacement. The problem is compounded as Joe creates something he thinks is great and shares it with Jane, who adds something and shares it with Pete, who modifies it for use with something developed by Pat. If all of this occurs without any controls, written guidelines, or procedures, a major business failure could occur because of uncontrolled application development.

This potential for disaster is enhanced because all too often people are not thinking about such issues as creating backups and security. In fact, most probably never will until they suffer a disaster. What data processing has learned over the past 20 years is about to be relearned by whole new groups of people. For some, the learning is going to get expensive. The message won't really be driven home, however, until someone spills coffee on the diskette containing the budget and discovers there isn't another usable copy to be found. Maintaining programs once they have been created may also prove to be a bone of contention.
A lot of people believe that what they are doing is unique, so they don't give too much thought to what they are creating beyond producing one or two reports. A lot of people are writing what they believe are one-shot programs, and their organizations will end up living with them for years to come. Professional programmers and systems analysts learned long ago that the one-shot program that will only be used once and then forgotten doesn't really exist. Once created, such things usually take on a life of their own. Someone has to maintain the program and the information it contains.

Perhaps what data processing fears the most, and with some justification, is the creation of hundreds of new data centers throughout an organization. Since it took the data processing community some 20 years to realize the high costs and dangers of having duplicate files, duplicate data centers will remain a potential powder keg for personal computer users for some time to come. Private files and databases are being created every place a personal computer is available. Often the programs and files created exist only on a floppy disk lying on the user's desk. No one else may know about it because there is no documentation, or they might not have access to it. If something happens to either the person or the diskette, well . . .

One important point that everyone working with a personal computer will have to learn is that none of their actions occur in a vacuum. Everything they do has a potential consequence for someone else. In this regard, the lessons already learned by data processing are worth taking to heart.
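Returning to the earlier point about testing programs before trusting their results, here is a minimal Python sketch of that habit applied to a spreadsheet-style calculation. The 5 percent commission rule and the sample figures are invented.

# Test a formula against cases worked out by hand before trusting it
# on the real sales figures.
def commission(sales, rate=0.05):
    return round(sales * rate, 2)

assert commission(0) == 0.00
assert commission(1000) == 50.00
assert commission(2500) == 125.00

print("Formula checks passed; commission on 18400:", commission(18400))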


