Computer
Revision as of 16:19, 23 April 2007
During World War II, the first computers (electronic machines that perform numerical calculations far faster than humans) were developed by the British and U.S. governments as a result of secret military projects. These first computers did not remain secret for long; they were adopted by private industry, and they quickly grew in usefulness while decreasing in size and cost. Today, computers are ubiquitous household objects, perhaps unrecognized in the form of a tiny microprocessor embedded in a gadget such as a phone or a TV remote. Even defining the word computer may spark a debate, because so many different kinds of computers exist, and they are used for so many different kinds of activities.
The history of computing is complex. The desire for computers had existed for a long time, but technology was not yet advanced enough to realize them. People had hankered after mechanical devices to help with mathematical calculations, inventing the abacus, the slide rule, and a host of mechanical adding machines. But the electronic computer's rapid evolution forever changed science, the military, and business. The computer has vastly expanded human ability to store and share information; as such, its invention may be a milestone for humanity on a par with the advent of writing and materials to write on (millennia ago), or with the invention of the printing press (~1450).
The nature of computing
Some people define a computer as a machine for manipulating data according to a list of instructions known as a program. However, this definition may only make sense to people who already know what a computer can do. Computers are extremely versatile. In fact, they are universal information-processing machines, but at the deepest level, what they really do is perform arithmetic. Today, most computers do arithmetic using the binary number system, because a binary number can be represented by an array of on-off switches, with each 0 or 1 digit, or bit, stored in one switch. In early electronic computers, the switches used for each digit were electromagnetic switches, also called relays. Later, vacuum tubes replaced electromagnetic relays, and eventually transistors took the place of both relays and tubes. Transistors can now be manufactured as tiny devices, almost molecular in size, embedded within silicon chips. These tiny transistorized computers work on the same principles as the first, giant relay- and vacuum-tube-based computers (which occupied entire buildings).
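The idea that a binary number is simply an array of on-off switches can be sketched in a few lines of Python. The function names below are illustrative, chosen for this sketch rather than taken from any standard library:

```python
# A sketch of binary representation: each bit (0 or 1) models the
# state of one on-off switch, as described in the text above.

def to_bits(n, width=8):
    """Return the binary digits of a non-negative integer n,
    most significant bit first, padded to `width` switches."""
    return [(n >> i) & 1 for i in reversed(range(width))]

def from_bits(bits):
    """Reassemble the integer from its list of 0/1 switch states."""
    value = 0
    for bit in bits:
        value = (value << 1) | bit
    return value

print(to_bits(42))             # [0, 0, 1, 0, 1, 0, 1, 0]
print(from_bits(to_bits(42)))  # 42
```

Whether the switch is a relay, a vacuum tube, or a transistor changes only the speed and size of the machine, not the arithmetic it performs.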
Academic disciplines and professional societies
Computers and mathematics are closely related. Initially, mathematicians and scientists were the only users of computers. But today, what we tend to think of as a computer consists not only of the underlying hardware, with its limited instruction set that performs arithmetic, but also an operating system, which is a set of programs that allows people to use the computer more easily. The operating system is software (programs running on a computer). Without an operating system, a computer is not useful; the operating system helps people to write new programs for the computer and to perform many other activities on a computer. Since the early 1980s, universities have offered degrees in academic disciplines such as computer science and computer engineering, devoted to the design of hardware and software for computers. These general fields of study soon came to consist of many sub-fields. In addition, most academic disciplines, and most businesses, use computers as tools.
Below are some of the professional and academic disciplines that teach the techniques to construct, program, and use computers. There is often overlap of functions and terminology across these categories:
- computer architecture (the study of how computers work, and how specific computers can be built)
- programming languages (specifications for how people ought to write computer programs)
- compilers (programs that translate code written in a programming language into a form a computer can execute)
- computer engineering (a branch of electrical engineering that focuses both on hardware and operating system design)
- computer science (the academic study of computers and computation, including aspects of both theory and implementation)
- software engineering (management of the process of creating complex software systems)
- information systems (use of computer systems, usually in a business or organizational context)
- geographical information systems (combining latitude and longitude information with computer mapping programs)
Professional societies dedicated to computers include the British Computer Society, the Association for Computing Machinery (ACM) and the IEEE Computer Society.