A computer is a machine designed to perform a wide variety of information-processing tasks, depending on the program stored in its memory. It ultimately processes (computes) everything as mathematical operations, although many non-mathematical functions can be performed. Computers manipulate all data in the form of numbers, and conventional computers encode all numbers in binary code, which is base 2. Binary means that any unit of data can only be a 1 or a 0. This makes it possible to process data electronically, since one voltage level can represent a zero and another voltage level a one. Quantum computers work differently, using quantum bits ("qubits") rather than ordinary binary digits; they are far more complicated and still in development. The average personal or server computer uses binary.
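Binary representation can be made concrete with a few conversions. The snippet below uses Python's built-in `bin`, `int`, and `ord` functions; the specific numbers are just examples:

```python
# Every number a computer stores is ultimately a pattern of ones and zeros.
for n in [0, 1, 5, 13]:
    print(n, bin(n))      # e.g. 13 is stored as the bit pattern 1101

# Converting a bit pattern back to a decimal number: int() with base 2.
print(int("1101", 2))     # 13

# Even text is numbers underneath: each character has a numeric code.
print(ord("A"), bin(ord("A")))  # 65 0b1000001
```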
- 1 History
- 2 Hardware Classifications
- 3 Usage Classifications
- 4 Thin vs. fat clients
- 5 Architecture
- 6 Binary and quantum computing
- 7 See also
- 8 References
- 9 External links
Prior to the advent of computing machines, a "computer" was a human being who performed complex mathematical calculations, often with the aid of a manual counting device such as an abacus or a slide rule. Such work was performed by traders and early bankers, among others, to keep reliable records of funds.
Charles Babbage, a 19th-century British scientist, has been credited as the designer of the first digital computer, the Difference Engine, a machine intended to perform calculations reliably to many decimal places. It was entirely mechanical. However, the full Engine was never constructed; government funding was withdrawn, and Babbage complained that he "had derived no emolument whatsoever from the government." He also designed an even more sophisticated "Analytical Engine," which would have been a programmable computer in the modern sense; it, too, was mechanical. Though it was never built, Ada Lovelace (born Augusta Ada Byron) wrote programs for it, including one to compute Bernoulli numbers, making her arguably the world's first computer programmer. A portion of the Analytical Engine was built after Babbage's death by his son Henry.
Babbage later designed a simpler yet more capable "Difference Engine No. 2", which was also not completed in his lifetime. But in 1991, the bicentennial of Babbage's birth, this computer was built at the Science Museum in London from Babbage's original plans. It operates flawlessly, though vastly slower than modern electronic computers, and is operated by turning a crank.
Mainframes were the first kind of commercially used digital computers, and were long the fastest option available, able to process far more information at a faster rate than any other kind of computer. However, with the rapid advancement of technology in recent decades, they have fallen behind somewhat. They occupy an enormous amount of space and use much more electricity than modern computers, and in many cases they have since been replaced with the client-server architecture. However, mainframes (mainly the IBM System Z) are still used by a number of companies. Although an older technology, they are still powerful, durable, and reliable systems; some have been in fault-tolerant operation for at least forty years.
Manufacturers of mainframes included Burroughs, CDC, UNIVAC, and IBM. IBM mainframes were the most commonly used in the world, and are still being manufactured today.
Most mainframes, including the IBM System Z, run applications written in the COBOL language (short for COmmon Business-Oriented Language), a programming language designed to be easy to read. For example, a line of COBOL code is:
MULTIPLY HOURLY-RATE BY TOTAL-HOURS GIVING TOTAL-PAY.
Although COBOL is not used exclusively on mainframes, this is one of its main uses. In 2009, 70-75% of the world's business transactions and 90% of global financial transactions were processed by COBOL. In 2016, it was estimated that at least 60% of business systems still used it, much of it running on mainframes. Although now an older technology, mainframes remain a vital part of many systems.
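For comparison, the same payroll calculation can be sketched in a modern general-purpose language; the Python below mirrors the COBOL statement, using example values that are not from the article:

```python
# Equivalent of: MULTIPLY HOURLY-RATE BY TOTAL-HOURS GIVING TOTAL-PAY.
hourly_rate = 15.50   # illustrative values only
total_hours = 40
total_pay = hourly_rate * total_hours
print(total_pay)  # 620.0
```

The COBOL version reads almost like English, which was precisely its design goal for business users.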
For a more detailed treatment, see Supercomputer.
Supercomputers are the most powerful computers available. These machines, which generally fill large rooms as mainframes do, are typically used by government agencies, research institutions, and large corporations. Since these computers are so expensive, they are reserved for important and complicated computations such as nuclear simulations and weather modeling. Quantum computers, because of their size and cost, are sometimes grouped with supercomputers, though they work on entirely different principles.
Minicomputers were smaller, cheaper cousins of the mainframe computer, though also slower. Over time, minicomputers became available in many configurations and sizes, up to the threshold of mainframes. In the 1970s, a typical minicomputer cost between $100,000 and $200,000 and could support up to 63 timeshared users. Larger minicomputers had large power requirements and often required special rooms with adequate air conditioning to remove the heat they generated. The first commercially successful minicomputer was the PDP-8, released by Digital Equipment Corporation in 1965. Unlike a mainframe, the entire computer could fit on top of a desk. Minicomputers fell out of favor when powerful microcomputers became widely available.
Microcomputers are computers that use microprocessors for the CPU. Some used the same instruction set as minicomputer CPUs, but implemented in one or two microprocessor chips, which allowed them to run the same software. In some cases, these computers could timeshare several users just as minicomputers did. However, increasing CPU power and lower costs eventually led to microcomputers being used by a single user at a time and being called "personal computers" (see below).
Workstations were typically single-user computers with specialized hardware to support a given application. One common use of workstations was for CAD/CAM (Computer-Aided Design/Computer-Aided Manufacturing), which allowed engineers to develop machines, parts, or buildings on the computer before physically constructing the items. This typically required high-end graphical hardware and monitors, large amounts of RAM, and CAD/CAM software unique to the given workstation. As a consequence, workstations were much more expensive than general-purpose single-user computers. With the advent of more powerful CPUs and their additional memory capacity, along with mass-market graphics cards (designed for games) that exceeded workstation graphics capabilities at far lower cost, workstations were largely supplanted by cheaper microcomputers. The specialized CAD/CAM software, however, still must be purchased. Sometimes the term "workstation" is loosely used to mean any personal computer dedicated to a specific task.
Personal computers are those designed for use by a single user at any given time. These are the common desktop and laptop computers most people use today; tablet computers could also be included in this general category. Some businesses include personal computers in a client-server network, in which the personal computers connect to a main server computer (or, in some cases, a server cluster).
Client-server is an architecture for connecting computers together. It is a powerful approach that separates the "back end" computation, performed on a computer referred to as the server, from one or more client computers which present an interface to the user. In other words, a server provides services or data, while a client requests those services from the server. In many cases, server computers host websites, send and receive e-mail messages, store information, and handle other basic services offered on the Internet. Individuals connect to the Internet using computers which serve as clients of those servers. Some servers perform less publicly accessible functions, such as crawling websites or running advanced calculations. Servers are also used by some companies to store records which are only available on their intranet (local network); for example, hospitals often keep patient records on a secure server. A computer can act as a server to some clients while at the same time being a client of another server.
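The request-and-response pattern described above can be sketched with Python's standard `socket` module. The port number and the trivial "echo" service are illustrative assumptions, not part of the article:

```python
# A minimal client-server sketch using only Python's standard library.
import socket
import threading
import time

PORT = 50007  # arbitrary unprivileged port, chosen for illustration

def run_server():
    """The server: waits for one client, answers its request, then exits."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", PORT))
        srv.listen(1)
        conn, _addr = srv.accept()             # block until a client connects
        with conn:
            request = conn.recv(1024)          # the client's request
            conn.sendall(b"echo: " + request)  # the server's response

def run_client():
    """The client: connects to the server and requests a service."""
    for _ in range(50):                        # retry until the server is up
        try:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
                cli.connect(("127.0.0.1", PORT))
                cli.sendall(b"hello")
                return cli.recv(1024)
        except ConnectionRefusedError:
            time.sleep(0.1)
    raise RuntimeError("server never came up")

server_thread = threading.Thread(target=run_server)
server_thread.start()
reply = run_client()
server_thread.join()
print(reply)  # b'echo: hello'
```

In a real deployment the server would run on a separate machine and handle many clients concurrently; here both roles run in one process purely for illustration.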
Servers can also be clustered so that, although they appear to be one server, a large number of servers are actually running. When multiple server computers are used, the storage, processing, and client capacity of the whole increase dramatically. This fact has led to the creation of many "server farms," buildings that house little other than thousands of servers.
Dedicated computers are, as the name suggests, dedicated to a specific function. These can be almost anything, from video game consoles to a child's handheld toy. They are generally limited in functionality, since they are designed for a single task only.
Thin vs. fat clients
"Fat clients" are client devices which can process data themselves, without requiring a remote server or mainframe. Personal computers, for example, are fat clients, since they process data on their own. "Thin clients" are user terminals which do little or no data processing themselves; these are much less common than they were in the past. In older systems, client terminals were little more than links to a mainframe. The user would enter a command and a set of data at the terminal, and it would be passed on to the mainframe. The mainframe would then store, retrieve, or process the data as needed, and return anything needed to the terminal.
The newer client-server architecture has largely done away with such thin clients, but some networks still use this sort of system. Sometimes there is no operating system or local storage at all on a client "computer": everything is downloaded from and uploaded to a central server. Most people now use fat clients, which only occasionally need to communicate with a server.
Cloud computing has, however, started bringing back thin clients to some degree. As "Software as a Service" and other cloud-based solutions become more popular, thick clients (personal computers) are to some extent being made into thinner clients again.
All computer systems, regardless of classification or usage, consist of the following components:
- Processing. Provided by one or more CPUs, sometimes with ancillary processors.
- Memory. High-speed local storage, such as random-access memory (RAM).
- Software. Each unique computer hardware environment needs software that works with the model of processor used in that computer.
- Input/Output. Computers are only useful if they can generate output. Output can vary from control lines for engine components to characters on a video screen. Input also varies in form, but most computers need input to process, which then drives the output.
Most modern computer systems also have:
- Storage. Large amounts of lower-speed, less-expensive hardware where data can be stored for later retrieval. Storage can be local, or it can be remotely accessed via a network. The most typical forms of local storage are hard disks, SSDs, thumb drives, and DVD-ROMs.
Binary and quantum computing
Binary has always been used by digital computers and is somewhat familiar to most people. However, experimental quantum computers now take a fundamentally different approach.
For a more detailed treatment, see Binary system.
Since the beginning of the digital computer, and even before this (when computers were mechanical), computers have always used "ones" and "zeros" for every calculation and operation.
- The processor and its circuitry operate on electrical signals: one voltage level represents a "one," while another (or the absence of a signal) represents a "zero"
- RAM cells are held in one state ("one") or the other ("zero")
- Magnetic disks are magnetized in one of two directions, representing "zero" and "one"
- Optical disks use microscopic pits, read by a laser, to distinguish "zero" from "one"
- Flash memory stores electrical charge in microscopic cells, each set to "one" or "zero"
Every part of a binary computer uses these "ones" and "zeros" for everything.
For a more detailed treatment, see Quantum computer.
A new and evolving kind of computer breaks the "rules" of computer design entirely. Using properties of quantum mechanics, these computers make use of quantum states of matter which can generally only be maintained very near absolute zero. The basic unit of a quantum computer, the qubit, can represent not only a one or a zero but also a superposition of the two, a sort of "both one and zero" at once. Quantum processors must be kept incredibly cold, however. Due to this requirement, as well as many other technical issues, quantum computers are not widely available commercially. For now, they remain largely restricted to laboratories where space and liquid helium or liquid nitrogen are plentiful. However, it is believed that they may eventually complement or even replace binary computers for certain tasks.
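The "both one and zero" idea can be illustrated numerically. A qubit's state is described by two amplitudes whose squared magnitudes give the probabilities of measuring a 0 or a 1; this is a toy calculation, not how real quantum hardware is programmed:

```python
import math

# A qubit state is a pair of amplitudes (alpha, beta)
# satisfying |alpha|^2 + |beta|^2 = 1.
alpha = 1 / math.sqrt(2)   # amplitude for the |0> state
beta = 1 / math.sqrt(2)    # amplitude for the |1> state

p0 = abs(alpha) ** 2       # probability of measuring 0
p1 = abs(beta) ** 2        # probability of measuring 1
print(round(p0, 2), round(p1, 2))  # 0.5 0.5 -- an equal superposition

# A classical bit is the special case where one amplitude is 1 and the other 0.
```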
- Ada Lovelace
- Brain-computer interface
- IBM-HP-Dell and Microsoft Windows
- Apple Computer
- Apple iOS for iPhone-iPod-iPad
- Acorn computers
- ZX Spectrum
- The World Book Encyclopedia. 2001 ed. Vol. 4. Chicago: World Book, 2001. Print. Pages 908-94
- "mainframe." Encyclopaedia Britannica. Britannica Academic. Encyclopædia Britannica Inc., 2016. Web. 31 May. 2016. <http://0-academic.eb.com.www.consuls.org/EBchecked/topic/358715/mainframe>.
- Personal interview of an anonymous individual with User:DavidB4
- "Computer." Encyclopaedia Britannica. Britannica Academic. Encyclopædia Britannica Inc., 2016. Web. 30 Apr. 2016. <http://0-academic.eb.com.www.consuls.org/EBchecked/topic/130429/computer>.
- "client-server architecture." Encyclopaedia Britannica. Britannica Academic. Encyclopædia Britannica Inc., 2016. Web. 31 May. 2016. <http://0-academic.eb.com.www.consuls.org/EBchecked/topic/1366374/client-server-architecture>.
- "quantum computer." Encyclopaedia Britannica. Britannica Academic. Encyclopædia Britannica Inc., 2016. Web. 31 May. 2016. <http://0-academic.eb.com.www.consuls.org/EBchecked/topic/746092/quantum-computer>.