Recall that the core concept of distributed computing is splitting up the workload among more
than one computer. Its sister concept, parallel processing or computing, takes this concept in a
different direction, putting the power of more processors inside a single computer.
Fig. 1: Distributed Computing vs. Parallel Processing
Parallel processing, or computing, takes the brains of many computers and stuffs them
into one. With more than one processor, the computer can work through tasks much faster than a single processor could.
If parallel processing is hard to imagine, think of the adage, "Many hands make light work."
Here, many processors divvy up a problem, lightening the workload on each other.
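To make the adage concrete, here is a minimal sketch in Python (the essay names no particular language, so the language, the numbers, and the four-way split are all illustrative) in which several worker processes divvy up one large sum:

    # A minimal, illustrative sketch: four processes split one big sum.
    from multiprocessing import Pool

    def sum_chunk(bounds):
        """Each worker sums its own slice of the overall range."""
        start, end = bounds
        return sum(range(start, end))

    if __name__ == "__main__":
        n, workers = 10_000_000, 4
        step = n // workers
        # Divvy the problem into one chunk per processor.
        chunks = [(i * step, (i + 1) * step) for i in range(workers)]
        with Pool(processes=workers) as pool:
            partial_sums = pool.map(sum_chunk, chunks)
        print(sum(partial_sums))  # same answer, potentially in a fraction of the time

Each worker handles one chunk independently, and the partial results are combined at the end: many hands, lighter work.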
Fig. 2: Magnetic Resonance Imaging (MRI) scanner
Like distributed computing, parallel processing has many applications in
today's world. Any problem that requires numerous computations perfectly suits parallel
processing. Hospitals use these parallel machines to analyze images from MRI scanners, similar to the one in Figure 2. Airlines use
them to process customer information and keep track of airplane flights. Engineers use them to
construct the safest possible guardrail on the highway by testing different variables, including
vehicle type and the metal of the rail (Mitchell). Aircraft designers, physicists,
meteorologists, multimedia banks, warehouses that need to track inventory — practically anything
in the scientific, technical, or business field benefits from parallel processing (Shankland). Parallel
processing has the edge over wide-range distributed computing because of the close proximity of
the processors. Because of this, they are able to communicate with each other readily, which is important
in calculations that involve many interdependent steps and variables.
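To see why that ready communication matters, consider a toy sketch (all names and values hypothetical) in which neither of two processes can finish its calculation until it has received the other's intermediate result:

    # Dependent calculations force the processors to talk to each other.
    from multiprocessing import Process, Pipe

    def worker(name, start, conn):
        partial = sum(range(start, start + 100))  # step 1: local work
        conn.send(partial)                        # pass the result along
        other = conn.recv()                       # wait for the partner's result
        # Step 2 depends on BOTH partial results, so neither process
        # could have continued without communicating first.
        print(f"{name}: combined total = {partial + other}")

    if __name__ == "__main__":
        a, b = Pipe()
        p1 = Process(target=worker, args=("worker-1", 0, a))
        p2 = Process(target=worker, args=("worker-2", 100, b))
        p1.start(); p2.start()
        p1.join(); p2.join()

The faster that exchange happens, the less time each processor spends waiting, which is exactly the advantage processors gain from sitting side by side.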
Despite this advantage in communication, parallel processing has its share of disadvantages. Its
software is heavily platform-dependent and has to be written for a specific machine. It also
requires a different, more difficult style of programming, since the software must use
algorithms to divide the work appropriately among the processors (Pountain). Because of
this, there isn't a wide array of shrink-wrapped software ready for use with parallel machines.
Clustering is a form of parallel processing that
takes a group of workstations connected in a local-area network and applies middleware
to make them act like a single parallel machine. Because this method can be used at night, when
networks are idle, it is an inexpensive alternative to dedicated parallel-processing machines (Pountain).
Clustering can work with two separate but similar implementations. A Parallel Virtual Machine,
or PVM, is an environment that allows messages to pass between computers as they would in an
actual parallel machine. A Message-Passing Interface, or MPI, allows programmers to create
message-passing parallel applications, using parallel input/output functions and dynamic process
management.
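As a rough illustration of the message-passing style that PVM and MPI share, the sketch below uses mpi4py, a Python binding for MPI (the binding, the file name, and the numbers are assumptions for illustration; the sources describe the interfaces, not any particular implementation):

    # Illustrative only: each MPI process sums part of the data, then the
    # partial results travel by message back to the root process.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # this process's ID within the group
    size = comm.Get_size()   # how many processes are cooperating

    # Every process works on its own stride of the data...
    local = sum(range(rank, 1000, size))

    # ...and message passing combines the partial results on process 0.
    total = comm.reduce(local, op=MPI.SUM, root=0)

    if rank == 0:
        print(f"combined result: {total}")

Launched with a command such as mpiexec -n 4 python mpi_sum.py, one copy of the program runs on each processor or workstation and communicates only through messages, which is what lets the same model span a single parallel machine or a cluster of networked workstations.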
As with other networking architectures, clustering requires a variety of software packages
working together. One of the most popular software combinations is Beowulf, which utilizes the
Parallel Virtual Machine environment and runs on the free open-source Linux kernel (open-source
means that the inner workings of the program are available freely, as opposed to closed-source
software, like Windows, whose inner workings cannot be seen).
Clustering shares parallel processing's disadvantage of difficult programming, and it adds the
lack of a single packaged solution; Beowulf clusters in particular require assembling several
different programs. However, Compaq and IBM are addressing these shortcomings with packages of
components and management tools (Shankland).