Explain the difference between Pipelining and Vector Processing
1. A pipeline is a set of data processing elements connected in series, so that the output of one element is the input of the next, whereas a vector processor (array processor) is a CPU design whose instruction set includes operations that perform the same mathematical operation on multiple data elements simultaneously.
2. The elements of a pipeline work on different items concurrently, in parallel or in time-sliced fashion, so several instructions are in flight at once, whereas a vector processor applies a single instruction to an entire vector of operands; by contrast, a scalar processor handles one element at a time using multiple instructions.
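The contrast in point 1 can be illustrated with a minimal Python sketch (the vector width and function names here are illustrative assumptions, not taken from any real instruction set): a scalar processor issues one add instruction per element, while a vector processor issues one instruction per whole group of elements.

```python
VECTOR_WIDTH = 4  # assumed lanes per vector register (e.g. four 32-bit ints)

def scalar_add(a, b):
    """Scalar style: one ADD instruction issued per element pair."""
    issued = 0
    out = []
    for x, y in zip(a, b):
        out.append(x + y)  # one element per "instruction"
        issued += 1
    return out, issued

def vector_add(a, b):
    """Vector style: one VADD instruction issued per VECTOR_WIDTH elements."""
    issued = 0
    out = []
    for i in range(0, len(a), VECTOR_WIDTH):
        # Hardware would apply this to all lanes of the register at once.
        out.extend(x + y for x, y in zip(a[i:i + VECTOR_WIDTH],
                                         b[i:i + VECTOR_WIDTH]))
        issued += 1
    return out, issued

a, b = list(range(8)), list(range(8))
print(scalar_add(a, b)[1])  # 8 instructions issued
print(vector_add(a, b)[1])  # 2 instructions issued
```

Both produce the same sums; the vector version simply needs far fewer instruction issues, which is the whole appeal of vector processing for regular numeric workloads.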
What is the difference between pipelining and vector processing?
A vector processor, or array processor, is a CPU design where the instruction set
includes operations that can perform mathematical operations on multiple data
elements simultaneously. This is in contrast to a scalar processor which handles one
element at a time using multiple instructions. The vast majority of CPUs are scalar (or
close to it). Vector processors were common in the scientific computing area, where
they formed the basis of most supercomputers through the 1980s and into the 1990s,
but general increases in performance and flexibility of mainstream processor designs
saw the near disappearance of the vector processor as a general-purpose CPU.
Today most commodity CPU designs include single instructions for some vector
processing on multiple (vectorised) data sets, typically known as SIMD (Single
Instruction, Multiple Data); common examples include SSE and AltiVec. Modern video
game consoles and consumer computer-graphics hardware rely heavily on vector
processing in their architecture. In 2000, IBM, Toshiba and Sony collaborated to create
the Cell processor, consisting of one scalar processor and eight vector processors, which
found use in the Sony PlayStation 3 among other applications.
Other CPU designs may include multiple instructions for vector processing on
multiple (vectorised) data sets, an approach typically known as MIMD (Multiple
Instruction, Multiple Data). Such designs are specialised, built for dedicated
purposes, and are not commonly marketed for general-purpose applications.
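The MIMD idea can be sketched in Python with threads (an illustrative model only, not any real architecture): each processing unit runs its own instruction stream on its own data.

```python
import threading

def mimd_run(tasks):
    """Run each (function, data) pair in its own thread: different
    instruction streams acting on different data, the essence of MIMD."""
    results = [None] * len(tasks)

    def worker(i, fn, data):
        results[i] = fn(data)

    threads = [threading.Thread(target=worker, args=(i, fn, d))
               for i, (fn, d) in enumerate(tasks)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# Two "units": one sums a list, one measures a string --
# different instructions applied to different data.
print(mimd_run([(sum, [1, 2, 3]), (len, "hello")]))  # [6, 5]
```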
A further approach multiplies activity not across instructions executing in parallel
but across instructions in sequence, overlapping their execution stages; this leads
to the pipelining concept.
In software engineering, a pipeline consists of a chain of processing elements (processes,
threads, coroutines, etc.), arranged so that the output of each element is the input of the
next. Usually some amount of buffering is provided between consecutive elements. The
information that flows in these pipelines is often a stream of records, bytes or bits.
The concept is also called the pipes and filters design pattern. It was named by analogy
to a physical pipeline.
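The pipes-and-filters structure described above can be sketched with Python generators, where each stage consumes the previous stage's output stream (the stage names are purely illustrative):

```python
def read_records(lines):
    # Stage 1: produce a stream of stripped records.
    for line in lines:
        yield line.strip()

def filter_comments(records):
    # Stage 2: drop records that are comments.
    for r in records:
        if not r.startswith("#"):
            yield r

def to_upper(records):
    # Stage 3: transform each surviving record.
    for r in records:
        yield r.upper()

# Connect the stages: the output of each element is the input of the next.
raw = ["alpha\n", "# a comment\n", "beta\n"]
pipeline = to_upper(filter_comments(read_records(raw)))
print(list(pipeline))  # ['ALPHA', 'BETA']
```

Because generators are lazy, each record flows through the chain one at a time, much like the buffered streams of records, bytes, or bits described above.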