A DIMM, or dual in-line memory module, comprises a series of random-access memory
integrated circuits. These modules are mounted on a printed circuit board and designed for use in
personal computers. DIMMs began to replace SIMMs (single in-line memory modules) as the
predominant type of memory module as Intel's Pentium processors began to control the market.
The main difference between SIMMs and DIMMs is that SIMMs have a 32-bit data path, while DIMMs
have a 64-bit data path. Because Intel's Pentium (like several other processors) has a 64-bit bus
width, it required SIMMs to be installed in matched pairs; the processor would then access the two
SIMMs simultaneously. DIMMs were introduced to eliminate this inefficiency. Another difference is that
DIMMs have separate electrical contacts on each side of the module, whereas the contacts on the two
sides of a SIMM are redundant.
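The matched-pair arithmetic above is simple enough to sketch in code. This is an illustrative helper (the function name and interface are mine, not from any real library) that divides the processor's bus width by the module's data-path width:

```python
def modules_needed(bus_width_bits: int, module_width_bits: int) -> int:
    """Return how many identical modules must be installed together
    to fill the processor's data bus."""
    if bus_width_bits % module_width_bits != 0:
        raise ValueError("bus width must be a multiple of the module width")
    return bus_width_bits // module_width_bits

# A 64-bit Pentium bus needs a matched pair of 32-bit SIMMs,
# but only a single 64-bit DIMM.
print(modules_needed(64, 32))  # -> 2
print(modules_needed(64, 64))  # -> 1
```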
The most common types of DIMMs are:
72-pin SO-DIMM (not the same as a 72-pin SIMM), used for FPM DRAM and EDO DRAM
100-pin DIMM, used for SDRAM in printers
144-pin SO-DIMM, used for SDR SDRAM
168-pin DIMM, used for SDR SDRAM (less frequently for FPM/EDO DRAM in workstations/servers)
172-pin MicroDIMM, used for DDR2 SDRAM
184-pin DIMM, used for DDR SDRAM
200-pin SO-DIMM, used for DDR SDRAM and DDR2 SDRAM
214-pin MicroDIMM, used for DDR2 SDRAM
240-pin DIMM, used for DDR2 SDRAM, DDR3 SDRAM and FB-DIMM DRAM
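The list above maps naturally onto a small lookup table. The sketch below simply transcribes those entries into a Python dictionary keyed by pin count (illustrative, not an exhaustive catalogue of every DIMM variant):

```python
# Pin count -> (form factor, memory technologies), transcribed from the list above.
DIMM_TYPES = {
    72:  ("SO-DIMM", ["FPM DRAM", "EDO DRAM"]),
    100: ("DIMM", ["SDRAM (printers)"]),
    144: ("SO-DIMM", ["SDR SDRAM"]),
    168: ("DIMM", ["SDR SDRAM", "FPM/EDO DRAM"]),
    172: ("MicroDIMM", ["DDR2 SDRAM"]),
    184: ("DIMM", ["DDR SDRAM"]),
    200: ("SO-DIMM", ["DDR SDRAM", "DDR2 SDRAM"]),
    214: ("MicroDIMM", ["DDR2 SDRAM"]),
    240: ("DIMM", ["DDR2 SDRAM", "DDR3 SDRAM", "FB-DIMM DRAM"]),
}

def describe(pins: int) -> str:
    form, memories = DIMM_TYPES[pins]
    return f"{pins}-pin {form}: " + ", ".join(memories)

print(describe(184))  # -> 184-pin DIMM: DDR SDRAM
```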
There are two notches on the bottom edge of 168-pin DIMMs, and the location of each notch identifies a
particular feature of the module. The module is usually 13 cm long for the desktop version and 15 cm for the server version.
The first notch is the DRAM key position. It indicates whether the module is RFU (reserved for future
use), registered, or unbuffered.
The second notch is the voltage key position. It indicates 5.0 V, 3.3 V, or Reserved.
The upper DIMM in the photo is an unbuffered 3.3V 168-pin DIMM.
A DIMM's capacity and timing parameters may be identified with SPD (Serial Presence Detect), an
additional chip which contains information about the module type.
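As a rough illustration of how SPD data is used, the sketch below decodes one field from a raw SPD byte dump. The byte offset and type codes follow the classic JEDEC SPD layout as commonly documented (byte 2 is the fundamental memory type), but treat the exact values here as assumptions for illustration, and the dump itself is fabricated:

```python
# Hedged sketch: decode the memory-type field from a raw SPD dump.
# Assumes the classic JEDEC layout where byte 2 holds the fundamental
# memory type; the codes below are illustrative.
MEMORY_TYPES = {0x04: "SDR SDRAM", 0x07: "DDR SDRAM", 0x08: "DDR2 SDRAM"}

def memory_type(spd: bytes) -> str:
    """Return the module's memory technology from an SPD dump."""
    return MEMORY_TYPES.get(spd[2], f"unknown (0x{spd[2]:02x})")

# Fabricated first three bytes of an SPD dump from a DDR module.
fake_spd = bytes([0x80, 0x08, 0x07])
print(memory_type(fake_spd))  # -> DDR SDRAM
```

On a real system the dump would come from the SPD EEPROM over the SMBus rather than a hard-coded byte string.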
ECC DIMMs are those that have extra data bits which can be used by the system memory controller to
detect and correct errors. There are numerous ECC schemes, but perhaps the most common is Single
Error Correct, Double Error Detect (SECDED), which adds a ninth bit to each byte (72 bits per 64-bit word).
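To make the SECDED idea concrete, here is a toy extended-Hamming code on a 4-bit nibble. Real ECC DIMMs use a (72,64) code over a full 64-bit word; this minimal sketch only shows the mechanism: a nonzero syndrome with odd overall parity pinpoints a single flipped bit, while a nonzero syndrome with even overall parity signals an uncorrectable double error.

```python
# Toy SECDED: extended Hamming(8,4) on a 4-bit value.
# Bit 0 is an overall parity bit; bits 1..7 form a Hamming(7,4) codeword
# with parity bits at positions 1, 2, and 4.

def encode(nibble):
    """Encode 4 data bits into an 8-bit SECDED codeword (as a bit list)."""
    d = [(nibble >> i) & 1 for i in range(4)]
    c = [0] * 8
    c[3], c[5], c[6], c[7] = d[0], d[1], d[2], d[3]  # data positions
    c[1] = c[3] ^ c[5] ^ c[7]   # covers positions with bit 0 set
    c[2] = c[3] ^ c[6] ^ c[7]   # covers positions with bit 1 set
    c[4] = c[5] ^ c[6] ^ c[7]   # covers positions with bit 2 set
    c[0] = sum(c[1:]) & 1       # overall parity over the codeword
    return c

def decode(c):
    """Classify a received codeword: ('ok'|'corrected'|'double-error', pos)."""
    syndrome = 0
    for i in range(1, 8):
        if c[i]:
            syndrome ^= i       # XOR of the positions of the set bits
    overall = sum(c) & 1
    if syndrome == 0 and overall == 0:
        return "ok", None
    if overall == 1:            # odd parity -> a single, correctable error
        return "corrected", syndrome
    return "double-error", None # even parity, bad syndrome -> two errors

word = encode(0b1011)
word[5] ^= 1                    # flip one bit in flight
print(decode(word))             # -> ('corrected', 5)
```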
At its most basic, DDR3 is the current standard for system memory, aka
RAM or, to get more specific, SDRAM. It's the fastest consumer RAM
currently in widespread use, and the type you're most often going to want
to buy (today, at any rate) if you want to upgrade your computer or if
you're planning to build one from scratch. DDR3 has all but replaced the
older DDR and DDR2 in the marketplace, which is why these days
DIMMs using those earlier technologies can be somewhat difficult to find
and expensive to purchase.
But what exactly does the term "DDR3" mean? To understand that, you
need to understand its history.
SDRAM, or synchronous dynamic random access memory, was
developed in the early 1990s to solve a problem that began cropping up
as computers became more powerful. Traditional DRAM used an
asynchronous interface, which means it operated independently of the
processor, which was not ideal if the memory couldn't keep up with all of
the requests the processor made of it. SDRAM streamlined this process
by synchronizing the memory's responses to control inputs with the
system bus, allowing it to queue up one process while waiting for
another. This way, computers could execute tasks much more quickly than
had previously been possible, and SDRAM became the standard memory in computer
systems by the end of the 1990s.
It didn't take long after the introduction of SDRAM for hardware
developers and regular users to determine that even this route had its
limitations. The original SDRAM operated via a single data rate (or SDR)