Memory consists of random access memory (RAM) cells that can be accessed in any order. The memory cells are laid out on a silicon chip in an array of columns (bitlines) and rows (wordlines). The intersection of a bitline and a wordline is the address of the memory cell.
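The row/column addressing above can be sketched as splitting a flat address into a wordline index and a bitline index. This is a minimal illustration with a made-up array geometry (the text does not give one):

```python
NUM_COLS = 1024  # assumed number of bitlines (columns) per row

def cell_address(addr):
    """Return the (row, column) intersection that identifies one cell."""
    return addr // NUM_COLS, addr % NUM_COLS

# Example: flat address 2050 lands at row 2, column 2.
print(cell_address(2050))  # (2, 2)
```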
SDRAM: Synchronous dynamic random access memory takes advantage of the burst mode concept to greatly improve performance. It does this by staying on the row containing the requested bit and moving rapidly through the columns, reading each bit as it goes. The idea is that most of the time the data needed by the CPU will be in sequence. SDRAM is about five percent faster than EDO RAM and is the most common form in desktops today. Maximum transfer rate to L2 cache is approximately 528 MBps.
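The burst-mode idea above can be sketched as a toy simulation (not the real SDRAM command protocol): the row is "opened" once, and consecutive columns are then streamed without re-selecting it:

```python
def burst_read(memory, row, start_col, burst_length):
    """Read burst_length consecutive cells from one open row."""
    open_row = memory[row]  # a single row activation serves the whole burst
    return [open_row[c] for c in range(start_col, start_col + burst_length)]

# Tiny toy array: 4 rows of 8 cells, cell value = its flat address.
memory = [[r * 8 + c for c in range(8)] for r in range(4)]
print(burst_read(memory, row=1, start_col=2, burst_length=4))  # [10, 11, 12, 13]
```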
DDR SDRAM: Double data rate synchronous dynamic RAM is just like SDRAM except that it has higher bandwidth, meaning greater speed, because it transfers data on both the rising and falling edges of the clock. Maximum transfer rate to L2 cache is approximately 1,064 MBps (for DDR SDRAM 133 MHz).
RDRAM: Rambus dynamic random access memory is a radical departure from the previous DRAM architecture. Designed by Rambus, RDRAM uses a Rambus in-line memory module (RIMM), which is similar in size and pin configuration to a standard DIMM. What makes RDRAM so different is its use of a special high-speed data bus called the Rambus channel. RDRAM memory chips work in parallel to achieve a data rate of 800 MHz, or 1,600 MBps. Since they operate at such high speeds, they generate much more heat than other types of chips. To help dissipate the excess heat, Rambus chips are fitted with a heat spreader, which looks like a long thin wafer. Just as there are smaller versions of DIMMs, there are also SO-RIMMs, designed for notebook computers.
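The peak rates quoted in the last two entries are just transfers per second times bytes per transfer. A quick sanity check of the arithmetic, assuming bus widths the text does not state (a 64-bit, 8-byte DDR module and a 16-bit, 2-byte Rambus channel):

```python
def peak_mbps(transfers_per_second, bytes_per_transfer):
    """Peak transfer rate in MBps = transfers/s x bytes per transfer."""
    return transfers_per_second * bytes_per_transfer / 1_000_000

ddr_peak_mbps = peak_mbps(133_000_000, 8)    # 133 million transfers/s, 8-byte bus
rdram_peak_mbps = peak_mbps(800_000_000, 2)  # 800 million transfers/s, 2-byte channel
print(ddr_peak_mbps, rdram_peak_mbps)  # 1064.0 1600.0
```

Both results match the figures given in the text.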
Memory Caching: Caching keeps computer costs low while increasing performance. A cache is a small amount of fast memory that the computer checks first, before going to the much larger but slower main memory. There can be multiple levels of cache, such as L1 and L2: L1 is the smallest and is checked first, while L2 is somewhat larger and is checked after an L1 miss. A lookup for rarely used data can actually be slower, since the caches are searched and missed before main memory is consulted, but on average caching increases speed.
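The L1-then-L2-then-memory search order can be sketched as a toy lookup over dictionaries standing in for each level (an illustration of the idea only, not how hardware caches are indexed):

```python
l1 = {"a": 1}                            # smallest, fastest, checked first
l2 = {"a": 1, "b": 2}                    # larger, slower, checked on an L1 miss
main_memory = {"a": 1, "b": 2, "c": 3}   # full data store, checked last

def lookup(key):
    """Search each level in order, fastest first."""
    for level in (l1, l2, main_memory):
        if key in level:
            return level[key]
    raise KeyError(key)

print(lookup("b"))  # 2 (misses L1, found in L2)
```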