RAM (random access memory) is the place in a computing device where the operating system (OS), application programs and data in current use are kept so they can be quickly reached by the device’s processor. RAM is much faster to read from and write to than other kinds of storage in a computer, such as a hard disk drive (HDD), solid-state drive (SSD) or optical drive. Data remains in RAM as long as the computer is running. When the computer is turned off, RAM loses its data. When the computer is turned on again, the OS and other files are once again loaded into RAM, usually from an HDD or SSD.
You can compare RAM to a person’s short-term memory and a hard disk to long-term memory. Short-term memory focuses on the work at hand but can keep only so many facts in view at one time. When short-term memory fills up, your brain can sometimes refresh it from facts stored in long-term memory. A computer works the same way. If RAM fills up, the processor must repeatedly go to the hard disk to overwrite old data in RAM with new data, which slows the computer down. Unlike a hard disk, which can become completely full and unable to accept any more data, RAM never runs out of space in that sense, because its contents are continually replaced; however, the combined capacity of RAM and the storage that backs it up can still be used up.
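To make this concrete, here is a minimal Python sketch, assuming the third-party psutil package (not mentioned in the article), that reports how full physical RAM and its disk-backed overflow (swap space or the page file) are on a running system:

```python
# Minimal sketch: inspect how full physical RAM and its disk-backed
# overflow (swap / page file) are, using the third-party psutil package.
# Install with: pip install psutil
import psutil

ram = psutil.virtual_memory()   # physical RAM statistics
swap = psutil.swap_memory()     # disk space used when RAM overflows

gib = 1024 ** 3
print(f"RAM:  {ram.total / gib:.1f} GiB total, {ram.percent}% in use")
print(f"Swap: {swap.total / gib:.1f} GiB total, {swap.percent}% in use")

# When ram.percent stays near 100 and swap.percent keeps climbing, the OS
# is continually moving data between RAM and disk, which is the slowdown
# described above.
```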
DRAM vs. SRAM
RAM comes in two primary forms:

Dynamic random access memory. DRAM makes up the typical computing device’s RAM and, as noted above, needs constant power to hold on to stored data; it must also be refreshed continually to keep its contents.

Static random access memory. SRAM also needs constant power to hold on to data, but unlike DRAM it does not need to be continually refreshed. The way its memory chips are made means they are much larger and thousands of times more expensive than an equivalent amount of DRAM. However, SRAM is significantly faster than DRAM. The price and speed differences mean SRAM is mainly used in small amounts as cache memory inside a device’s processor; a rough illustration of that speed gap follows this list.
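The gap between the processor’s small SRAM cache and main-memory DRAM can be glimpsed with a rough, illustrative Python sketch. It assumes the third-party numpy package, and the array size and timings are arbitrary examples rather than figures from the article: reading memory in order keeps the cache warm, while jumping around forces far more trips out to DRAM.

```python
# Rough illustration of the processor-cache (SRAM) effect.
# Assumes the third-party numpy package (pip install numpy).
import time
import numpy as np

data = np.arange(20_000_000, dtype=np.int64)    # ~160 MB living in DRAM
ordered = np.arange(data.size)                   # indices in memory order
shuffled = np.random.permutation(data.size)      # the same indices, scrambled

def timed_gather(indices):
    """Copy data[indices] and report how long the gather took."""
    start = time.perf_counter()
    data[indices].sum()
    return time.perf_counter() - start

print(f"in-order access:  {timed_gather(ordered):.3f} s")
print(f"scattered access: {timed_gather(shuffled):.3f} s  (more cache misses)")
```

Exact numbers vary by machine, but the scattered pattern is typically several times slower, even though both versions touch exactly the same data.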

History of RAM
RAM is called random access because any storage location — also known as a memory address — can be accessed directly. Originally, the term distinguished regular core memory from offline memory, usually magnetic tape, on which an item of data could be reached only by starting at the beginning of the tape and winding forward to its address sequentially. RAM is organized and controlled in a way that enables data to be stored and retrieved directly at specific locations. Note that other forms of storage, such as the hard disk and CD-ROM, are also accessed directly or randomly, but the term random access is not applied to them.
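As a toy illustration of that distinction (the names and sizes below are made up for the example, not taken from the article), random access jumps straight to an address, while tape-style access must read everything before it:

```python
# Toy contrast between random access and sequential (tape-style) access.
memory = list(range(1_000_000))   # stand-in for RAM: any address is one step away

# Random access: jump straight to address 750_000.
value = memory[750_000]

# Sequential access: a "tape" must be read from the beginning until the
# wanted item turns up.
def read_from_tape(tape, wanted_index):
    for position, item in enumerate(tape):
        if position == wanted_index:
            return item

value_from_tape = read_from_tape(memory, 750_000)
assert value == value_from_tape   # same data, very different amount of work
```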


RAM started out asynchronous: the memory chips ran on a different clock than the processor. This became a problem as processors grew more powerful and RAM could not keep up with their requests for data. In the early 1990s, clock speeds were synchronized with the introduction of synchronous dynamic random access memory (SDRAM). SDRAM soon reached its limit, because it transferred data only once per clock cycle. Around the year 2000, double data rate random access memory (DDR RAM) was developed; it moves data twice in each clock cycle, once at the start and once at the end. The introduction of DDR RAM also shifted how SDRAM is described, as many sources now refer to the original design as single data rate RAM.
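As a back-of-the-envelope illustration, and assuming a DDR4-3200 module with a 1600 MHz I/O clock and a 64-bit bus (figures chosen for the example, not taken from the article), the “double” in double data rate works out like this:

```python
# Back-of-the-envelope peak transfer rate for an assumed DDR4-3200 module.
io_clock_hz = 1_600_000_000      # 1600 MHz I/O clock
transfers_per_cycle = 2          # data moved at both the start and end of each cycle
bus_width_bytes = 8              # 64-bit memory bus

peak_bytes_per_s = io_clock_hz * transfers_per_cycle * bus_width_bytes
print(f"{peak_bytes_per_s / 1e9:.1f} GB/s peak")   # 25.6 GB/s

# The same module moving data only once per cycle (single data rate)
# would top out at half of that.
```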

DDR RAM has since evolved through three further generations: DDR2, DDR3 and DDR4. Each iteration has improved data throughput and reduced power use, but each generation is incompatible with the previous ones, because each handles data in larger batches.

How big is RAM?
RAM is small, both in physical size — it’s stored in microchips — and in the amount of data it can hold. A typical laptop computer may come with 4 gigabytes of RAM, while a hard disk can hold 10 terabytes.
