RAM (random access memory)
What is RAM (random access memory)?
Random access memory (RAM) is the hardware in a computing device that provides temporary storage for the operating system (OS), software programs and any other data in current use so they're quickly available to the device's processor. RAM is often referred to as a computer's main memory, as opposed to the processor cache or other memory types.
Random access memory is considered part of a computer's primary memory. It is much faster to read from and write to than secondary storage, such as hard disk drives (HDDs), solid-state drives (SSDs) or optical drives. However, RAM is volatile; it retains data only as long as the computer is on. If power is lost, so is the data. When the computer is rebooted, the OS and other files must be reloaded into RAM, usually from an HDD or SSD.
How does RAM work?
The term random access, or direct access, as it applies to RAM means that any storage location can be accessed directly via its memory address, in any order. RAM is organized and controlled in a way that enables data to be stored and retrieved directly to and from specific locations. Other types of storage -- such as an HDD or CD-ROM -- can also be accessed directly and randomly, but the term random access isn't used to describe them.
Originally, the term random access memory was used to distinguish regular core memory from offline memory. Offline memory typically referred to magnetic tape from which a specific piece of data could be accessed only by locating the address sequentially, starting at the beginning of the tape.
RAM is similar in concept to a set of boxes organized into columns and rows, with each box holding either a 0 or a 1 (binary). Each box has a unique address that is determined by counting across the columns and down the rows. A set of RAM boxes is called an array, and each box is known as a cell.
To find a specific cell, the RAM controller sends the column and row address down a thin electrical line etched into the chip. Each row and column in a RAM array has its own address line. Any data that's read from the array is returned on a separate data line.
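To make the row-and-column idea concrete, the following Python sketch models a tiny RAM array in which any cell can be read or written directly by its address. The class name and layout are invented for illustration; real memory arrays are addressed by hardware row and column lines, not software lists.

```python
# A toy model of a RAM array: cells addressed by row and column.
# Illustrative only -- real DRAM is addressed by hardware row/column
# strobes, not Python lists.

class ToyRamArray:
    def __init__(self, rows, cols):
        # Every cell starts out holding a 0 bit.
        self.cells = [[0] * cols for _ in range(rows)]

    def write(self, row, col, bit):
        """Store a single bit at the given row/column address."""
        self.cells[row][col] = bit & 1

    def read(self, row, col):
        """Return the bit stored at the given row/column address."""
        return self.cells[row][col]

# Any cell can be reached directly, in any order -- the essence of
# "random access."
ram = ToyRamArray(rows=4, cols=8)
ram.write(2, 5, 1)
print(ram.read(2, 5))  # -> 1
print(ram.read(0, 0))  # -> 0
```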
RAM is physically small and stored in microchips. The microchips are gathered into memory modules, which plug into slots in a computer's motherboard. A bus, or a set of electrical paths, is used to connect the motherboard slots to the processor.
RAM is also small in terms of the amount of data it can hold. A typical laptop computer might come with 8 GB or 16 GB of RAM, while a hard disk might hold 10 TB of data. A hard drive stores data on a magnetized surface that looks like a vinyl record. Alternatively, an SSD stores data in memory chips that, unlike RAM, are non-volatile. They don't require constant power and won't lose data if the power is turned off.
How much RAM do you need?
Most PCs enable users to add RAM modules up to a certain limit. Having more RAM in a computer cuts down on the number of times the processor must read data from the hard disk or solid-state drive, an operation that takes far longer than reading data from RAM. RAM access times are measured in nanoseconds, while SSD access times are measured in microseconds and HDD access times in milliseconds.
Random access memory can hold only a limited amount of data, much less than secondary storage such as an SSD or HDD. If RAM fills up and additional data is needed, the system must free up space in RAM for the new data. This process might involve moving data temporarily to secondary storage, often by swapping or paging files. Such operations can significantly affect performance, which is why it's important that a system has enough RAM to support its workloads.
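The gap between RAM and storage access times can be seen, very roughly, with a quick experiment. The sketch below is only a demonstration of scale: the file name is arbitrary, and operating system caching, drive type and file system all affect the numbers.

```python
# Rough, illustrative comparison of reading a byte already in RAM vs.
# reading the same byte back from a file on disk. Treat the output as a
# demonstration of scale, not a benchmark -- OS caching alone can skew it.

import os
import time

data = bytes(1_000_000)           # ~1 MB of data held in RAM
path = "scratch.bin"              # arbitrary scratch file for the test
with open(path, "wb") as f:
    f.write(data)

start = time.perf_counter()
_ = data[500_000]                 # access a byte that is already in RAM
ram_time = time.perf_counter() - start

start = time.perf_counter()
with open(path, "rb") as f:       # fetch the same byte from storage
    f.seek(500_000)
    _ = f.read(1)
disk_time = time.perf_counter() - start

print(f"RAM access:     {ram_time * 1e9:,.0f} ns")
print(f"Storage access: {disk_time * 1e9:,.0f} ns")
os.remove(path)
```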
The amount of RAM needed depends on how the system is being used. When editing video, for example, it's recommended that a system have at least 16 GB of RAM, though more is desirable. For image editing, Adobe recommends at least 8 GB of RAM to run Photoshop Creative Cloud on a Mac. However, if the user is working with multiple applications at the same time, even 8 GB of RAM might not be enough, and performance will suffer.
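To see how much RAM a particular machine has and how much of it is in use, one option is a short script such as the following. It assumes the third-party psutil package is installed; the operating system's own tools, such as Task Manager, Activity Monitor or /proc/meminfo, expose the same information.

```python
# Report installed and available RAM. Assumes the third-party psutil
# package (pip install psutil); OS tools such as Task Manager, Activity
# Monitor or /proc/meminfo provide the same details.

import psutil

mem = psutil.virtual_memory()
print(f"Total RAM:     {mem.total / 2**30:.1f} GiB")
print(f"Available RAM: {mem.available / 2**30:.1f} GiB")
print(f"In use:        {mem.percent:.0f}%")
```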
Types of RAM
RAM comes in two primary forms:
- Dynamic random access memory (DRAM). DRAM is typically used for a computer's main memory. As was previously noted, it needs continuous power to retain stored data. DRAM is cheaper than SRAM and offers a higher density, but it produces more heat, consumes more power and is not as fast as SRAM.
Each DRAM cell stores a positive or negative charge held in an electrical capacitor. This data must be refreshed with an electric charge every few milliseconds to compensate for charge leaking out of the capacitor (a simple simulation of this refresh cycle appears below). A transistor serves as a gate, determining whether a capacitor's value can be read or written.
- Static random access memory (SRAM). This type of RAM is typically used for the system's high-speed cache, such as L1 or L2. Like DRAM, SRAM needs constant power to hold on to data, but it doesn't need to be continually refreshed the way DRAM does. SRAM is more expensive than DRAM and has a lower density, but it produces less heat, consumes less power and offers better performance.
In SRAM, instead of a capacitor holding the charge, the transistors act as switches, with one state representing a 1 and the other a 0. Static RAM requires several transistors -- typically six -- to retain one bit of data, compared with dynamic RAM, which needs only one transistor and one capacitor per bit. This is why SRAM chips are much larger and more expensive than an equivalent amount of DRAM.
Because of the differences between SRAM and DRAM, SRAM is mainly used in small amounts, most notably as cache memory inside a computer's processor.
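The refresh requirement that separates DRAM from SRAM can be illustrated with a toy simulation. The leak rate, threshold and refresh interval below are arbitrary values chosen only to show the idea: a charged cell decays over time and keeps its value only if it is periodically rewritten.

```python
# Toy illustration of DRAM refresh: a cell's capacitor leaks charge, so a
# stored 1 decays toward 0 unless the controller periodically rewrites it.
# The numbers are arbitrary and chosen purely for clarity.

LEAK_PER_TICK = 0.2    # fraction of charge lost per time step (made up)
READ_THRESHOLD = 0.5   # charge above this level reads back as a 1

def read_after(ticks, refresh_every):
    charge = 1.0       # cell written with a 1 (capacitor fully charged)
    for t in range(1, ticks + 1):
        charge -= LEAK_PER_TICK               # capacitor leaks
        if refresh_every and t % refresh_every == 0:
            charge = 1.0                      # refresh rewrites full charge
    return 1 if charge >= READ_THRESHOLD else 0

print(read_after(ticks=10, refresh_every=2))  # refreshed -> still reads 1
print(read_after(ticks=10, refresh_every=0))  # never refreshed -> reads 0
```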
History of RAM: RAM vs. SDRAM
RAM was originally asynchronous because the RAM microchips had a different clock speed than the computer's processor. This was a problem as processors became more powerful and RAM couldn't keep up with the processor's requests for data.
In the early 1990s, clock speeds were synchronized with the introduction of synchronous dynamic RAM, or SDRAM. By synchronizing a computer's memory with the inputs from the processor, computers were able to execute tasks faster.
However, the original single data rate SDRAM (SDR SDRAM) reached its limit quickly. Around the year 2000, double data rate SDRAM (DDR SDRAM) was introduced. DDR SDRAM moved data twice in a single clock cycle -- on both the rising and falling edges of the clock signal.
Since its introduction, DDR SDRAM has continued to evolve. The second generation was called DDR2, followed by DDR3 and DDR4, and finally DDR5, the latest generation. Each generation has brought higher data throughput and reduced power use. However, each generation is incompatible with the earlier ones because data is handled in larger batches.
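The practical effect of moving data twice per clock cycle shows up in a module's peak transfer rate, which can be estimated as transfers per second multiplied by the width of the data bus. The figures below are nominal values for a standard DDR4-3200 module with a 64-bit bus; other modules and generations simply plug in different numbers.

```python
# Back-of-the-envelope peak bandwidth for a DDR memory module:
# transfers per second x bytes moved per transfer.
# Example values are nominal figures for a DDR4-3200 module.

memory_clock_hz = 1600 * 10**6              # 1600 MHz memory clock
transfers_per_second = memory_clock_hz * 2  # DDR: two transfers per cycle -> 3200 MT/s
bus_width_bytes = 64 // 8                   # standard 64-bit module bus

peak_bandwidth = transfers_per_second * bus_width_bytes
print(f"Peak bandwidth: {peak_bandwidth / 10**9:.1f} GB/s")  # ~25.6 GB/s
```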
GDDR SDRAM
Graphics DDR (GDDR) is another type of SDRAM that is used in graphics and video cards. Like DDR SDRAM, the technology enables data to be moved at multiple points in a clock cycle. However, GDDR runs at higher voltages and has less strict timing than DDR SDRAM.
With parallel tasks, such as 2D and 3D video rendering, tight access times aren't as necessary, and GDDR can enable the higher speeds and memory bandwidth needed for graphics processing unit (GPU) performance.
Similar to DDR, GDDR has gone through several generations of development, with each version delivering greater performance and lower power consumption. GDDR7 is the latest generation of graphics memory.
RAM vs. virtual memory
A computer can run short on main memory, especially when running multiple programs simultaneously. Operating systems can compensate for physical memory shortfalls by creating virtual memory.
With virtual memory, the system temporarily transfers data from RAM to secondary storage and increases the virtual address space. This is accomplished by using active memory in RAM and inactive memory in the secondary storage to form a contiguous address space that can hold an application and its data.
With virtual memory, a system can load larger programs or run multiple programs at the same time, letting each operate as if it has ample memory without having to add more RAM. The virtual address space can be far larger than the amount of physical RAM installed. A program's instructions and data are initially assigned virtual addresses. When the program is executed, those addresses are translated to actual memory addresses.
One downside to virtual memory is that it can cause a computer to operate slowly because data must be mapped between the virtual and physical memory. With physical memory alone, programs work directly from RAM.
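The translation step described above can be sketched in a few lines. The page size, page-table contents and error handling here are invented for illustration; real systems use hardware-assisted, multi-level page tables.

```python
# Very simplified model of virtual-to-physical address translation.
# Page size and the page table's contents are invented for illustration;
# real OSes use hardware-assisted, multi-level page tables.

PAGE_SIZE = 4096  # 4 KiB pages, a common size

# Hypothetical page table: virtual page number -> physical frame number.
# None means the page has been moved out to secondary storage.
page_table = {0: 7, 1: 3, 2: None, 3: 12}

def translate(virtual_address):
    page, offset = divmod(virtual_address, PAGE_SIZE)
    frame = page_table.get(page)
    if frame is None:
        # The OS would now load the page back into RAM (a page fault).
        raise RuntimeError(f"page fault: virtual page {page} is not in RAM")
    return frame * PAGE_SIZE + offset

print(hex(translate(0x1234)))     # virtual page 1, offset 0x234 -> 0x3234
try:
    translate(2 * PAGE_SIZE)      # page 2 has been swapped out
except RuntimeError as err:
    print(err)
```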
RAM vs. flash memory
Flash memory and RAM are both made up of solid-state chips. However, the two memory types play different roles in computer systems because of differences in how they're made and perform, as well as their cost.
Flash memory is used for longer-term data storage. RAM receives data from the flash-based SSD and provides it to the processor (via the cache). In this way, the processor gets the data it needs much more quickly than if it had to retrieve it directly from the SSD.
One significant difference between RAM and flash memory is that data must be erased from NAND flash memory in entire blocks. This makes flash slower to update than RAM, where data can be changed at the level of individual bits. However, NAND flash memory is less expensive than RAM and is non-volatile, which means it can hold data even when the power is off, unlike RAM.
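The block-erase behavior can be mocked up with a small example. The block size and cell layout are invented; the point is only that changing one value in flash means copying, erasing and reprogramming the surrounding block, whereas a RAM location can simply be overwritten.

```python
# Toy contrast between a RAM-style in-place write and a NAND-style block
# rewrite. Block size and layout are invented purely for illustration.

ram = [0, 1, 1, 0, 1, 0, 0, 1]
ram[3] = 1                        # RAM: one cell is simply overwritten

flash_block = [0, 1, 1, 0]        # one (unrealistically small) NAND block

def change_flash_bit(block, index, value):
    """Changing one bit means copying, erasing and reprogramming the block."""
    staged = list(block)          # copy the block's contents elsewhere
    staged[index] = value         # apply the single-bit change to the copy
    for i in range(len(block)):
        block[i] = 1              # erase the whole block (erased cells read as 1s)
    for i, bit in enumerate(staged):
        block[i] = bit            # program every cell with the new contents
    return block

change_flash_bit(flash_block, 2, 0)
print(ram, flash_block)           # -> [0, 1, 1, 1, 1, 0, 0, 1] [0, 1, 0, 0]
```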
RAM vs. ROM
Read-only memory, or ROM, is computer memory containing data that can only be read, not written to (except for the initial writing). ROM chips are often used to store startup code that runs each time a computer is turned on. The data generally can't be altered or reprogrammed.
The data in ROM is non-volatile, so it isn't lost when the computer power is turned off. As a result, ROM can be used for permanent data storage. RAM, on the other hand, can hold data only temporarily. A computer's ROM chip generally holds only several megabytes of storage, while RAM typically accommodates several gigabytes.
Trends and future directions
Resistive random access memory (RRAM or ReRAM) is non-volatile storage that can alter the resistance of the solid dielectric material of which it's composed. ReRAM devices contain a memristor in which the resistance varies when different voltages are applied.
ReRAM creates oxygen vacancies, which are physical defects in a layer of oxide material. These vacancies represent two values in a binary system, similar to a semiconductor's electrons and holes.
ReRAM has a higher switching speed compared to other non-volatile storage technologies, such as NAND flash. It also holds the promise of higher storage density and less power consumption than NAND flash. This makes ReRAM a good option for memory in sensors used for industrial, automotive and internet of things (IoT) applications.
Vendors struggled for years to develop the ReRAM technology and get chips into production. However, they've been making slow but steady progress, and several vendors are now shipping ReRAM devices.
At one point, the memory industry had placed a great deal of hope in storage-class memory (SCM) technologies such as 3D XPoint. 3D XPoint has a transistor-less, cross-point architecture in which selectors and memory cells are at the intersection of perpendicular wires. 3D XPoint isn't as fast as DRAM, but it's faster than NAND and provides non-volatile memory.
However, the only significant outcome of this effort was Intel's Optane product line, which included both SSDs and memory modules. The hope was that Optane could eventually fill the gap between dynamic RAM and NAND flash memory, serving as a bridge between them.
In terms of performance and price, Optane positioned itself between fast but costly DRAM and slower, less expensive NAND flash. Unfortunately, the technology never took off, and Intel has discontinued its Optane development efforts. The future of Optane and similar SCM technologies remains uncertain.
Boosting performance with LPDDR5
In February 2019, the JEDEC Solid State Technology Association published JESD209-5, the Low Power Double Data Rate 5 (LPDDR5) standard. LPDDR5 memory promised data rates up to 6400 megatransfers per second (MT/s), double the 3200 MT/s of the first version of LPDDR4.
In July 2019, Samsung Electronics began mass producing the industry's first 12 Gb LPDDR5 mobile DRAM. According to Samsung, the DRAM was optimized for enabling 5G and AI features in future smartphones. Since then, a number of other vendors have come out with LPDDR5 memory, with capacities now reaching 64 GB.
LPDDR5 promises to significantly boost memory speed and efficiency for a variety of applications, including mobile computing devices such as smartphones, tablets and ultra-thin notebooks, as well as high-end laptops such as the MacBook Pro.
Cost of RAM
DRAM prices dropped significantly in early 2023, but that trend reversed by the end of the year, with prices continuing to climb. Earlier in the year, there had been an oversupply of DRAM, due in part to lower demand. In response, manufacturers cut production, which began pushing prices back up.
The market could, in fact, see a significant increase in prices in 2024, depending on production and inventory levels, as well as product demand. According to analyst firm TrendForce, the contract price for an 8 GB DDR5 memory module averaged about $17.50 at the end of November 2023, which was up 2.94% from the previous month. Whether the prices continue to climb or drop back down, the DRAM market remains as volatile as ever.
Further explore the differences between RAM and flash memory, as well as the differences between cache and RAM.