RAM in review: how random-access memory works
Together with the processor and storage, RAM (random-access memory) is a crucial element in the performance of a computer system. RAM is the working memory where data can be quickly and easily accessed by the CPU, allowing the operating system, applications and running processes to do their work.
High-performance, temporary workspace
You could think of RAM like your short-term memory – while your brain can store a huge amount of information, your mind concentrates on what’s immediately relevant to a specific task. Similarly, a PC or server can store a massive amount of data on hard disk or SSD, but it needs to load the relevant data into RAM to efficiently run an application. And just as recalling a colleague’s name comes more easily than remembering your Year 8 Geography teacher, RAM chips are much faster than standard storage devices.
In terms of performance, it’s usually a case of the more RAM the better. The more RAM you have, the less reliant your CPU is on data in storage, meaning tasks can be executed faster.
Random and volatile
The ‘random-access’ part of RAM refers to how data can always be read or written directly, no matter where it’s physically located on the device, unlike conventional storage which may be limited by moving parts. Most RAM is classed as ‘volatile memory’, meaning data is only retained until the power is switched off. While your OS and files will be automatically loaded back into RAM the next time you turn on the computer, changes will only be permanent if they’re saved to storage.
DRAM and SRAM
The two main forms of RAM are dynamic (DRAM) and static (SRAM). DRAM is the more commonly used type, and needs to be constantly refreshed to retain data. Static random-access memory is a faster, more expensive form of RAM that’s often reserved for specialist roles, such as the cache memory on a processor.
Early RAM was asynchronous, meaning it didn’t run in step with the CPU’s clock. This became a problem when memory couldn’t keep up with the latest processors, but in the early 1990s the development of synchronous dynamic RAM (SDRAM), which ties memory operations to the system clock, solved this issue.
Double data rate (DDR) RAM was developed around 2000. This technology transfers data on both the rising and falling edges of the clock signal, doubling the rate of data transfer relative to the memory’s clock speed, and it was around this time that the original SDRAM was retroactively labelled single data rate (SDR) RAM to distinguish it. Since then, we’ve seen four generations of DDR RAM, with the most widely used versions now being DDR3 and DDR4.
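As a rough illustration of what “double data rate” means for throughput, the back-of-an-envelope calculation below sketches the theoretical peak bandwidth of a single memory channel. The function name and figures here are illustrative, not vendor specifications: a standard DIMM has a 64-bit data bus, and DDR4-3200 runs a 1600 MHz I/O clock with two transfers per cycle.

```python
def peak_bandwidth_gbs(clock_mhz, bus_width_bits=64, transfers_per_cycle=2):
    """Theoretical peak bandwidth (GB/s) of one memory channel.

    DDR transfers data on both clock edges, so effective transfer
    rate = clock * 2 (that's the '3200' in 'DDR4-3200', in MT/s).
    """
    transfers_per_second = clock_mhz * 1e6 * transfers_per_cycle
    bytes_per_transfer = bus_width_bits // 8
    return transfers_per_second * bytes_per_transfer / 1e9

# DDR4-3200: 1600 MHz clock, doubled to 3200 MT/s on a 64-bit bus
print(peak_bandwidth_gbs(1600))  # 25.6 GB/s per channel
```

Real-world throughput is lower than this theoretical figure because of refresh cycles, command overhead and access patterns, but the doubling from the clock edge trick is where DDR’s headline numbers come from.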
DDR3 vs DDR4
The most recent generations of DDR RAM, DDR3 and DDR4, came with performance enhancements and reduced power consumption. Compared to DDR3, DDR4 RAM offers higher data transfer rates and power efficiency improvements that make it ideal for servers and data centres, where the power consumption of hundreds of machines adds up fast.
While the difference in raw performance between DDR3 and DDR4 isn’t massive, choosing a PC or server with DDR4 is a great way to future-proof your system and ensure you’ll be able to continue running the latest applications as they become more and more demanding.
Some RAM comes with a specialist feature known as error-correcting code (ECC). ECC RAM uses an algorithm to detect and correct common errors which can result in crashes or data corruption. While frustrating, these errors aren’t usually a major issue for home users – just a case of turning it off and on again – but for a business, loss of data can be catastrophic. For this reason, ECC RAM is a must-have on servers handling valuable and sensitive data, such as customer information or financial details.
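To give a flavour of how error-correcting codes work, here is a toy Hamming(7,4) code in Python: it stores three extra parity bits alongside four data bits, and the pattern of failed parity checks (the “syndrome”) points directly at any single flipped bit so it can be repaired. This is a simplified sketch of the principle only; real ECC DIMMs use larger SECDED codes protecting 64-bit words, not this tiny code.

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit codeword with 3 parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # Parity bits sit at positions 1, 2 and 4 of the codeword
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Detect and fix a single flipped bit, then return the data bits."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    error_pos = s1 + 2 * s2 + 4 * s3  # 0 means no error detected
    if error_pos:
        c[error_pos - 1] ^= 1         # flip the faulty bit back
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
codeword = hamming74_encode(word)
codeword[5] ^= 1                      # simulate a single-bit memory error
assert hamming74_correct(codeword) == word
```

The key property is that every single-bit error produces a unique syndrome, so the hardware can correct it on the fly without the running software ever noticing.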
At Fasthosts, our range of dedicated servers comes equipped with up to 128GB of DDR4 ECC RAM to guarantee the best possible performance and reliability for your online projects. If you’re not sure what server spec you need, just get in touch with our experts via the Fasthosts website – they’ll be happy to advise you.