Dense Footprint Cache

Boosts Application Speeds by 9.5 Percent

Researchers from North Carolina State University and Samsung Electronics have found a way to boost the speed of computer applications by more than 9 percent.

Computers store the data to be manipulated off-chip in main memory (RAM). Data the processor needs frequently is also temporarily stored in a die-stacked DRAM (Dynamic Random Access Memory) cache, which allows that data to be retrieved more quickly.

This data is stored in large blocks, or macro-blocks, which makes it easier for the processor to locate the data it needs, but also means additional, unwanted data contained in the macro-blocks is retrieved along with it, wasting time and energy.
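To make the waste concrete, here is a back-of-the-envelope illustration in Python. The 2 KB macro-block and 64 B block sizes are illustrative assumptions, not figures from the paper.

```python
# Illustrative only: macro-block and block sizes are assumptions,
# not taken from the paper.
MACRO_BLOCK_BYTES = 2048   # one macro-block fetched per DRAM-cache miss
BLOCK_BYTES = 64           # granularity the processor actually touches

# Suppose the processor only ever touches 3 of the 32 blocks in this
# macro-block before it is evicted.
blocks_used = 3
bytes_needed = blocks_used * BLOCK_BYTES   # 192 bytes of useful data
bytes_fetched = MACRO_BLOCK_BYTES          # 2048 bytes actually moved

print(f"useful: {bytes_needed} B, fetched: {bytes_fetched} B, "
      f"wasted: {bytes_fetched - bytes_needed} B "
      f"({100 * (1 - bytes_needed / bytes_fetched):.0f}% of the transfer)")
```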

By having the cache learn over time which data from each macro-block the processor actually uses, the researchers from Samsung and NC State were able to improve the efficiency of data retrieval in two ways.

First, it speeds up data retrieval by allowing the cache to compress each macro-block so it contains only the relevant data. Second, the compressed macro-blocks free up space in the cache for other data the processor is more likely to need.
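A toy sketch of this footprint-learning idea is shown below in Python. The class name, interface, and block counts are illustrative assumptions, not the paper's hardware design.

```python
from collections import defaultdict

BLOCKS_PER_MACRO = 32  # assumed: 2 KB macro-block of 64 B blocks

class FootprintPredictor:
    """Toy sketch of footprint learning: remember which blocks of a
    macro-block were actually touched, and fetch only those on the
    next miss. Not the paper's exact hardware mechanism."""

    def __init__(self):
        # footprints[tag] is the set of block indices seen to be used
        self.footprints = defaultdict(set)

    def record_access(self, macro_tag, block_index):
        # Called on each demand access while the macro-block is cached.
        self.footprints[macro_tag].add(block_index)

    def blocks_to_fetch(self, macro_tag):
        # On a miss: fetch only the learned footprint, or the whole
        # macro-block the first time this tag is seen.
        learned = self.footprints.get(macro_tag)
        return sorted(learned) if learned else list(range(BLOCKS_PER_MACRO))


predictor = FootprintPredictor()
# First miss: no history, so the whole macro-block is fetched.
print(len(predictor.blocks_to_fetch(0x1A2B)))   # 32
# The processor ends up using only blocks 0, 5 and 17...
for b in (0, 5, 17):
    predictor.record_access(0x1A2B, b)
# ...so the next miss on this macro-block fetches just those three,
# leaving cache space free for other data.
print(predictor.blocks_to_fetch(0x1A2B))        # [0, 5, 17]
```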

The researchers tested this approach, called Dense Footprint Cache, in a processor and memory simulator.

After running 3 billion instructions from each tested application through the simulator, the researchers found that Dense Footprint Cache sped up applications by 9.5 percent compared to state-of-the-art competing methods for managing die-stacked DRAM. Dense Footprint Cache also used 4.3 percent less energy and significantly reduced the incidence of last-level cache (LLC) misses.

The team will present its paper on Dense Footprint Cache at the International Symposium on Memory Systems, being held in Washington, DC, from Oct. 3-6.
