Compressed Memory Hierarchy
Dongrui SHE, Jianhua HUI
Posted 24-Dec-2015

TRANSCRIPT

  • Slide 1
  • Compressed Memory Hierarchy Dongrui SHE Jianhua HUI
  • Slide 2
  • The research paper: "A Compressed Memory Hierarchy Using an Indirect Index Cache", by Erik G. Hallnor and Steven K. Reinhardt, Advanced Computer Architecture Laboratory, EECS Department, University of Michigan
  • Slide 3
  • Outline Introduction Memory eXpansion Technology Cache-compression IIC & IIC-C Evaluation Summary
  • Slide 4
  • Introduction: Memory capacity and memory bandwidth are both constraints. The amount of on-chip cache cannot be increased without bound, and memory bandwidth is a scarce resource.
  • Slide 5
  • Applications of data compression: first, adding a compressed main memory system (Memory eXpansion Technology, MXT); second, storing compressed data in the cache, so that data can also be transmitted in compressed form between main memory and the cache.
  • Slide 6
  • A key challenge: management of variable-sized data blocks. For example, a 128-byte block that compresses to 70 bytes leaves 58 bytes of its frame unused.
  • Slide 7
  • Outline Introduction Memory eXpansion Technology(MXT) Cache-compression IIC & IIC-C Evaluation Summary
  • Slide 8
  • Memory eXpansion Technology: a server-class system with hardware-compressed main memory, using the LZSS compression algorithm. For most applications it achieves roughly two-to-one (2:1) compression, and hardware compression of memory has a negligible performance penalty.
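To make the dictionary-based idea concrete, here is a toy LZSS-style compressor as a minimal sketch. It illustrates only the literal/back-reference scheme; MXT's hardware engine is a parallelized derivative whose details differ, and all names and parameters below are ours.

```python
def lzss_compress(data: bytes, window: int = 255, min_match: int = 3):
    """Return a token list: literal bytes or (offset, length) back-references."""
    out = []
    i = 0
    while i < len(data):
        best_len, best_off = 0, 0
        # Search the sliding window for the longest match ending before i.
        for j in range(max(0, i - window), i):
            k = 0
            while i + k < len(data) and data[j + k] == data[i + k]:
                k += 1
            if k > best_len:
                best_len, best_off = k, i - j
        if best_len >= min_match:
            out.append((best_off, best_len))  # back-reference token
            i += best_len
        else:
            out.append(data[i])               # literal token
            i += 1
    return out

def lzss_decompress(tokens) -> bytes:
    buf = bytearray()
    for t in tokens:
        if isinstance(t, tuple):              # back-reference: copy byte by byte
            off, length = t
            for _ in range(length):
                buf.append(buf[-off])
        else:                                 # literal byte
            buf.append(t)
    return bytes(buf)
```

Repetitive data such as `b"abcabcabcabc"` collapses to three literals and one back-reference, which is the effect that yields MXT's typical 2:1 ratio on real workloads.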
  • Slide 9
  • Hardware organization: a sector translation table in which each entry holds 4 physical addresses, each pointing to a 256B sector.
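The allocation behind that table can be sketched as follows. This is a minimal model of our own, not MXT's actual data structure: a 1KB real-address block compresses into one to four 256B physical sectors, and its table entry records which sectors it occupies.

```python
SECTOR_SIZE = 256
BLOCK_SIZE = 1024   # uncompressed real-address block covered by one entry

class SectorTable:
    """Toy sector translation table: one entry per 1KB real block,
    holding pointers to the 256B physical sectors that store it."""
    def __init__(self, n_sectors: int = 4096):
        self.entries = {}                        # block number -> sector list
        self.free_sectors = list(range(n_sectors))

    def store(self, block_no: int, compressed_len: int):
        """Allocate just enough 256B sectors for the compressed block."""
        nsectors = max(1, -(-compressed_len // SECTOR_SIZE))  # ceil division
        assert nsectors <= BLOCK_SIZE // SECTOR_SIZE
        # Return any sectors the old copy of this block occupied.
        self.free_sectors.extend(self.entries.pop(block_no, []))
        self.entries[block_no] = [self.free_sectors.pop() for _ in range(nsectors)]

    def sectors_used(self) -> int:
        return sum(len(s) for s in self.entries.values())
```

With 2:1 compression a 1KB block occupies only 2 sectors instead of 4, so the same physical memory backs roughly twice the real address space.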
  • Slide 10
  • Outline Introduction Memory eXpansion Technology(MXT) Cache-compression IIC & IIC-C Evaluation Summary
  • Slide 11
  • Cache compression: most prior designs target power savings and use conventional cache structures, so the storage freed by compression benefits only by not consuming power. To actually use the space freed by compression, a new cache structure is needed.
  • Slide 12
  • Outline Introduction Memory eXpansion Technology(MXT) Cache-compression IIC & IIC-C Evaluation Summary
  • Slide 13
  • Conventional cache structure: a tag is statically associated with one fixed-size data block, so when data is compressed the freed space in that block cannot be reused.
  • Slide 14
  • Solution: Indirect Index Cache (IIC). A tag entry is not associated with a particular data block; instead, it contains a pointer to a data block.
  • Slide 15
  • IIC structure The cache can be fully associative
  • Slide 16
  • Extending the IIC to compressed data (IIC-C): a tag entry contains multiple pointers to smaller data sub-blocks.
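The indirection on these two slides can be sketched as below. This is our own simplification, not the paper's hardware design: a fully associative tag store maps an address tag to an entry, and the entry holds pointers into a pool of small sub-blocks rather than owning a statically associated data frame, so a compressed line consumes only as many sub-blocks as it needs.

```python
SUBBLOCK = 32   # bytes per data sub-block (assumed size)
LINE = 128      # uncompressed cache line size

class IICC:
    """Toy IIC-C: tags point at variable numbers of pooled sub-blocks."""
    def __init__(self, n_subblocks: int = 64):
        self.pool = [None] * n_subblocks      # sub-block storage
        self.free = list(range(n_subblocks))  # free-list of sub-blocks
        self.tags = {}                        # tag -> list of pointers

    def fill(self, tag: int, compressed: bytes):
        """Store a (possibly compressed) line in just enough sub-blocks."""
        n = -(-len(compressed) // SUBBLOCK)   # ceil division
        ptrs = [self.free.pop() for _ in range(n)]
        for i, p in enumerate(ptrs):
            self.pool[p] = compressed[i * SUBBLOCK:(i + 1) * SUBBLOCK]
        self.tags[tag] = ptrs

    def read(self, tag: int) -> bytes:
        """Gather the sub-blocks named by the tag's pointers (decompression elided)."""
        return b"".join(self.pool[p] for p in self.tags[tag])
```

A 128B line that compresses to 70 bytes takes 3 sub-blocks instead of 4, recovering the space that slide 6's example wastes in a conventional cache.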
  • Slide 17
  • Software-managed generational replacement: blocks are grouped into prioritized pools based on reference frequency, and the victim is chosen from the lowest-priority non-empty pool.
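The victim-selection policy can be sketched like this. It is a simplified model with our own pool count and promotion rule, not the paper's exact generational algorithm: referenced blocks periodically move up one pool, and eviction always draws from the lowest-priority non-empty pool.

```python
from collections import deque

class GenerationalPools:
    """Toy generational replacement: pools[0] is lowest priority."""
    def __init__(self, n_pools: int = 4):
        self.pools = [deque() for _ in range(n_pools)]
        self.hits = {}                  # block -> references this generation

    def insert(self, block):
        self.pools[0].append(block)     # new blocks enter at the bottom
        self.hits[block] = 0

    def touch(self, block):
        self.hits[block] += 1           # record a reference

    def promote(self):
        """End of generation: move referenced blocks up one pool."""
        for level in range(len(self.pools) - 2, -1, -1):
            survivors = deque()
            while self.pools[level]:
                b = self.pools[level].popleft()
                if self.hits[b] > 0:
                    self.hits[b] = 0
                    self.pools[level + 1].append(b)
                else:
                    survivors.append(b)
            self.pools[level] = survivors

    def victim(self):
        """Evict from the lowest-priority non-empty pool."""
        for pool in self.pools:
            if pool:
                b = pool.popleft()
                del self.hits[b]
                return b
        return None
```

Running the bookkeeping in software is what keeps the hardware cost modest while still allowing a frequency-aware policy.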
  • Slide 18
  • Additional cost: a compression/decompression engine, more space for the tag entries, and extra resources for the replacement algorithm; the area is roughly 13% larger.
  • Slide 19
  • Outline Introduction Memory eXpansion Technology(MXT) Cache-compression IIC & IIC-C Evaluation Summary
  • Slide 20
  • Evaluation method: SPEC CPU2000 benchmarks. Main memory: 150-cycle latency, bus width 32, with MXT. L1: 1-cycle latency, split 16KB, 4-way, 64B block size. L2: 12-cycle latency, unified 256KB, 8-way, 128B block size. L3: 26-cycle latency, unified 1MB, 8-way, 128B block size, with IIC-C.
  • Slide 21
  • Evaluation: over 50% gain with only 10% area overhead.
  • Slide 22
  • Evaluation
  • Slide 23
  • Summary. Advantages: increased effective capacity and bandwidth; power saving from fewer memory accesses. Drawbacks: increased hardware complexity; power consumption of the additional hardware.
  • Slide 24
  • Future work: an overall power consumption study; use in embedded systems.
  • Slide 25
  • END. Thank you! Question time.