Memory Locality
Memory locality is a key principle that modern computer systems exploit to improve performance, especially in caching. It refers to the tendency of a processor to access the same set of memory locations repeatedly over a short period of time. There are two types of locality:
- Temporal Locality
- This refers to the reuse of specific data or resources within a relatively small time window. In simpler terms, if a memory location is referenced, it will tend to be referenced again soon. This principle justifies caching: if a data item has been used recently, keep it in the cache, because there is a good chance it will be used again soon.
- Spatial Locality
- This refers to the use of data elements stored at nearby addresses. In simpler terms, if a memory location is referenced, memory locations with nearby addresses will tend to be referenced soon. This is why a cache does not load just a single data item from memory; it loads a contiguous block of memory (a cache line) around that item.
The concept of memory locality is used to predict and optimize memory behavior. Since accessing data from the cache is faster than accessing it from main memory, utilizing temporal and spatial locality can lead to significant performance improvements. By anticipating the data that is likely to be used in the near future and storing it in the cache, the system can minimize slower memory operations and increase overall speed.
temporal locality: reuse data that was used before
spatial locality: use data stored nearby