What Is Cache? How It Works and Its Functions
You've probably heard the word cache before without knowing exactly what it means. The following is an explanation of what a cache is, how it works, what it does, and why we need it.
What is Cache?
A cache (pronounced "cash") is hardware or software that temporarily stores data in a computing environment so that recently or frequently accessed data can be served faster. The cached data is kept on quickly accessible storage media that is local to the cache client and separate from mass storage. Caches are used by many cache clients, such as CPUs, applications, web browsers, and operating systems (OS).
A cache is used because mass, or primary, storage cannot keep up with the demands of cache clients. Caching shortens data access time, reduces latency, and improves input/output (I/O). Because almost all application workloads depend on I/O operations, caching improves application performance.
How Caches Work
When a cache client needs to access data, it first checks the cache. If the requested data is found in the cache, the result is called a cache hit. If the requested data is not found in the cache, a situation known as a cache miss, the data is pulled from main memory and copied into the cache. Which data is evicted to make room depends on the caching algorithm or policy the system uses.
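The hit/miss flow described above can be sketched in a few lines of Python. This is a minimal illustration, not a real cache implementation; the `main_memory` dictionary is a stand-in for slow backing storage.

```python
main_memory = {"page.html": "<html>...</html>"}  # simulated slow backing store
cache = {}
stats = {"hits": 0, "misses": 0}

def read(key):
    if key in cache:          # cache hit: serve from the fast local copy
        stats["hits"] += 1
        return cache[key]
    stats["misses"] += 1      # cache miss: go to main memory
    value = main_memory[key]
    cache[key] = value        # copy into the cache for next time
    return value

read("page.html")   # miss: fetched from main memory and cached
read("page.html")   # hit: served straight from the cache
print(stats)        # {'hits': 1, 'misses': 1}
```

The first access pays the full cost of reaching main memory; every later access to the same key is served from the cache.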
Web browsers such as Firefox, Safari, and Chrome use browser caching to improve the performance of frequently accessed web pages. When you visit a web page, the files it requests are stored on your computer in the browser cache. If you click back to a previous page, your browser can fetch most of the files it needs from that page's cache. This approach is called a read cache. Browsers can read data from the browser cache faster than they can re-download the files from the web server.
Caching Algorithms
* Least Frequently Used (LFU) tracks how often each entry is accessed. The item with the lowest access count is deleted first.
* Least Recently Used (LRU) places recently accessed items near the top of the cache. When the cache reaches its limit, the least recently accessed item is deleted.
* Most Recently Used (MRU) deletes the most recently accessed items first. This approach works best when older items are more likely to be used again.
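As an example, the LRU policy above can be sketched with Python's `OrderedDict`, which remembers insertion order and lets us move an entry to the "most recent" end on each access. This is an illustrative sketch; real caches use more sophisticated implementations.

```python
from collections import OrderedDict

class LRUCache:
    """Evicts the least recently used entry once capacity is exceeded."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()

    def get(self, key):
        if key not in self.entries:
            return None                      # cache miss
        self.entries.move_to_end(key)        # mark as most recently used
        return self.entries[key]

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # drop the least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")        # "a" is now the most recently used entry
cache.put("c", 3)     # evicts "b", the least recently used
print(list(cache.entries))  # ['a', 'c']
```

An LFU variant would instead track an access count per entry and evict the entry with the lowest count.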
Why Use Cache
* Caching reduces latency for active data, resulting in higher performance for a system or application.
* It also shifts I/O to the cache, reducing I/O operations to external storage and lowering SAN traffic.
* Data can remain permanently on traditional storage or external storage arrays, which maintain data consistency and integrity using array features such as snapshots or replication.
* Flash is used only for the subset of workloads that benefit from lower latency, making expensive storage more cost-effective.
Cache Write Policies
1. Write-around cache writes operations directly to storage, skipping the cache altogether. This prevents the cache from being flooded by a large volume of write I/O. The disadvantage of this approach is that data is not cached until it is read from storage, so initial read operations are relatively slow.
2. Write-through cache writes data to both the cache and storage. The advantage is that newly written data is always cached, so it can be read immediately. The drawback is that a write operation is not considered complete until the data is written to both the cache and main storage, which can introduce latency into write operations.
3. Write-back cache is similar to write-through cache in that all write operations are directed to the cache. However, with a write-back cache, the write operation is considered complete as soon as the data is cached; the data is copied from the cache to storage later.
With this approach, both read and write operations have low latency. The downside is that, depending on the caching mechanism used, the data remains susceptible to loss until it is committed to storage.
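The three write policies above can be contrasted with a short sketch. The `cache` and `storage` dictionaries stand in for fast cache media and slow backing storage, and the `dirty` set tracks write-back entries that have not yet been persisted; all names here are illustrative.

```python
cache, storage, dirty = {}, {}, set()

def write_around(key, value):
    storage[key] = value          # bypass the cache entirely

def write_through(key, value):
    cache[key] = value            # the write completes only after BOTH
    storage[key] = value          # the cache and storage are updated

def write_back(key, value):
    cache[key] = value            # complete as soon as the cache is written
    dirty.add(key)                # remember to persist this entry later

def flush():
    for key in list(dirty):       # copy dirty entries back to storage
        storage[key] = cache[key]
        dirty.discard(key)

write_back("x", 42)
assert "x" not in storage    # not yet persisted: vulnerable to loss
flush()
assert storage["x"] == 42    # now safely committed to storage
```

The window between `write_back` and `flush` is exactly the risk described above: a crash in that interval loses the cached write.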
Cache Usage Example
1. Cache servers
A dedicated network server, or a service running on a server, that stores web pages or other internet content locally. Cache servers are sometimes called proxy caches.
2. Disk cache
Holds recently read data and possibly adjacent areas of data that are likely to be accessed soon. Some disk caches cache data based on how often it is read. Frequently read storage blocks, known as hot blocks, are automatically sent to the cache.
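The hot-block behavior can be illustrated with a frequency counter: once a block has been read more than some threshold number of times, it is promoted to the cache. The threshold value and block names here are assumptions for the sketch, not details of any real disk cache.

```python
from collections import Counter

HOT_THRESHOLD = 3        # assumed promotion threshold for this sketch
read_counts = Counter()
cache = set()

def read_block(block_id):
    """Count the read; promote the block once it becomes 'hot'."""
    read_counts[block_id] += 1
    if read_counts[block_id] >= HOT_THRESHOLD:
        cache.add(block_id)       # frequently read block is now cached
    return block_id in cache      # True if served from the cache

for _ in range(3):
    served_from_cache = read_block("block-7")
print(served_from_cache)  # True: block-7 became hot on its third read
```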
3. Cache memory
Memory that a microprocessor can access faster than regular RAM. Cache memory is often tied directly to the CPU and is used to cache frequently accessed instructions. A RAM cache is much faster than a disk-based cache, and cache memory is faster still because it sits so close to the CPU.
4. Flash Cache
Temporary storage of data on NAND flash memory chips, often in solid-state drives (SSDs), to satisfy data requests more quickly than would be possible if the cache were on a traditional hard disk drive (HDD) or part of the backing store.
So, what is cache? It is a data storage technique that provides faster access to data or files. Caches are implemented in both hardware and software, serving as an intermediate layer between the primary storage device and the hardware or software that consumes the data, to reduce data-access latency.
That concludes What Is Cache? How It Works and Its Functions. Look forward to other interesting articles, and don't forget to share this one with your friends. Thank you.