Caching is the art of remembering things so you don't have to go look them up again. It trades memory (RAM) for speed (latency).
Databases store data on disk (hard drives or SSDs). Reading from disk is slow, and the network round trip to the database adds more latency on top. A typical DB query takes 50-200 ms.
We put a cache (like Redis or Memcached) in front of the DB. Caches store data in RAM. Reading from RAM takes under 1 ms.
Experiment: request "User Profile ID: 101" twice.
1st time: the cache is empty (a miss), so we query the DB and store the result. 2nd time: the cache has it (a hit), and we skip the DB entirely.
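The experiment above can be sketched in a few lines of Python. This is the "cache-aside" pattern under toy assumptions: a plain dict stands in for Redis, and `slow_db_query` fakes the database round trip with a sleep (the names and data here are made up for illustration).

```python
import time

cache = {}  # stands in for Redis/Memcached (data in RAM)
database = {101: {"name": "Ada", "email": "ada@example.com"}}  # stands in for the DB

def slow_db_query(user_id):
    """Simulate a ~100 ms database round trip."""
    time.sleep(0.1)
    return database[user_id]

def get_user_profile(user_id):
    """Cache-aside read: check the cache first, fall back to the DB on a miss."""
    if user_id in cache:
        return cache[user_id], "hit"      # served from RAM, sub-millisecond
    profile = slow_db_query(user_id)      # slow path: disk + network
    cache[user_id] = profile              # remember it for next time
    return profile, "miss"

# 1st request pays the DB latency; the 2nd is served from the cache.
_, status1 = get_user_profile(101)
_, status2 = get_user_profile(101)
print(status1, status2)  # miss hit
```

Note the miss itself does the work of filling the cache: every miss makes the next identical request a hit.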
Your cache has limited space (RAM is expensive). What do you delete when it gets full?
LRU (Least Recently Used): delete the item that hasn't been accessed for the longest time.
"Nobody asked for 'Page 5' in a week? Throw it out."
TTL (Time To Live): give every item an expiration date (e.g., 5 minutes). When the time is up, it self-destructs. This ensures data doesn't get too stale.
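A sketch of TTL expiry, assuming lazy eviction (the entry "self-destructs" when you next ask for it, which is roughly how Memcached behaves; the 50 ms TTL is only to keep the demo fast):

```python
import time

class TTLCache:
    """Each entry carries an expiry timestamp; expired entries read as misses."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.items = {}  # key -> (value, expires_at)

    def put(self, key, value):
        self.items[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self.items.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self.items[key]  # lazily self-destruct on access
            return None
        return value

cache = TTLCache(ttl_seconds=0.05)  # 50 ms here; in practice you'd use minutes
cache.put("user:101", "Ada")
print(cache.get("user:101"))  # Ada (still fresh)
time.sleep(0.06)
print(cache.get("user:101"))  # None (expired, treated as a miss)
```

TTL composes well with the other policies: even an item that would survive LRU or LFU eviction gets refreshed from the DB once its clock runs out.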
LFU (Least Frequently Used): delete the item that is accessed the least often overall, even if it was used recently.
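A sketch of LFU under the same toy assumptions: a plain dict of access counts, and a linear scan to find the eviction victim (real implementations use smarter structures, but the policy is the same):

```python
class LFUCache:
    """Evicts the key with the lowest access count, regardless of recency."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = {}   # key -> value
        self.counts = {}  # key -> number of accesses

    def get(self, key):
        if key not in self.items:
            return None
        self.counts[key] += 1
        return self.items[key]

    def put(self, key, value):
        if key not in self.items and len(self.items) >= self.capacity:
            victim = min(self.counts, key=self.counts.get)  # least-used key
            del self.items[victim]
            del self.counts[victim]
        self.items[key] = value
        self.counts.setdefault(key, 0)

cache = LFUCache(capacity=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a"); cache.get("a")  # "a" is popular (2 accesses)
cache.get("b")                  # "b" used once, but more recently
cache.put("c", 3)               # full: "b" has the lowest count, so it goes
print(cache.get("b"))  # None
```

Compare the last line with LRU: an LRU cache would have kept "b" (it was touched most recently) and evicted "a" instead. LFU bets on long-term popularity; LRU bets on recency.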