
Introduction to Redis and Caching


Redis stands for Remote Dictionary Server.

According to the official Redis website, “Redis is an open-source, in-memory data structure store, used as a database, cache and message broker. It supports data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, geospatial indexes with radius queries and streams.”

Redis stores data in a key-value system, which makes retrieval effortless: looking up a value by its key avoids the complex query operations that can slow down relational databases.
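A minimal sketch of key-value retrieval, using a Python dict as a stand-in for Redis (the real Redis commands are noted in the comments; the key name is illustrative):

```python
# A Python dict standing in for the Redis key-value store.
store = {}

store["user:42:name"] = "Mr. User"   # analogous to: SET user:42:name "Mr. User"
value = store.get("user:42:name")    # analogous to: GET user:42:name
print(value)                         # -> Mr. User
```

With Redis itself, the lookup is a single `GET` on the key, with no joins or query planning involved.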

Redis was originally developed by Salvatore Sanfilippo.

Benefits of using Redis

  1. It is blazingly fast, in part because it is written in C.
  2. Redis holds its entire dataset in memory, using the disk only for persistence.
  3. It natively supports the data types most developers already know, such as strings, hashes, lists, sets, and sorted sets.
  4. It is a multi-utility tool that fits many use cases: caching, chat, message queues, session storage, real-time analytics, and more.
  5. It offers data replication, the process of setting up master and replica nodes; Redis can replicate data to any number of replicas.
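The native data types from point 3 map directly onto structures most developers already use. A sketch with Python equivalents (the Redis commands in the comments are the real counterparts; the key names are illustrative):

```python
# Each Python structure below mirrors a native Redis data type.
string_value = "hello"                           # SET greeting "hello"
hash_value = {"name": "Mr. User", "age": "30"}   # HSET user:1 name "Mr. User" age 30
list_value = ["a", "b", "c"]                     # RPUSH mylist a b c
set_value = {"x", "y"}                           # SADD myset x y
sorted_set = [("low", 1.0), ("high", 2.0)]       # ZADD scores 1 low 2 high
```

Because the shapes match, moving data between application code and Redis rarely needs any translation layer.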

The most popular use case of Redis is Caching.

What is Caching?

A cache is temporary storage where data is kept so that future requests for that data can be served faster. Caching is the process of storing data in a cache.

How Does Caching Work?

The images below are just for illustration purposes.

Without Caching
Illustration 1

With Caching
Illustration 2

In the first illustration, the server queries the database every time Mr. User requests his profile information on application Z.

Suppose Mr. User requests this data around 20 times during his browsing session and each request takes 5 seconds to complete. Since the response time of every request remains constant, the total is 5 sec × 20 requests = 100 sec (1 min 40 sec).

In the second illustration, the server looks into the cache whenever Mr. User requests profile information, and queries the database only if the data is not available in the cache (here, Redis).
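This check-the-cache-first flow can be sketched in a few lines, with a Python dict standing in for Redis and a stub standing in for the database (the names `get_profile` and `fetch_profile_from_db` are illustrative, not part of any real API):

```python
cache = {}      # stand-in for Redis
db_calls = []   # records each database query, to show how rarely the DB is hit

def fetch_profile_from_db(user_id):
    db_calls.append(user_id)      # stand-in for a slow database query
    return {"id": user_id, "name": "Mr. User"}

def get_profile(user_id):
    key = f"profile:{user_id}"
    if key in cache:              # cache hit: no database round trip
        return cache[key]
    profile = fetch_profile_from_db(user_id)  # cache miss: query the DB
    cache[key] = profile          # populate the cache for next time
    return profile

for _ in range(20):               # Mr. User's 20 requests
    get_profile(42)
print(len(db_calls))              # the database was queried only once
```

Only the first request reaches the database; the other nineteen are served from the cache.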

For the second illustration, assume that after the first call each request-response cycle takes 2.5 seconds, and Mr. User again requests this data 20 times. So,

5 sec × 1 request     = 5 sec    (initial request)

2.5 sec × 19 requests = 47.5 sec (subsequent requests)

Total                 = 52.5 sec

The total time with caching is almost half of the time without it.
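The arithmetic above, worked through:

```python
# Without caching: every request hits the database.
without_cache = 5 * 20                  # 100 seconds
# With caching: one miss, then nineteen faster cache hits.
with_cache = 5 * 1 + 2.5 * 19           # 52.5 seconds
savings = 1 - with_cache / without_cache
print(without_cache, with_cache)        # 100 52.5
print(f"{savings:.1%}")                 # 47.5%
```

So the cached flow saves 47.5% of the total time in this scenario.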

The most noticeable advantage of a Redis cache is faster access to data that has been requested before. But it is not only about speed.

Leveraging a Redis cache is also a sensible, cost-saving approach, because querying the database is an expensive operation.


Cloud Evangelist
Cloud Evangelists are CMI's in-house ambassadors for the entire Cloud ecosystem. They are responsible for propagating the doctrine of cloud computing and helping community members make informed decisions.

