
Redis - A simple in-memory caching system for blog posts


Imagine having a blog like this one. Hitting the database on every visit when serving a post seems inefficient, right? We will implement a simple caching system with Redis, which will significantly reduce the number of database queries. Redis reads and writes are very fast because it is a storage server that keeps your data in memory.

As this blog runs Laravel on the backend, we will cache Eloquent models in RAM. But we also have to think about recognizing edited posts and fetching those from the database, since we won't hit the database on every visit anymore. Fortunately, Redis can set keys with an expiration time. This lets us cache posts and, once they expire after a given time, fetch them fresh and cache them again. Nice, let's start.
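The expiration mechanism is Redis' SETEX command. As a quick illustration in the redis-cli (assuming a local server on the default port; `post:1` is a made-up key):

```shell
# Store a value under post:1 that expires after 86400 seconds (24 hours)
redis-cli SETEX post:1 86400 "serialized post data"   # replies OK

# TTL shows the remaining lifetime of the key in seconds
redis-cli TTL post:1                                  # e.g. 86399

# Once the TTL runs out, GET returns nil and the post
# would be fetched fresh from the database and cached again
redis-cli GET post:1
```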


We will need Redis installed on our server; this is pretty simple, and tutorials for it are easy to find on the web.

On Linux, the redis-cli can simply be installed on the local machine, too. If you're a Windows user (like me), you can install it in a Linux subsystem on Windows 10 (WSL).
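For reference, on an Ubuntu-based system (including an Ubuntu WSL distribution) this can look like the following; package names vary between distributions, so treat it as a sketch:

```shell
sudo apt-get update

# redis-tools ships only the client tools (redis-cli);
# install "redis-server" instead if you also want the server locally
sudo apt-get install redis-tools

# verify the installation
redis-cli --version
```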

Instructions - Install a Redis extension for Laravel

To use Redis with Laravel, there is a useful package called predis. Let's require it with Composer:

composer require predis/predis

Now we have to set the environment variables for the connection to our Redis server. Just enter your connection data in your .env file.
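For reference, these are the Redis-related variables in a default Laravel .env file; the values below are placeholders for a local server, so adjust them to your setup:

```ini
REDIS_HOST=127.0.0.1
REDIS_PASSWORD=null
REDIS_PORT=6379

# Depending on your Laravel version, you may also have to
# select predis as the client explicitly:
# REDIS_CLIENT=predis
```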


Instructions - Configure the controller to cache data in Redis

Our backend is now ready to use Redis, and we can start implementing the caching logic. First, we need to import the Redis facade in our PostsController by adding:

use Illuminate\Support\Facades\Redis;

Then we extend our showPost() method with a check for a key containing the post's ID. If the key exists, we return the data from Redis; otherwise we query the database. Before sending the freshly fetched data to the view, we create a Redis key with the post's ID and cache the serialized model.

public function showPost($id)
{
  // Return the cached post if it already exists in Redis
  if ($post = Redis::get('post:' . $id)) {
    return view('blog-post')->withPost(unserialize($post));
  }

  $post = Post::findOrFail($id);

  // Store the serialized model in Redis for the next 24 hours
  Redis::setex('post:' . $id, 60 * 60 * 24, serialize($post));

  return view('blog-post')->withPost($post);
}

Every time a user opens a post on our blog now, the controller first looks for the post model in memory and only sends a request to the database if none is found.
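You can verify this from the redis-cli; assuming a post with ID 1 has been visited at least once (the key names follow the `post:<id>` scheme from the controller above):

```shell
redis-cli EXISTS post:1   # 1 if the post is cached, 0 otherwise
redis-cli TTL post:1      # seconds left until the model is fetched fresh again
redis-cli GET post:1      # the serialized Eloquent model
```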

This simple optimization decreased the response time by roughly 60-500 ms per post on one of my blogs and put a smile on my face.

Further improvements

After implementing this caching mechanism, I noticed that a simple visit counter I had implemented (by incrementing a visits column on the posts table) stopped working. We will dig into this problem in the next article.

It would also be nice to keep control over which posts get cached. I already have an idea for this, which I will try next.

Do you have any questions about my article? Is there anything I missed? What did you like about the article? Please tell me and leave a comment below.
