What Is Varnish Cache and How Can It Benefit Your Server?
Learn how Varnish Cache can significantly speed up your website and the proper ways to set it up. With Varnish, you can reduce server load, decrease page load times, and improve user experience.
Varnish Cache is an open source HTTP accelerator that is designed to improve the performance of busy, dynamic websites. By caching frequently accessed content in memory, Varnish can serve requests much faster than a traditional web server.
If you're running a website that experiences a lot of traffic, Varnish can help reduce server load and improve response times for your users. Let's take a closer look at how it works and what benefits it can provide.
How Does Varnish Cache Work?
Varnish Cache sits between the client and the web server, acting as a reverse proxy. When a user requests content from your website, Varnish intercepts the request and checks if it has a cached copy of the content. If it does, Varnish serves the cached content directly to the user. If not, Varnish forwards the request to the web server to generate a new response, which it then caches for future requests.
Because Varnish can be deployed at the edge of the network, close to your users, it can also reduce latency. Varnish supports advanced features such as gzip compression and cache invalidation, which let you control how long content is cached and when it should be refreshed.
Benefits of Using Varnish Cache
Varnish Cache helps improve website performance in several ways:
- Less strain on your server: Because Varnish Cache can serve cached content directly to the user, it can significantly reduce the load on your web server. This can help you avoid server crashes and ensure that your site stays online even during periods of high traffic.
- Faster page loads: Varnish Cache can serve up cached content quickly, so your visitors don't have to wait as long for pages to load. This creates a better user experience and keeps people on your site longer.
- Handles traffic spikes: If your website suddenly gets a lot of traffic, Varnish Cache can help by serving up cached content. This takes some of the pressure off your server, so it can handle the extra traffic more easily.
- Caches dynamic content: Varnish Cache isn't just for static content. It can also cache dynamic content, such as pages that are generated on the fly. This can improve page load times and reduce the load on your server.
- Flexible caching options: Varnish Cache lets you customize your caching rules based on things like URL patterns, HTTP headers, and user agents. This gives you more control over how your website is cached.
- Improved security: Varnish Cache acts as a reverse proxy, which can help protect your server from direct attacks. This is an extra layer of security that can give you peace of mind.
- Real-time metrics and logging: Varnish Cache comes with tools to help you monitor cache performance, including real-time metrics and logging. This allows you to troubleshoot issues and optimize your caching strategy as needed.
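The flexible caching rules mentioned above can be sketched in a few lines of VCL. Below is a minimal VCL 4.0 sketch; the /admin prefix is an assumed example of a path you would not want cached:

```vcl
# Minimal sketch of URL- and header-based caching rules (VCL 4.0).
# The "/admin" prefix is an assumed example of content to keep out of the cache.
sub vcl_recv {
    if (req.url ~ "^/admin" || req.http.Cookie) {
        return (pass);   # always fetch these from the backend, never cache
    }
    return (hash);       # look everything else up in the cache
}
```

Rules like these are evaluated for every incoming request, which is what makes per-URL and per-header caching policies possible.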
Installing and Configuring Varnish Cache on Ubuntu
The first step to installing Varnish Cache on Ubuntu is to add the Varnish Cache repository to your system:
$ curl -L https://packagecloud.io/varnishcache/varnish70/gpgkey | sudo apt-key add -
$ echo "deb https://packagecloud.io/varnishcache/varnish70/ubuntu/ $(lsb_release -sc) main" | sudo tee /etc/apt/sources.list.d/varnishcache.list
Next, update your package lists:
$ sudo apt update
Install Varnish:
$ sudo apt install varnish
Once you have installed Varnish Cache, you need to configure it to work with your web server.
Configuring Varnish Cache
Varnish Cache configuration is done using the Varnish Configuration Language (VCL), which defines how Varnish handles incoming requests and what to do with them.
The default VCL file is located at /etc/varnish/default.vcl. You can modify this file to suit your needs, or create a new VCL file and specify it when starting Varnish.
This configuration file helps Varnish understand how to handle incoming requests and manage its cache. The file has a few different sections, each with a specific purpose.
- The first section, vcl_recv, determines how Varnish should deal with incoming requests. It usually includes rules that pass requests Varnish should not cache, such as administrative pages or requests with cookies.
- The second section, vcl_backend_fetch, decides how Varnish retrieves content from the backend server. It can be used to modify request headers or to configure how Varnish communicates with the backend.
- The third section, vcl_backend_response, controls how Varnish handles responses from the backend server. It can be used to modify response headers, to cache certain types of content, or to set TTL (Time To Live) values that determine how long content should be cached.
- The fourth section, vcl_deliver, controls how Varnish sends the response to the client. It can be used to modify response headers, add information to the response, or log details about the request.
- The last section, vcl_hash, determines how Varnish creates a unique identifier (the hash key) for each request. This ensures that identical requests receive the same cached response.
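As a sketch of how vcl_hash can be customized (by default, Varnish hashes the URL and the Host header), the VCL 4.0 fragment below adds the request protocol to the hash key so that HTTP and HTTPS variants are cached separately. The X-Forwarded-Proto header here is an assumption: it would be set by a TLS-terminating proxy running in front of Varnish.

```vcl
# Hedged sketch: extend the default hash key (URL + Host) with the
# X-Forwarded-Proto header, so HTTP and HTTPS responses are cached separately.
# Assumes a TLS-terminating proxy in front of Varnish sets this header.
sub vcl_hash {
    if (req.http.X-Forwarded-Proto) {
        hash_data(req.http.X-Forwarded-Proto);
    }
    # falling through to the built-in vcl_hash still adds req.url and Host
}
```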
Here's an example of a basic VCL configuration file that caches all GET responses for 1 hour:
vcl 4.0;

backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_backend_response {
    if (bereq.method == "GET") {
        set beresp.ttl = 1h;
    }
}
Note that the TTL is a property of the cached object, so it is set on beresp in vcl_backend_response, not on the request in vcl_recv.
In this example, we've defined the backend server at 127.0.0.1:8080 and set a TTL of 1 hour for GET requests. The backend server could be a reverse proxy such as NGINX, or simply an HTTP server like Apache or Gunicorn.
Once you've configured Varnish Cache, you can start it using the following command:
$ sudo systemctl start varnish
You can also enable Varnish to start at boot time:
$ sudo systemctl enable varnish
To check that Varnish is running, view its status:
$ sudo systemctl status varnish
This will display the current status of Varnish Cache, including its uptime and whether it is currently running.
● varnish.service - Varnish HTTP accelerator
Loaded: loaded (/lib/systemd/system/varnish.service; enabled; vendor preset: enabled)
Active: active (running) since Fri 2023-03-24 20:00:00 UTC; 5min ago
Main PID: 1234 (varnishd)
Tasks: 2 (limit: 2345)
Memory: 3.6M
CPU: 100ms
CGroup: /system.slice/varnish.service
├─1234 /usr/sbin/varnishd -j unix,user=vcache -F -a :80 -T localhost:6082 -f /etc/varnish/default.vcl -S /etc/varnish/secret -s malloc,64m
└─1235 /usr/sbin/varnishd -j unix,user=vcache -F -a :80 -T localhost:6082 -f /etc/varnish/default.vcl -S /etc/varnish/secret -s malloc,64m
Mar 24 20:00:00 myserver varnishd[1234]: Child (1235) Started
Mar 24 20:00:00 myserver varnishd[1234]: Child (1235) said Child starts
Mar 24 20:00:00 myserver systemd[1]: Started Varnish HTTP accelerator.
To further optimize Varnish Cache for your specific website, you can use more advanced configuration options. For example, you can configure Varnish to cache only certain types of content, such as images or CSS files, or to exclude certain content, such as dynamic pages that should always be generated by the backend server.
Here's an example of how to configure Varnish to cache images:
sub vcl_recv {
    if (req.url ~ "\.(png|jpg|jpeg|gif|webp)$") {
        set req.backend_hint = your_image_server;
    }
}

sub vcl_backend_response {
    if (bereq.url ~ "\.(png|jpg|jpeg|gif|webp)$") {
        set beresp.ttl = 7d;
    }
}
This routes requests for image files to the "your_image_server" backend (which must be defined elsewhere in the VCL) and caches the resulting responses for 7 days. Note that the dot in the file extension is escaped in the regex, and that the TTL is set on the backend response (beresp.ttl), since the TTL is a property of the cached object.
You can also configure Varnish to use different caching strategies based on the HTTP headers sent by the client. For example, you may want to cache content for authenticated users for a shorter period of time, while caching content for anonymous users for a longer period.
sub vcl_backend_response {
    if (bereq.http.Authorization) {
        set beresp.ttl = 10m;
    } else {
        set beresp.ttl = 1h;
    }
}
This sets the TTL to 10 minutes for responses to requests with an Authorization header (authenticated users) and to 1 hour for requests without one (anonymous users). Keep in mind that Varnish's built-in VCL passes (does not cache) requests carrying an Authorization header, so for these TTLs to matter you also need to override that behavior in vcl_recv, for example with return (hash).
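Varnish can also invalidate cached objects on demand, which is useful when content changes before its TTL expires. A common pattern, sketched below for VCL 4.0, is to accept HTTP PURGE requests from trusted hosts only; the ACL name and address list are illustrative.

```vcl
# Hedged sketch of cache invalidation via HTTP PURGE (VCL 4.0).
# The ACL name and the address list are illustrative.
acl purgers {
    "127.0.0.1";
}

sub vcl_recv {
    if (req.method == "PURGE") {
        if (!client.ip ~ purgers) {
            return (synth(403, "Forbidden"));
        }
        return (purge);   # remove the cached object for this URL
    }
}
```

A purge can then be triggered from an allowed host with a request such as curl -X PURGE http://your-website.com/some/page.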
Monitoring Varnish Cache
Once you've set up Varnish Cache, it's essential to monitor its performance to ensure that it's working correctly. Varnish Cache comes with a built-in tool called varnishstat, which provides real-time information on cache hits, misses, and backend requests.
To monitor Varnish Cache, run the following command:
$ varnishstat
This will display a summary of the cache performance metrics, including hit rate, cache size, and memory usage.
MAIN.cache_hit 2346452 9.52 Cache hits
MAIN.cache_hitpass 42 0.00 Cache hits for pass
MAIN.cache_miss 245951 1.00 Cache misses
MAIN.backend_conn 2170551 8.80 Backend conn. success
MAIN.backend_fail 0 0.00 Backend conn. failures
MAIN.backend_reuse 1421716 5.77 Backend conn. reuses
MAIN.backend_toolate 0 0.00 Backend conn. not attempted
MAIN.backend_recycle 1421720 5.77 Backend conn. recycles
MAIN.backend_unused 1633331 6.63 Backend conn. unused
MAIN.fetch_head 0 0.00 Fetch head
MAIN.fetch_length 206452 0.84 Fetch with Length
MAIN.fetch_chunked 0 0.00 Fetch chunked
MAIN.pools 20 Number of thread pools
MAIN.threads 80 Total number of threads
MAIN.threads_created 70 0.00 Threads created
MAIN.threads_destroyed 10 0.00 Threads destroyed
MAIN.busy_sleep 42 0.00 Number of requests sent to sleep on busy objhdr
This output displays various statistics related to Varnish 7, including the number of cache hits (MAIN.cache_hit), the number of backend connection successes (MAIN.backend_conn), and the total number of threads (MAIN.threads). The second column is the counter's current value, the third column is its per-second rate, and the rest of the line briefly describes what each statistic measures.
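One number worth deriving from these counters is the cache hit rate, hits / (hits + misses). The sketch below computes it with awk from the sample counters above, fed in via a heredoc; in practice you would pipe the output of varnishstat -1 into the same script.

```shell
# Compute the cache hit rate from varnishstat-style counter lines.
# The heredoc feeds in the sample values shown above; in practice:
#   varnishstat -1 | awk '...'
awk '
  $1 == "MAIN.cache_hit"  { hits   = $2 }
  $1 == "MAIN.cache_miss" { misses = $2 }
  END { printf "hit rate: %.1f%%\n", 100 * hits / (hits + misses) }
' <<'EOF'
MAIN.cache_hit 2346452 9.52 Cache hits
MAIN.cache_miss 245951 1.00 Cache misses
EOF
# prints: hit rate: 90.5%
```

A hit rate well above 90% like this one usually indicates the cache is doing most of the work; a low hit rate suggests revisiting your VCL rules or TTLs.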
Benchmarking Varnish Cache
To measure the performance benefits of Varnish Cache, you can run benchmarks comparing the load times of your website with and without Varnish Cache enabled. One way to do this is to use the Apache Bench tool (ab), which is a command-line tool for benchmarking HTTP servers.
To benchmark your website without Varnish Cache (for example, with the varnish service stopped and your web server answering on port 80 directly), run the following command:
$ ab -n 100 -c 10 http://your-website.com/
This command sends 100 requests with a concurrency level of 10 to your website and measures the average response time.
You will see an output similar to this:
Server Software: Apache/2.4.18
Server Hostname: your-website.com
Server Port: 80
Document Path: /
Document Length: 1024 bytes
Concurrency Level: 10
Time taken for tests: 3.517 seconds
Complete requests: 100
Failed requests: 0
Total transferred: 117000 bytes
HTML transferred: 102400 bytes
Requests per second: 28.42 [#/sec] (mean)
Time per request: 351.729 [ms] (mean)
Time per request: 35.173 [ms] (mean, across all concurrent requests)
Transfer rate: 32.44 [Kbytes/sec] received
To benchmark your website with Varnish Cache enabled, start the varnish service again and run the same command:
$ ab -n 100 -c 10 http://your-website.com/
This sends the same requests through Varnish, so after the first request for a URL the content can be served directly from the cache instead of being generated by the backend.
Server Software: Apache/2.4.18
Server Hostname: your-website.com
Server Port: 80
Document Path: /
Document Length: 1024 bytes
Concurrency Level: 10
Time taken for tests: 0.986 seconds
Complete requests: 100
Failed requests: 0
Total transferred: 117000 bytes
HTML transferred: 102400 bytes
Requests per second: 101.42 [#/sec] (mean)
Time per request: 98.577 [ms] (mean)
Time per request: 9.858 [ms] (mean, across all concurrent requests)
Transfer rate: 115.31 [Kbytes/sec] received
By comparing the response times of the two benchmarks, you can see the website with Varnish Cache enabled had a significantly higher request rate (101.42 requests per second compared to 28.42 requests per second) and a lower time per request (98.577 ms compared to 351.729 ms). This means that Varnish Cache improved the website's performance by serving cached content directly from memory, reducing the load on the server and improving page load times.
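The overall speedup can be quantified directly from the two mean times per request; a quick check with the numbers above:

```shell
# Ratio of mean time per request without vs. with Varnish
awk 'BEGIN { printf "speedup: %.1fx\n", 351.729 / 98.577 }'
# prints: speedup: 3.6x
```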
Conclusion
Varnish Cache is a great tool to make websites faster. It reduces the workload on the server and helps pages load faster. You can set it up easily to cache popular pages and serve them directly from memory, which gives your visitors a better experience. You can also use advanced settings and monitor performance to optimize it for your website. Running tests can help you see how much faster your website can be with Varnish Cache.