Varnish is a reverse caching proxy[2] used as an HTTP accelerator for content-heavy dynamic web sites as well as APIs. In contrast to other web accelerators, such as Squid, which began life as a client-side cache, or Apache and nginx, which are primarily origin servers, Varnish was designed from the start as an HTTP accelerator. Varnish focuses exclusively on HTTP, unlike other proxy servers that often also support FTP, SMTP, and other network protocols.
Developer(s) | Poul-Henning Kamp, Redpill-Linpro, Varnish Software
Stable release | 7.5.0[1] / 18 March 2024
Written in | C
Operating system | BSD, Linux, Unix
Type | HTTP accelerator
License | two-clause BSD license
Website | varnish-cache
History
The project was initiated by the online branch of the Norwegian tabloid newspaper Verdens Gang. The architect and lead developer is Danish independent consultant Poul-Henning Kamp[2] (a well-known FreeBSD developer), with management, infrastructure and additional development originally provided by the Norwegian Linux consulting company Linpro. The support, management and development of Varnish was later spun off into a separate company, Varnish Software.
Varnish is free and open-source software, available under a two-clause BSD license. Commercial support is available from Varnish Software, amongst others.
Version 1.0 of Varnish was released in 2006,[3][4] Varnish 2.0 in 2008,[5] Varnish 3.0 in 2011,[6] Varnish 4.0 in 2014,[7] Varnish 5.0 in 2016,[8] Varnish 6.0 in March 2018,[9] and Varnish 7.0 in September 2021.[10]
Architecture
Varnish stores data in virtual memory and leaves the task of deciding what is stored in memory and what gets paged out to disk to the operating system. This helps avoid the situation where the application writes cached data to disk while the operating system keeps its own copy of the same data in memory, effectively caching it twice.
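This behaviour follows from how the daemon is started: the cache store is selected with the -s option when varnishd is launched. A minimal sketch is shown below; the listen address, backend address, and sizes are placeholders, not recommendations.

```
# Memory-backed (malloc) store: objects live in virtual memory and the
# kernel decides what, if anything, gets paged out.
varnishd -a :6081 -b 127.0.0.1:8080 -s malloc,1G

# File-backed store: the file is mapped into virtual memory, so paging
# is still left to the operating system rather than to Varnish itself.
# varnishd -a :6081 -b 127.0.0.1:8080 -s file,/var/lib/varnish/cache.bin,10G
```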
Varnish is heavily threaded, with each client connection handled by a separate worker thread. When the configured limit on the number of active worker threads is reached, incoming connections are placed in an overflow queue; when that queue in turn reaches its configured limit, further incoming connections are rejected.
The principal configuration mechanism is Varnish Configuration Language (VCL), a domain-specific language (DSL) used to write hooks that are called at critical points in the handling of each request. Most policy decisions are left to VCL code, making Varnish more configurable and adaptable than most other HTTP accelerators.[citation needed] When a VCL script is loaded, it is translated to C, compiled to a shared object by the system compiler, and loaded directly into the accelerator, which can thus be reconfigured without a restart.
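A minimal VCL sketch of such hooks is shown below; the backend address and the /admin URL pattern are illustrative assumptions rather than part of any shipped configuration.

```
vcl 4.0;

# Origin server that Varnish accelerates (address is a placeholder).
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

# Hook run when a client request has been received.
sub vcl_recv {
    # Never cache the (hypothetical) admin area; pass it to the backend.
    if (req.url ~ "^/admin") {
        return (pass);
    }
}

# Hook run after a response has been fetched from the backend.
sub vcl_backend_response {
    # If the backend did not supply a positive TTL, cache for two minutes.
    if (beresp.ttl <= 0s) {
        set beresp.ttl = 2m;
    }
}
```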
A number of run-time parameters control settings such as the minimum and maximum number of worker threads and various timeouts. A command-line management interface allows these parameters to be modified, and new VCL scripts to be compiled, loaded, and activated, without restarting the accelerator.
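For illustration, the management interface can be driven with the varnishadm tool; the parameter value, configuration name, and file path below are arbitrary examples.

```
# Inspect and change a run-time parameter while the accelerator is running.
varnishadm param.show thread_pool_max
varnishadm param.set thread_pool_max 1000

# Compile, load, and activate a new VCL configuration without a restart.
varnishadm vcl.load newconf /etc/varnish/default.vcl
varnishadm vcl.use newconf
```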
To keep the number of system calls in the fast path to a minimum, log data is written to shared memory, and the task of monitoring, filtering, formatting, and writing log data to disk is delegated to a separate application.
Performance metrics
Varnish Cache can speed up information delivery by a factor of several hundred. To ensure proper operation and performance, Varnish exposes metrics that can be monitored in the following areas:[11]
- Client metrics: client connections and requests
- Cache performance: cache hits, evictions
- Thread metrics: thread creation, failures, queues
- Backend metrics: success, failure, and health of backend connections
Metric collection
Varnish Cache ships with monitoring and logging tools. One of the most commonly used is varnishstat, which gives a detailed snapshot of Varnish's current performance. It provides access to in-memory statistics such as cache hits and misses, resource consumption, threads created, and more.[12]
varnishstat
Running varnishstat from the command line returns a continuously updating list of all available Varnish metrics. If the -1 flag is added, varnishstat exits after printing the list once.[13] Varnishstat can be used as a standalone tool to spot-check the health of the cache. To graph metrics over time, correlate them with other statistics from across an infrastructure, and set up alerts, monitoring services can integrate with Varnish and collect varnishstat metrics.
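As a small illustration (the MAIN.* counter names reflect the naming used by recent Varnish versions), varnishstat can be invoked as follows:

```
# Continuously updating view of all counters (interactive).
varnishstat

# Print every counter once and exit; convenient for scripts and collectors.
varnishstat -1

# Print only selected counters, here cache hits and misses.
varnishstat -1 -f MAIN.cache_hit -f MAIN.cache_miss
```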
varnishlog
Varnishlog is a tool that can be used to debug or tune Varnish's configuration, as it provides detailed information about each individual request.
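For example, varnishlog's query option (-q) can restrict output to transactions of interest; the URL pattern and status threshold below are illustrative.

```
# Full transaction logs for requests whose URL starts with /api.
varnishlog -q 'ReqURL ~ "^/api"'

# Only transactions that ended in a 5xx response.
varnishlog -q 'RespStatus >= 500'
```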
Load balancing
Varnish supports load balancing using both a round-robin and a random director, each with per-backend weighting. Basic health checking of backends is also available.[14]
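A round-robin director combined with a simple health probe might be declared as in the sketch below; the backend addresses, probe URL, and timing values are illustrative assumptions.

```
vcl 4.0;

import directors;

# A backend is considered healthy when at least 3 of the last 5 probes
# of /health succeeded (all values here are examples).
probe health {
    .url = "/health";
    .interval = 5s;
    .timeout = 1s;
    .window = 5;
    .threshold = 3;
}

backend web1 { .host = "192.0.2.10"; .port = "8080"; .probe = health; }
backend web2 { .host = "192.0.2.11"; .port = "8080"; .probe = health; }

sub vcl_init {
    # A weighted random director could be created instead with
    # directors.random() and add_backend(web1, 10.0).
    new pool = directors.round_robin();
    pool.add_backend(web1);
    pool.add_backend(web2);
}

sub vcl_recv {
    # Pick the backend for this request from the director.
    set req.backend_hint = pool.backend();
}
```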
Other features
Varnish Cache also features:
- Plugin support with Varnish Modules, also called VMODs[15]
- Support for Edge Side Includes (ESI), including stitching together compressed ESI fragments
- gzip compression and decompression
- DNS, random, hashing, and client-IP-based directors
- HTTP streaming pass and fetch
- Experimental support for persistent storage, without LRU eviction
- Saint[16] and Grace[17][18] modes (see the VCL sketch after this list).
- If a backend malfunctions and, for example, returns HTTP status code 500, Grace mode allows Varnish to ignore expiry headers and continue serving cached versions of content. Saint mode is used with load balancing: a failing backend is quarantined for a period and excluded from the server pool.
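In recent Varnish versions, grace is set per object from VCL; a minimal sketch (the six-hour window and backend address are arbitrary examples) looks like this:

```
vcl 4.0;

backend default { .host = "127.0.0.1"; .port = "8080"; }

sub vcl_backend_response {
    # Keep objects for six hours beyond their TTL; within that grace
    # window a stale copy can be served, for instance while the backend
    # is unhealthy or a fresh copy is being fetched in the background.
    set beresp.grace = 6h;
}
```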
See also
- Web accelerator, which discusses host-based HTTP acceleration
- Proxy server, which discusses client-side proxies
- Reverse proxy, which discusses origin-side proxies
- Comparison of web server software
- Internet Cache Protocol
- Guru Meditation – an error message used by Varnish
References
edit- ^ "Releases & Downloads". Retrieved 12 September 2024.
- ^ a b Feryn, Thijs. "1. What Is Varnish Cache? - Getting Started with Varnish Cache [Book]". O'Reilly Media. Retrieved 2023-10-22.
- ^ "Catalyst Advent Calendar - Day 14". www.catalystframework.org. Archived from the original on July 22, 2012. Retrieved Sep 4, 2020.
- ^ Smørgrav, Dag-Erling (Sep 20, 2006). "Varnish 1.0 released". Retrieved Sep 4, 2020.
- ^ Heen, Tollef Fog (Oct 15, 2008). "Varnish 2.0 released!". Retrieved Sep 4, 2020.
- ^ Heen, Tollef Fog (Jun 17, 2011). "Varnish 3.0.0 released". Retrieved Sep 4, 2020.
- ^ Karstensen, Lasse (Apr 10, 2014). "Varnish 4.0.0 released". Retrieved Sep 4, 2020.
- ^ "Varnish Cache 5.0.0 — Varnish HTTP Cache". varnish-cache.org. Retrieved Sep 4, 2020.
- ^ Poul-Henning Kamp (15 March 2018). "Varnish 6.0 Released". Retrieved 15 May 2018.
- ^ Poul-Henning Kamp (9 August 2022). "Varnish Cache 7.0.0 released". Retrieved 9 August 2022.
- ^ "Top Varnish performance metrics". Top Varnish performance metrics. Jul 28, 2015. Retrieved Sep 4, 2020.
- ^ "How to collect Varnish metrics". How to collect Varnish metrics. Jul 28, 2015. Retrieved Sep 4, 2020.
- ^ "varnishstat(1): HTTP accelerator statistics - Linux man page". linux.die.net. Retrieved Sep 4, 2020.
- ^ "BackendPolling – Varnish". Varnish-cache.org. Archived from the original on 2010-08-21. Retrieved 2014-07-18.
- ^ "VMODs Directory (Varnish Modules and Extensions) | Varnish Community". Varnish-cache.org. Retrieved 2014-07-18.
- ^ "Saint Mode". Varnish. Archived from the original on 7 May 2011.
- ^ "Grace Mode". Varnish. Archived from the original on 9 May 2011.
- ^ Feryn, Thijs (2017). Getting Started with Varnish Cache: Accelerate Your Web Applications. O'Reilly Media, Inc. p. 85. ISBN 9781491972229.
External links
- Official website
- Official commercial web site
- Notes from the Architect
- "You're Doing It Wrong", June 11, 2010 ACM Queue article by Varnish developer Poul-Henning Kamp describing the implementation of the LRU list.
- Varnish in Layman's Terms
- Varnish Cache How-To