caching - Varnish and ESI, how is the performance?


I'm wondering how the performance of the ESI module is nowadays. I've read posts on the web claiming that ESI performance on Varnish is slower than the real thing.

Say I had a page with 3500 ESI includes, how would it perform? Is ESI designed for such usage?

We're using Varnish and ESI to embed sub-documents into JSON documents. The response from our app server looks like this:

[
  <esi:include src="/station/best_of_80s" />,
  <esi:include src="/station/herrmerktradio" />,
  <esi:include src="/station/bluesclub" />,
  <esi:include src="/station/jazzloft" />,
  <esi:include src="/station/jahfari" />,
  <esi:include src="/station/maximix" />,
  <esi:include src="/station/ondalatina" />,
  <esi:include src="/station/deepgroove" />,
  <esi:include src="/station/germanyfm" />,
  <esi:include src="/station/alternativeworld" />
]
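For illustration, a wrapper like the one above could be produced as follows. This is a sketch, not our actual app server code; the idea is that the app only emits a JSON array skeleton of ESI include tags, and Varnish (with ESI processing enabled for the response) substitutes each tag with the cached JSON sub-document, yielding a complete, valid JSON array.

```python
# Sketch: emit a JSON array skeleton whose elements are ESI include
# tags. Varnish resolves each tag into the cached JSON sub-document.
stations = ["best_of_80s", "herrmerktradio", "bluesclub", "jazzloft"]
body = "[ " + ", ".join(
    f'<esi:include src="/station/{name}" />' for name in stations
) + " ]"
print(body)
```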

The included resources are complete and valid JSON responses on their own. The complete list of stations is 1070. When the cache is cold and the complete station list is the first thing requested, Varnish issues 1000 requests to our backend. When the cache is hot, ab looks like this:

$ ab -c 100 -n 1000 http://127.0.0.1/stations
[...]
Document Path:          /stations
Document Length:        2207910 bytes

Concurrency Level:      100
Time taken for tests:   10.075 seconds
Complete requests:      1000
Failed requests:        0
Write errors:           0
Total transferred:      2208412000 bytes
HTML transferred:       2207910000 bytes
Requests per second:    99.26 [#/sec] (mean)
Time per request:       1007.470 [ms] (mean)
Time per request:       10.075 [ms] (mean, across all concurrent requests)
Transfer rate:          214066.18 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        1   11   7.3      9      37
Processing:   466  971  97.4    951    1226
Waiting:        0   20  16.6     12      86
Total:        471  982  98.0    960    1230

Percentage of the requests served within a certain time (ms)
  50%    960
  66%    985
  75%    986
  80%    988
  90%   1141
  95%   1163
  98%   1221
  99%   1229
 100%   1230 (longest request)
$

100 req/sec doesn't look like much until you consider the size of the document: 214066 Kbytes/sec oversaturates a 1 Gbit interface easily.
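As a quick sanity check, this is pure arithmetic on the ab numbers above:

```python
# Back-of-the-envelope check on the ab output: 1000 requests of a
# ~2.2 MB document in ~10 s is more than a 1 Gbit link can carry.
total_bytes = 2_208_412_000   # "Total transferred" from ab
seconds = 10.075              # "Time taken for tests"
gbits_per_sec = total_bytes * 8 / seconds / 1e9
print(f"{gbits_per_sec:.2f} Gbit/s")  # ~1.75 Gbit/s, well above 1 Gbit
```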

A single request against the warm cache with ab (ab -c 1 -n 1 ...) shows 83 ms/req.

The backend is Redis based. We're measuring a mean response time of 0.9 ms [sic] in New Relic. After restarting Varnish, the first request against the cold cache (ab -c 1 -n 1 ...) shows 3158 ms/req. That means it takes Varnish and our backend about 3 ms per ESI include when generating the response. This is a standard Core i7 pizza box with 8 cores, measured while under full load. We're serving 150 million req/month this way with a hit rate of 0.9. These numbers do suggest that ESI includes are resolved serially.
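The ~3 ms figure follows directly from the numbers above:

```python
# Cold-cache first request divided across the full station list.
cold_first_request_ms = 3158  # ab -c 1 -n 1 against the cold cache
num_includes = 1070           # stations in the complete list
ms_per_include = cold_first_request_ms / num_includes
print(f"{ms_per_include:.1f} ms per ESI include")  # ~3.0 ms
```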

What you have to consider when designing such a system is 1) that your backend is able to take the load after a Varnish restart, when the cache is cold, and 2) that your resources don't all expire at once. In our case the stations would all expire every full hour, so we're adding a random value of up to 120 seconds to the expiration header.
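A minimal sketch of that jitter approach, assuming a hypothetical helper in the app server that computes the Expires header (the function name and structure are made up; only the next-full-hour expiry and the 120-second window come from the text above):

```python
import random
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def expires_header(now=None, jitter_seconds=120):
    """Expire at the next full hour plus a random offset of up to
    jitter_seconds, so not all resources expire at the same moment."""
    now = now or datetime.now(timezone.utc)
    next_hour = (now + timedelta(hours=1)).replace(
        minute=0, second=0, microsecond=0)
    expires = next_hour + timedelta(
        seconds=random.randint(0, jitter_seconds))
    return format_datetime(expires, usegmt=True)
```

Since each resource gets its own random offset, the 1070 station expirations are spread over a two-minute window instead of all missing at the top of the hour.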

Hope this helps.

