Best strategy for caching WooCommerce products and ESI debugging

#1
Hi,

We are developing a WooCommerce webshop hosted on OpenLiteSpeed. We use the LiteSpeed Cache plugin for WordPress and have a QUIC.cloud subscription, which enables us to use ESI. We are trying to determine the best strategy for always serving up-to-date product and archive pages from the LiteSpeed cache. I will lay out the crucial points here:
  • We use a modern block theme, using Gutenberg blocks and (custom) shortcodes to render information on the frontend.
  • Product prices and availability are updated at least once a day by an external service that connects to our site's WooCommerce API.
  • We have different customer roles that should be served different prices.
  • We always want to serve fast pages from the cache.
The questions that I have are the following:
  1. What markup should I see in the page source when ESI placeholders are in place but not yet replaced with cached content? We are having trouble getting ESI to work with our blocks.
  2. How do we implement ESI for blocks like the stock WooCommerce Gutenberg price block, and for shortcodes? We use them to display price and availability respectively, and neither works as expected.
  3. Will an update of one or more products through the WooCommerce API purge the LiteSpeed cache?
    1. If so, does it purge only the applicable product page and archive page, or the whole site?
  4. If we want to make sure the cache for a product page is refreshed after it has been updated, can we simply do a backend cURL or wget request, and if so, do we need to send specific cookies or request headers?
    1. If we want to do this for multiple roles, so that each role's ESI block is also prepared in the cache, how do we do that?
  5. Can we refresh the cache of specific ESI blocks by ID or hash, without refreshing the entire parent page's cache?
    1. If so, how do we retrieve the ID or hash for this ESI block?
Our goal here is to avoid having to do this through AJAX, since that results in a jittery frontend. Since we run OpenLiteSpeed and can use ESI thanks to the QUIC.cloud subscription, we'd like to leverage that power.

Thanks in advance.
 
#3
Thanks for that link. Although I had already looked through the docs, I hadn't read this one yet. There are multiple sections on ESI in the documentation, so it is hard to know when you've read it all. Which of the questions do you consider WordPress-related?
 
#4
The docs you pointed to don't tell me how to refresh the cache for specific pages via cURL or wget. I tried that, but it doesn't work, probably because my request doesn't match the browser's request closely enough (the browser request does trigger a cache refresh). Could you tell me how to craft a cURL or wget request (or one with another CLI tool) that would rebuild the cache for a URL?
 
#5
A general curl command should warm up the cache, for example:

Code:
curl -sILk https://lscache.io/?abc | grep cache
x-qc-cache: miss

curl -sILk https://lscache.io/?abc | grep cache
x-qc-cache: hit
Since you are using WordPress, having LSCWP warm up the site regularly is also a solution.
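If several product URLs need warming after an update, a small loop along these lines might do it. This is only a sketch: the URLs are placeholders, and note that it issues plain GET requests, whereas the -I flag in the example above switches curl to HEAD requests.

Code:
# Sketch: warm a list of product URLs with GET requests and print the
# cache-related response headers. URLs are placeholders; the header to
# look for may be x-qc-cache (QUIC.cloud) or x-litespeed-cache (server).
for url in \
  "https://example.com/product-a/" \
  "https://example.com/product-b/"
do
  echo "== $url"
  # -D - prints the response headers, -o /dev/null discards the body,
  # so curl still sends a GET instead of the HEAD that -I would send.
  curl -skL -D - -o /dev/null "$url" | grep -i cache
done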

Do you have any example URLs?
 
#6
Hm, it seems you're right! Something must have been different in our tests; we will need to examine what, exactly. We do use the LSCWP crawler, but our products are updated so frequently that the crawler can't keep up, and we still have 'gaps' between a product update and the next crawler visit.

I see something strange: one of our URLs is

https://123keuzehulp.nl/verwarming/elektrische-vloerverwarming/elektrische-verwarmingsmat/

When I curl the URL twice from the command line, the first request produces a cache miss and the second a hit, as expected. But repeated requests through the browser (both Chrome and Firefox) keep producing cache misses.
 
#7
Right, the site is always a cache miss in my browser too, so it's not a curl problem. If you have a dev/staging site, try to bypass the CDN and use a simple rewrite rule to cache everything for, say, 10 seconds, and see whether the cache works in the browser. If it does, the server config is good and you need to check the site side. If you have to debug the issue on the production site, check the debug log; maybe something keeps varying or purging the cache.
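A rough sketch of such a temporary rule, assuming it goes in the site's .htaccess above the section the cache plugin manages and is removed after testing (this uses LiteSpeed's rewrite-rule cache-control syntax):

Code:
# Temporary test rule: cache every response for 10 seconds to verify that
# server-level caching works at all. Remove after testing.
<IfModule LiteSpeed>
RewriteEngine On
RewriteRule .* - [E=cache-control:max-age=10]
</IfModule>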
 
#8
@Majorlabel

1. Do not use curl without additional parameters to verify whether caching works (see the sketch after this list). This is especially true since you have the Guest Mode feature enabled, which needs JavaScript to be enabled in order to set the _lscache_vary cookie for Guest Mode.
2. Disable the Guest Mode feature for the time being, as this cookie is set twice but with different values. These different values may be the reason for the wrong cache behavior. By the way, the Guest Mode feature is also known as "cheat mode": it tries to cheat the PageSpeed score, but without any real performance benefit.
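As an illustration of point 1, a request with "additional parameters" could look like the sketch below: copy the _lscache_vary cookie value from the browser's developer tools (since it is set by JavaScript, curl never receives it on its own) and replay it. The URL and cookie value are placeholders.

Code:
# Sketch: replay the vary cookie from an existing browser session so the
# request is classified the same way as that browser. Paste the real
# cookie value from the browser's developer tools.
curl -skL -D - -o /dev/null \
  -b "_lscache_vary=PASTE_VALUE_FROM_BROWSER" \
  "https://example.com/some-page/" | grep -i cache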
 
#9
Right, the site is always a cache miss in my browser too, so it's not a curl problem. If you have a dev/staging site, try to bypass the CDN and use a simple rewrite rule to cache everything for, say, 10 seconds, and see whether the cache works in the browser. If it does, the server config is good and you need to check the site side. If you have to debug the issue on the production site, check the debug log; maybe something keeps varying or purging the cache.
We've disabled the CDN, but the problem persists. How would you implement a rewrite rule to do the 'simple caching'? At the virtual host level the .htaccess file is controlled by the LSCache plugin; would you just place a simple caching rule at the top, temporarily?
 
#11
OK, so the situation now is:
  • Repeated cURL requests don't cache at all; they keep returning the "x-litespeed-cache-control: public,max-age=604800" header and no LiteSpeed cache hit/miss header.
  • Repeated requests through the browser I also use to log in to the WordPress backend don't cache either; they keep returning "x-litespeed-cache-control: no-cache,esi=on" and no LiteSpeed cache hit/miss header.
  • Requests through Chromium (which I otherwise never use) or Firefox incognito do cache, and return a hit/miss header.
So it seems caching works in general, but some cases don't. Could you help pinpoint the issue? After that we can start looking at refreshing the cache on the fly.
 
#16
There are no fixed parameters as it depends on plugin settings. As long as you have guest mode enabled, curl won't work correctly.
So with guest mode enabled, each cURL request will be treated as a new guest because it doesn't send a session cookie?

And, question 2: do you have a set of frequently used request headers, and how do they map to the options of the LSCache plugin for WordPress? That way we can start experimenting.
 
#17
So with guest mode enabled, each cURL request will be treated as a new guest because it doesn't send a session cookie?
A cookie, yes, but not a session cookie. It's the _lscache_vary cookie, which has a unique value for each WP installation.

And, question 2: do you have a set of frequently used request headers, and how do they map to the options of the LSCache plugin for WordPress? That way we can start experimenting.
OLS and the cache plugin are open source, but not everything is free. If you don't know how LSCache works and which parameter is needed for which function, you don't know what you want to test. But everything can be read from the source code of the cache plugin.
 
#18
A cookie, yes, but not a session cookie. It's the _lscache_vary cookie, which has a unique value for each WP installation.

OLS and the cache plugin are open source, but not everything is free. If you don't know how LSCache works and which parameter is needed for which function, you don't know what you want to test. But everything can be read from the source code of the cache plugin.
I'm not sure what you mean by "not everything is free"; I assume you mean the knowledge you have? I will investigate for myself, then.
 
#19
I'm not sure what you mean by "not everything is free"; I assume you mean the knowledge you have? I will investigate for myself, then.
"Not free" means that you cannot expect to receive knowledge that others have invested a lot of time in acquiring.

The only important thing regarding your actual problem is that it was solved after you followed my advice.
 
#20
I do not expect anything. It is my experience, from contributing to open source myself, that the community thrives on sharing knowledge; it is time that is valuable. It is not up to me to decide what you want to share. If it is very important for you to believe you were the key to solving the problem, please do so. In reality, the cause was that the curl request sent a HEAD request, not a GET request. But believe whatever makes you happy, man.
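For reference, the difference is visible in the curl flags alone: -I makes curl send a HEAD request, while -D - combined with -o /dev/null keeps the default GET and still prints only the response headers (placeholder URL):

Code:
# HEAD request (what the earlier -sILk examples send)
curl -sIk "https://example.com/product/" | grep -i cache

# GET request that still shows only the response headers
curl -sk -D - -o /dev/null "https://example.com/product/" | grep -i cache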
 