LScache crawlers causing error: Too many open files

oxcid

New Member
#1
Hi,

I'm using DO's 1-click WP OLS droplet. Every time the crawlers start running, I get these types of errors:

[INFO] [uds://tmp/lshttpd/lsphp.sock] Connection error: Too many open files, adjust maximum connections to 34!
[ERROR] [*:443] HttpListener::acceptConnection(): Accept failed:Too many open files!
[ERROR] [128.xxx.xxx.xx:45868#wordpress] [Module:Cache] createEntry failed.
[128.xxx.xxx.xx:45880#wordpress] [Module:Cache] createEntry failed for update stale.

Which in the end sometimes resulted in:

[NOTICE] There are 73 '503 Errors' in last 30 seconds, request a graceful restart

Which is fine when the restart succeeds; sometimes, however, the restart doesn't happen, and I get a lot more 503s. I have tried to increase the limit as suggested here (I even set 10x the suggested number):

https://docs.litespeedtech.com/cloud/wordpress/#how-do-i-fix-a-too-many-open-files-issue

But that only succeeded in "postponing" the errors, not fixing them completely. At first I got these errors every time the crawlers ran; now they only show up after 2-3 days and several crawler runs (I set the crawlers to run at an 8-hour interval; it's a frequently updated e-commerce site).
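For reference, the approach in that doc amounts to raising the per-user open-file (nofile) limit; a minimal sketch of the limits.conf entries, with illustrative values rather than the doc's exact numbers:

```
# /etc/security/limits.conf: raise the per-user open-file limit
# (values illustrative; re-login or reboot for them to take effect)
*    soft    nofile    327680
*    hard    nofile    327680
```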

Has anybody else experienced this, or does anyone have an idea what to do? I really don't want to restart the droplet every day.

Thanks in advance.
 

Cold-Egg

Administrator
#2
Hi @oxcid ,

Please try increasing the HTTP/HTTPS connection limits as well if they're set to a small number, e.g. set them to 10000 or more.
May I ask what your server specs are, and what your current PHP parameters are in the WebAdmin settings?
Also, what are the crawler settings in the LSCache plugin now? A screenshot or shared config would help.
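For reference, on a standard OLS layout (an assumption, not confirmed in this thread) the connection limits mentioned above map to the tuning block in the main config file, roughly like this:

```
# /usr/local/lsws/conf/httpd_config.conf: tuning section (illustrative values)
tuning  {
  maxConnections          10000
  maxSSLConnections       10000
}
```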

Best,
Eric
 

oxcid

New Member
#3
Hi @Cold-Egg ,

Thanks for replying. I've increased the "Max Connections" and "Max SSL Connections" to 20000 and 10000 respectively.

My server specs: 4 vCPUs, 8 GB RAM, 160 GB disk. I've also attached a screenshot of the lsphp parameters below (I hope this is what you need; let me know if I'm wrong).

For the crawlers settings, I'll get back to you on this when I get home.

Thanks.

[Attachment: Screenshot 2020-01-10 00.10.02.png]
 
#5
Hi,

please try the following.

For [ERROR] [*:443] HttpListener::acceptConnection(): Accept failed:Too many open files!

cat /proc/XXX/limits

where XXX is the PID of the OLS process; you can get it with ps aux | grep openlitespeed

and verify the actual limits it has.
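The two steps above can be combined into one sketch. This assumes a standard install where the parent process is named openlitespeed; the fallback to the current shell's PID is only there so the command runs on any Linux box:

```shell
# Find the OLS parent PID and show its effective open-file limit.
# If OLS isn't running, fall back to the current shell's PID so the
# command is still demonstrable.
pid=$(pgrep -o openlitespeed || echo $$)
grep 'Max open files' "/proc/${pid}/limits"
```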

For [INFO] [uds://tmp/lshttpd/lsphp.sock] Connection error: Too many open files, adjust maximum connections to 34!

in your screenshot, try increasing Max Connections and PHP_LSAPI_CHILDREN=35 from 35 to a higher number, like 40 or 50,


and see if it helps
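On a standard install (an assumption; the thread only shows the WebAdmin screenshot) those two settings live in the lsphp external-app definition, and the relevant lines look roughly like this:

```
# /usr/local/lsws/conf/httpd_config.conf, extprocessor lsphp:
# raise both values together so they stay in sync (illustrative values)
maxConns               40
env                    PHP_LSAPI_CHILDREN=40
```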
 
#6
Hi @lsqtwrk ,

Thanks so much for the tips, here's the output of cat /proc/XXX/limits :

Code:
Limit                     Soft Limit           Hard Limit           Units
Max cpu time              unlimited            unlimited            seconds
Max file size             unlimited            unlimited            bytes
Max data size             unlimited            unlimited            bytes
Max stack size            8388608              unlimited            bytes
Max core file size        unlimited            unlimited            bytes
Max resident set          unlimited            unlimited            bytes
Max processes             31803                31803                processes
Max open files            100000               100000               files
Max locked memory         16777216             16777216             bytes
Max address space         unlimited            unlimited            bytes
Max file locks            unlimited            unlimited            locks
Max pending signals       31803                31803                signals
Max msgqueue size         819200               819200               bytes
Max nice priority         0                    0
Max realtime priority     0                    0
Max realtime timeout      unlimited            unlimited            us

I think we're on to something here, as I see the "Max open files" value is 100000, while the output of ulimit -n is 327680, so there's a discrepancy. Do you know what causes this? I set the value through the /etc/security/limits.conf file.

For the Max Connections and PHP_LSAPI_CHILDREN, I've set them to 40.

Thanks.
 
#7
Well, you have 100k for open files; that should be more than enough.

ulimit -n shows the user's limit, not the process's limit; that's why they differ.

But whether it's 100k or 327k, it should be more than enough for a single process.
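To query a running process's limit directly (rather than the calling shell's), prlimit from util-linux can be used; shown here against the current shell as a stand-in for the OLS PID:

```shell
# ulimit -n reports the calling shell's limit; a daemon keeps whatever it
# inherited at startup. prlimit queries a specific PID's limits:
prlimit --nofile --pid $$   # substitute the OLS PID to check the server
```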

If you still see the issue, please try editing /etc/systemd/system.conf: find #DefaultLimitNOFILE=, change it to DefaultLimitNOFILE=327680, save the change, and reboot the server.
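That is, in /etc/systemd/system.conf the commented default becomes:

```
# /etc/systemd/system.conf: uncomment and set, then reboot to apply
DefaultLimitNOFILE=327680
```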

I don't think this is related, but please give it a shot.


Also, if you launched from the marketplace image, please make sure you have updated to the latest OLS version.

https://openlitespeed.org/kb/install-from-binary/

try 1.6.5 or 1.5.10
 
#8
Thanks @lsqtwrk! For now I'll keep an eye on the logs. It's been 3 days with the suggested setup and so far so good: not a single "Too many open files" error.

Currently using version 1.5.10. I did get notified about the new version 1.6.5, but I think I'll wait until 1.6.x becomes available in the repo.

Thanks again!
 