Search results

  1. Object cache Redis causing reset password link invalid

    Hi @Cold-Egg , Thanks for the reply; sorry, I forgot to mention that I use this plugin: https://github.com/theme-my-login/theme-my-login/releases/tag/v6.4.16 Not sure if this is why you can't reproduce the issue on your server. I've tried your suggestions, disabling "Persistent Connection"...
  2. Object cache Redis causing reset password link invalid

    Hi, I'm using DigitalOcean's 1-click WordPress OLS droplet. When a new user registers, they need to reset their password, and the reset link is sent by email. The problem is that this link is always invalid when opened. I noticed that if I wait exactly 6 minutes from the time of...
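A quick way to test whether the object cache is serving a stale copy of the reset key is to flush it right after requesting the link; if a fresh link then works, the cache is the culprit. This is a hedged sketch only — the presence of wp-cli and a default Redis setup on the droplet are assumptions, not something confirmed in the thread:

```shell
# Hypothetical check: request a reset link, then flush the object cache
# before opening it. Both tools are assumed to exist on the droplet.
wp cache flush        # flush WordPress's object cache via wp-cli
# or clear the Redis database backing the object cache directly:
redis-cli FLUSHDB
```

If the link works after a flush but not before, the cached user record (rather than the link itself) is what is stale.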
  3. LScache crawlers causing error: Too many open files

    Thanks @lsqtwrk , really appreciate the help! Will try that if the file numbers keep growing.
  4. LScache crawlers causing error: Too many open files

    That's also my thought on this. Currently the file count in cachedata is 9877, but this morning, assuming each directory contained 5000 files, cachedata should have contained around 80000 files (and that's a low estimate). I thought about creating a cronjob to clear that up, but I noticed these...
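The cron-job idea floated in that post can be sketched with a find-based cleanup. The real target would be the cachedata directory, and the one-day retention window is an assumption for illustration; the demo below runs against a throwaway directory so it is safe to try:

```shell
# Delete cache files older than one day under a target directory.
# On the droplet the target would be something like
# /usr/local/lsws/cachedata/priv; here a temp dir stands in for it.
target=$(mktemp -d)
touch "$target/fresh"
touch -d "2 days ago" "$target/stale"   # backdate one file (GNU touch)
find "$target" -xdev -type f -mtime +1 -delete
ls "$target"                            # only "fresh" remains
rm -rf "$target"
```

Scheduled from cron, a line like this would run it nightly against the real path once the retention window is agreed on.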
  5. LScache crawlers causing error: Too many open files

    Hi @lsqtwrk , From "/usr/local/lsws/cachedata/priv" I run: find . -xdev -type f | cut -d "/" -f 2 | sort | uniq -c | sort -n and got this response (file count per subdirectory): 525 f, 533 3, 538 d, 552 b, 555 c, 565 5, 569 4, 571 a, 572 e, 585 6, 588 0, 593 8, 595 1, 616 9...
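The pipeline in that post counts files per first-level subdirectory: cut -d "/" -f 2 takes the directory name out of each ./dir/file path, uniq -c counts the repeats, and the final sort -n orders by count. A self-contained demo of the same pipeline against a scratch tree:

```shell
# Reproduce the per-subdirectory file count from the post on a scratch
# tree; on the droplet it is run from /usr/local/lsws/cachedata/priv.
tmp=$(mktemp -d)
mkdir "$tmp/a" "$tmp/b"
touch "$tmp/a/1" "$tmp/a/2" "$tmp/a/3" "$tmp/b/1"
cd "$tmp"
find . -xdev -type f | cut -d "/" -f 2 | sort | uniq -c | sort -n
# expected counts: 1 for b, 3 for a (uniq -c pads the numbers)
cd / && rm -rf "$tmp"
```

-xdev keeps find from crossing filesystem boundaries, which matters when cachedata sits on its own mount.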
  6. LScache crawlers causing error: Too many open files

    Just an update: after about 2 weeks of smooth sailing, today I got the same error as I mentioned at the start, Too many open files, and the server restarted gracefully. This error happened around the time the crawler was about to finish its run, maybe 2-3 minutes into it. I've updated to OLS...
  7. LScache crawlers causing error: Too many open files

    Thanks @lsqtwrk , for now I'll keep an eye on the logs. It's been 3 days with the suggested setup, and so far so good: not a single "Too many open files" error. Currently using version 1.5.10; I do get notified of the new version 1.6.5, but I think I'll wait until 1.6.x becomes available at...
  8. LScache crawlers causing error: Too many open files

    Hi @lsqtwrk , Thanks so much for the tips, here's the output of cat /proc/XXX/limits (columns: Limit, Soft Limit, Hard Limit, Units): Max cpu time unlimited unlimited seconds; Max file size unlimited unlimited...
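For anyone reading along, the per-process limit being inspected there can be checked and adjusted like this; the PID and the 4096 value below are placeholders for illustration, not settings from the thread:

```shell
# Read the open-files limit of a process (here: the current shell).
# On the droplet you would substitute the litespeed PID for $$.
grep "Max open files" /proc/$$/limits
# Show and (within the hard limit) raise the soft limit for this shell:
ulimit -n
ulimit -n 4096 2>/dev/null && ulimit -n
```

The soft limit is what triggers "Too many open files"; the hard limit is the ceiling an unprivileged process may raise it to.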
  9. LScache crawlers causing error: Too many open files

    Hi @Cold-Egg , Crawler settings below. Thanks.
  10. LScache crawlers causing error: Too many open files

    Hi @Cold-Egg , Thanks for replying. I've increased "Max Connections" and "Max SSL Connections" to 20000 and 10000 respectively. My server spec: 4 vCPUs, 8GB RAM / 160GB disk. I've also attached a screenshot of the lsphp parameters below (I hope this is what you need; let me know if I'm wrong). For...
  11. LScache crawlers causing error: Too many open files

    Hi, I'm using DO's 1-click WP OLS droplet. Every time the crawlers start running, I get these types of errors: [INFO] [uds://tmp/lshttpd/lsphp.sock] Connection error: Too many open files, adjust maximum connections to 34! [ERROR] [*:443] HttpListener::acceptConnection(): Accept failed:Too many...
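Errors like these mean the server process is exhausting its file-descriptor limit. A hedged sketch of raising that limit persistently, assuming OLS runs as a systemd service named lsws (verify the actual unit name on your droplet before applying; the 65535 value is likewise an assumption):

```shell
# Create a systemd drop-in that raises the open-file limit for OLS.
# The unit name "lsws" and the 65535 value are assumptions.
sudo mkdir -p /etc/systemd/system/lsws.service.d
printf '[Service]\nLimitNOFILE=65535\n' | \
    sudo tee /etc/systemd/system/lsws.service.d/nofile.conf
sudo systemctl daemon-reload
sudo systemctl restart lsws
```

A drop-in survives package upgrades, unlike editing the unit file in place; cat /proc/PID/limits afterward confirms the new limit took effect.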