Super slooooooooooowwwww site

All topics related to functioning of this forum or MakeMKV website
privacy_user_mkv
Posts: 2
Joined: Wed Feb 11, 2026 4:40 am

Re: Super slooooooooooowwwww site

Post by privacy_user_mkv »

I also noticed the site being unresponsive. I actually work in tech on large websites. Maybe I could help!
Woodstock
Posts: 10841
Joined: Sun Jul 24, 2011 11:21 pm

Re: Super slooooooooooowwwww site

Post by Woodstock »

Possibly, but...

Part of the problem seems to be Cloudflare, and a big part seems to be that thousands of individual IPs sometimes attack the forum. For example, right now there are 3600 "guests" (users who aren't logged in) accessing the page. In terms of actual people, we're talking maybe 50 to 100; the rest are machines doing various things. That doesn't include the Google bot indexing the forum, which is relatively benign.

Cloudflare has some major screw-ups attributed to it from time to time, but the site itself runs on a shared hosting server with multiple clients to serve, so there is a limit to the resources any individual client can get.
kaysee
Posts: 95
Joined: Wed Apr 07, 2021 12:22 am

Re: Super slooooooooooowwwww site

Post by kaysee »

The Cloudflare error pages displayed over most of yesterday indicated that the connection through Cloudflare was fine, but the forum server was not responding. That suggests Cloudflare was responding normally, and probably fighting DDoS attacks as usual, while the forum server itself was having problems.
AstralWanderer
Posts: 10
Joined: Wed Feb 04, 2026 10:45 am

Re: Super slooooooooooowwwww site

Post by AstralWanderer »

Been encountering a cluster of Cloudflare 522 (connection to the origin timed out) and 525 (SSL handshake with the origin failed) errors recently, with their error page pointing the finger at forum.makemkv.com. Both codes indicate Cloudflare itself was reachable but the server behind it wasn't responding properly.

Given the large number of reported visitors, I had a look at makemkv's robots.txt file and, aside from comments, it's empty: no restrictions or limitations on search engine spiders or AI scrapers at all. There are presumably restrictions set via Cloudflare, but having some in robots.txt would reduce initial connection attempts, at least from well-behaved crawlers.

So cutting and pasting another robots.txt, say the example listed in "Using robots.txt in order to optimize your site performance and reduce your website load" (apologies if this seems somewhat spammy and WordPress-focused, but with the alternatives focused on SEO/marketing, this seemed the least worst option), could help matters significantly.
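To illustrate the idea, here is a minimal robots.txt sketch of the kind being suggested. The disallowed paths are assumptions based on a typical phpBB layout (not confirmed against this forum), and the effect depends entirely on crawlers choosing to honor the file:

```
# Hypothetical sketch, not this forum's actual configuration.
# Paths assume a standard phpBB install; adjust to the real URL layout.
User-agent: *
Crawl-delay: 10
Disallow: /search.php
Disallow: /memberlist.php
Disallow: /posting.php
Disallow: /ucp.php

# Ask common AI scrapers to stay away entirely
# (GPTBot and CCBot are real published user-agent tokens).
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```

Note that Crawl-delay is honored by some crawlers (e.g. Bing) but ignored by Google, and none of this binds a crawler that chooses not to read robots.txt.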
AstralWanderer
Posts: 10
Joined: Wed Feb 04, 2026 10:45 am

Re: Super slooooooooooowwwww site

Post by AstralWanderer »

Quick update: forum performance is back to normal, so whatever the problem was seems to have been fixed. Thanks.
zittrig
Posts: 301
Joined: Sat Jan 09, 2021 5:45 pm

Re: Super slooooooooooowwwww site

Post by zittrig »

Yesterday evening the forum was like a turn signal: works, doesn't work, works, doesn't work, works... :mrgreen:
Coopervid
Posts: 4320
Joined: Tue Feb 19, 2019 10:32 pm

Re: Super slooooooooooowwwww site

Post by Coopervid »

Same thing again now.
flojo
Posts: 338
Joined: Thu Jun 22, 2023 4:27 am
Location: El Paso

Re: Super slooooooooooowwwww site

Post by flojo »

robots.txt won't help; in fact, it never did. At least one Google bot always scrapes without identifying itself as a bot, though Google might decline to serve the results based on robots.txt (depending on advertising). Ever wonder how Google knows a site is relevant but still summarizes it as something like "This website has chosen not to show..."? The Google bot reads robots.txt, but Google could only know the site is relevant because Google scraped it anyway.

With everyone and their mother perpetually re-re-re-training AI with their own licensed agent software, I can only imagine how frustrating it is to admin a site. Of course, we all get to feel every website slowing down because of it, now and forever.