Another thing I've noticed is that sometimes the main forum page loads in fits. Most of it will load, and then it pauses.
One of the last things on the page is the part at the bottom where it says "Most Online Today: X. Most Online Ever: Y" and gives a date (July 2, 2007).
If that information is cached in some way, then it doesn't much matter to performance. But if it is dynamically checked every single time the front page loads, then I can see that being an extremely database-intensive function on a database that's around a decade old (lots of days to compare and calculate the busiest).
It might be worth commenting that part out of the PHP script to turn it off.
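Short of removing it entirely, another cheap mitigation would be to cache the computed value for a few minutes so the query runs at most once per interval. The forum runs on PHP, so this is just an illustration of the idea in Python; the function names and the five-minute TTL are assumptions:

```python
import time

_cache = {}  # key -> (value, timestamp)

def cached(key, compute, ttl=300):
    """Return a cached value, recomputing only when older than ttl seconds."""
    now = time.time()
    if key in _cache:
        value, stamp = _cache[key]
        if now - stamp < ttl:
            return value
    value = compute()
    _cache[key] = (value, now)
    return value

# Hypothetical stand-in for the expensive "most online ever" query.
def most_online_ever():
    return 314

print(cached("most_online_ever", most_online_ever))
```

With a wrapper like this, even a genuinely slow query would hit the database once every five minutes instead of on every front-page load.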
Calculating the "Most Online Today" and "Most Online Ever" isn't really demanding: there are 2,041 members on the forum. Going through 2,041 entries and counting one field to get the number of online users, then storing that value in a table, is a matter of milliseconds. That is clearly not a source of lag.
The query to count the number of users online from the DB would be something like:
SELECT COUNT(onlineStatus) FROM user WHERE onlineStatus = 1
A query like this going through 2,041 users is a joke.
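To put a rough number on that claim, here is a quick sqlite3 sketch (the table layout is assumed for illustration, not the forum's actual schema) that times the count over 2,041 rows:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user (id INTEGER PRIMARY KEY, onlineStatus INTEGER)")
# 2,041 users, every 50th one flagged as online (41 in total).
conn.executemany("INSERT INTO user (onlineStatus) VALUES (?)",
                 [(int(i % 50 == 0),) for i in range(2041)])
conn.commit()

start = time.perf_counter()
(count,) = conn.execute(
    "SELECT COUNT(*) FROM user WHERE onlineStatus = 1").fetchone()
elapsed = time.perf_counter() - start
print(count, f"{elapsed * 1000:.3f} ms")
```

Even as an unindexed full scan, a count over a couple of thousand rows completes in well under a millisecond on any modern machine.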
Anyway, the problem seems to be solved for now, until the number of posts gets too high again.
I said it *could* be demanding because you can't know without seeing how the query is written or which table indices exist. The query you provide assumes there is a field indicating online status. Do we know that exists? No, but suppose it does. Then that in itself creates load.
If such a field exists and is as accurate as you portray it, then 2,041 users must be updated to reflect whether they are online or not. We know the timeframe is "in the last 15 minutes," so how do you do that? Again, there are good and bad ways to keep that info accurate.
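One common approach (and this is a guess at the design, with column names invented for the sketch) avoids toggling a flag entirely: stamp each user's row with a last-activity time on every request, then count rows inside the 15-minute window. With an index on the timestamp, the count is a cheap range scan:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user (id INTEGER PRIMARY KEY, lastActive REAL)")
conn.execute("CREATE INDEX idx_lastActive ON user (lastActive)")

now = time.time()
# Simulate 2,041 users; 37 of them were active within the last 15 minutes,
# the rest were last seen a day ago.
rows = [(now - 60,) if i < 37 else (now - 86400,) for i in range(2041)]
conn.executemany("INSERT INTO user (lastActive) VALUES (?)", rows)
conn.commit()

# "Online" = any activity in the last 15 minutes (900 seconds).
(online,) = conn.execute(
    "SELECT COUNT(*) FROM user WHERE lastActive > ?", (now - 900,)).fetchone()
print(online)
```

The per-request cost is a single indexed UPDATE of one row, and stale sessions fall out of the count automatically with no cleanup job.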
In any case, however, that is not the part I mentioned as being problematic (unless it's cached). The trickier part is "most online ever." A simple flag such as you posit does not allow that to be calculated. Again, there are faster ways of keeping track of such information and slower ways. Without knowing the internals of the database it's speculation as to whether this value is calculated in an efficient method or not. Two thousand users is not a lot, but when you get to 2,000 users x 365 days x a dozen years, you could conceivably be dealing with almost 9 million records to check. (Besides table scans, something that can crush a database is a JOIN query because of the multiplicative growth involved.)
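The fast way to keep "most online ever" is a running maximum updated whenever the current count is taken, which is O(1) per page load instead of a scan over years of history. A sketch of the idea (the stats table and names are assumptions, not the forum software's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stats (name TEXT PRIMARY KEY, value INTEGER)")
conn.execute("INSERT INTO stats VALUES ('most_online_ever', 0)")

def record_online_count(current_online):
    """Bump the stored maximum only when the current count exceeds it."""
    conn.execute(
        "UPDATE stats SET value = ? "
        "WHERE name = 'most_online_ever' AND value < ?",
        (current_online, current_online))

# Simulate three page loads with varying online counts.
for n in [12, 57, 31]:
    record_online_count(n)

(best,) = conn.execute(
    "SELECT value FROM stats WHERE name = 'most_online_ever'").fetchone()
print(best)
```

If the software instead recomputes the maximum from historical session records on every front-page load, that is exactly the kind of millions-of-rows scan that would explain the pause.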
Frankly, I don't see why this forum should have problems when nowadays fewer than 50 people are on it at any given time. More guests (likely search bots) are on it than actual users. One way of improving performance, therefore, might be to block all guest access.
The image-heavy threads that refer to external sources for their data can also hold a Web server process for a long time, though they shouldn't directly impact the speed of queries.
The reason I offered the footer as a possible problem is as I explained: from observation. I have noticed many times that the first part of the forum loads quickly and then it hangs. I looked at what wasn't loaded yet to see what sorts of queries might still be loading at that point. IMO knowing the most people ever online is fairly useless as it hasn't changed in almost 10 years. So *if* it's a burden to calculate--which I conceded I don't know--I'd punt it.
FWIW, I don't consider myself a database expert, but I have managed databases with 200 gigs of data and 200,000 users before, 40,000 of whom were still active. It's difficult to guess the causes of a problem in a black-box situation because we don't even know the resources available to the forum. As was demonstrated last week, if you limit the resources sufficiently (i.e., to one process), then even 20 simultaneous users are miserable.