I have your average PHP app (running on a Windows server) with forms and data grids/lists, some of which require running pretty complex queries that I've optimized to the max; I doubt there's much more that can be done to make them run faster. I also don't have the option of changing the database structure, given other processes that depend on it. Since caching hasn't really been used much in the app, that seems to be the next logical step.
I recently read up on generational caching and came up with a decent mechanism to automate caching of queries in my app. My issue now is that I'm running into size limitations with both of the options that seemed like logical choices. WinCache limits you to a total of 85MB, which isn't going to cut it, and memcached limits an item to 1MB, which doesn't seem like much if a query returns a fairly large number of records with a lot of fields. To be exact, it seems memcached now lets you configure a larger maximum item size (the -I startup option), but the mere fact that 1MB is the default, and used to be the hard limit, makes me question what I'm trying to do.
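To make the question concrete, here is a minimal sketch of the kind of generational scheme I mean. Everything in it is illustrative, not my actual code: the CacheLike interface, ArrayCache (an in-memory stand-in so the sketch runs without a memcached server), and the function names are all made up for this example. With the real Memcached extension you'd pass a thin wrapper around Memcached::get/set/increment instead of ArrayCache.

```php
<?php
// Hypothetical minimal cache interface; get() returns false on a miss,
// increment() returns false if the key doesn't exist yet.
interface CacheLike {
    public function get(string $key);
    public function set(string $key, $value): void;
    public function increment(string $key);
}

// In-memory stand-in so the sketch is runnable without a server.
class ArrayCache implements CacheLike {
    private array $data = [];
    public function get(string $key) { return $this->data[$key] ?? false; }
    public function set(string $key, $value): void { $this->data[$key] = $value; }
    public function increment(string $key) {
        if (!isset($this->data[$key])) return false;
        return ++$this->data[$key];
    }
}

// Each table gets a generation counter that is baked into every query's
// cache key. Bumping the counter on a write "invalidates" all cached
// queries for that table at once: stale entries simply stop being read
// and eventually age out of memory.
function tableGeneration(CacheLike $c, string $table): int {
    $gen = $c->get("gen:$table");
    if ($gen === false) { $gen = 1; $c->set("gen:$table", $gen); }
    return (int)$gen;
}

function invalidateTable(CacheLike $c, string $table): void {
    // increment() fails if the counter was never set, so fall back to set().
    if ($c->increment("gen:$table") === false) $c->set("gen:$table", 1);
}

function cachedQuery(CacheLike $c, string $table, string $sql, callable $run) {
    $key = sprintf('q:%s:%d:%s', $table, tableGeneration($c, $table), md5($sql));
    $rows = $c->get($key);
    if ($rows === false) {
        $rows = $run($sql);      // hit the database only on a cache miss
        $c->set($key, $rows);
    }
    return $rows;
}
```

After any write you'd call invalidateTable() for the affected table, and every subsequent cachedQuery() against it misses and re-runs. The appeal is that nothing ever has to enumerate or delete the stale keys, which is exactly why I'd end up with many potentially large result sets sitting in memcached.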
The maximum number of records my grid allows at once is 1000, so that's the most that could ever land in the cache for one query (the number of fields per record varies, of course). I know a huge number of users would fill the cache quickly, but the number of concurrent users is usually modest, and from what I've read, when memcached runs out of memory it just evicts the least recently used items. So I don't see a big downside to storing larger data sets, unless eviction itself is expensive, which from what I've read it isn't.
So in summary: is it a bad idea to store larger data sets in memcached (and granted, I know I don't want to store a query with a million records in there)? And if it is a bad idea, what would be a good alternative for caching/improving performance when retrieving those data sets?