How can we cache medium/large data sets in PHP?

0 votes
105 views
asked in Programming by jack (1,440 points)

I have your average PHP app (running on a Windows server) with forms and data grids/lists. Some of them require running pretty complex queries that I've optimized to the max, and I doubt there's much more that can be done to make them run faster. I also don't have the option of changing the database structure, given other processes that depend on it. Since caching hasn't really been used much in the app, that seems to be the next logical step.

I recently read up on generational caching and came up with a decent mechanism to automate caching of queries in my apps. My issue now is that I'm running into size limitations for both options that appeared to be logical choices. WinCache limits you to a total of 85MB, which isn't going to cut it, and memcached limits an item to 1MB, which doesn't seem like much if you have a query that returns a fairly large number of records with a lot of fields. OK, to be exact, it seems memcached now lets you configure a larger item size, but the mere fact that it defaults to 1MB and used to be a hard limit makes me question what I'm trying to do.
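For reference, my generational-caching helper looks roughly like this (a simplified sketch; the class, method, and key names are my own, not from any library). Each table gets a "generation" counter in the cache, and every query cache key embeds the current generation, so bumping the counter invalidates all cached queries for that table at once:

```php
<?php
class QueryCache
{
    private $mc; // expects a Memcached-compatible object (get/set/increment)

    public function __construct($mc)
    {
        $this->mc = $mc;
    }

    // Current generation for a table; initialize to 1 on first use.
    private function generation(string $table): int
    {
        $gen = $this->mc->get("gen:$table");
        if ($gen === false) {
            $gen = 1;
            $this->mc->set("gen:$table", $gen);
        }
        return (int) $gen;
    }

    // Call from any code path that writes to $table: bumping the
    // generation orphans every cached query key for that table.
    public function invalidate(string $table): void
    {
        if ($this->mc->increment("gen:$table") === false) {
            $this->mc->set("gen:$table", 1);
        }
    }

    // Return cached rows for the query, or run it and cache the result.
    public function remember(string $table, string $sql, array $params, callable $runQuery, int $ttl = 300)
    {
        $key = sprintf('q:%s:%d:%s', $table, $this->generation($table), md5($sql . serialize($params)));
        $rows = $this->mc->get($key);
        if ($rows === false) {
            $rows = $runQuery();
            $this->mc->set($key, $rows, $ttl);
        }
        return $rows;
    }
}
```

The orphaned entries are never explicitly deleted; they just stop being requested and eventually get evicted, which is what makes the scheme cheap.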

The maximum number of records my grids allow returning at once is 1000, so that's the most that could get stored in a single cache entry (fields per record vary, of course). I know a huge number of users would probably fill the cache quickly, but the number of concurrent users is usually modest, and from what I've read, when memcached runs out of memory it just evicts the least recently used items. So I don't see a big downside to storing larger data sets, unless the operation is very expensive, and from what I've read that doesn't seem to be the case.

So in summary, what I'm wondering is whether it's a bad idea to store larger data sets in memcached (and granted, I know I don't want to store a query with a million records in there). And if it is a bad idea, what would be a good alternative for caching or otherwise speeding up retrieval of those data sets?
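For what it's worth, one workaround I've seen for the per-item size cap is to split the serialized result set into fixed-size chunks stored under numbered keys, plus an index entry recording the chunk count. A rough sketch (helper names are mine; this assumes the ext-memcached get/set API):

```php
<?php
// Store $rows as numbered chunks under "$key:0", "$key:1", ...
// plus "$key:count" so the reader knows how many pieces to fetch.
function cache_set_chunked($mc, string $key, array $rows, int $ttl = 300, int $chunkBytes = 900 * 1024): bool
{
    $blob = serialize($rows);
    $chunks = str_split($blob, $chunkBytes);
    foreach ($chunks as $i => $chunk) {
        if (!$mc->set("$key:$i", $chunk, $ttl)) {
            return false; // a chunk failed; treat the whole entry as uncached
        }
    }
    return $mc->set("$key:count", count($chunks), $ttl);
}

// Reassemble the chunks; return null on any miss so the caller
// falls back to the database.
function cache_get_chunked($mc, string $key): ?array
{
    $count = $mc->get("$key:count");
    if ($count === false) {
        return null;
    }
    $blob = '';
    for ($i = 0; $i < $count; $i++) {
        $chunk = $mc->get("$key:$i");
        if ($chunk === false) {
            return null; // a chunk was evicted mid-set
        }
        $blob .= $chunk;
    }
    $rows = unserialize($blob);
    return is_array($rows) ? $rows : null;
}
```

The obvious caveat is that chunks can be evicted independently, so a partial miss has to be treated as a full miss.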

1 Answer

0 votes
answered by PRASHANT PATEL (300 points)
You haven't mentioned which database you are using.

Maybe you should try caching those records by encoding them as JSON, then decoding them from JSON when you retrieve them.
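A minimal sketch of that idea (the variable names are just for illustration). Note that `json_decode` should be called with the `$assoc` flag set to `true` so you get associative arrays back rather than `stdClass` objects:

```php
<?php
$rows = [['id' => 1, 'name' => 'Ann'], ['id' => 2, 'name' => 'Bob']];
$payload = json_encode($rows);           // store this string in the cache
$restored = json_decode($payload, true); // true => associative arrays back
```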

If you can, try using MongoDB.

