Quote Originally Posted by xero
I'm not making a mistake per se, I'm just not taking their environment into account. Even if they're in AWS and dynamically scaling and all that jazz, the simplest solution is still the same. There are just some additional wrinkles due to things like logical-to-physical hardware mappings, replication, and other wonderful software wackiness that will complicate things a bit.

The cache can be distributed. Sharding it on, say, a user hash would do the trick, with the hash mapping to a virtual node so that the physical node underneath can be replaced through some sort of magical rebalance/re-shard operation. They probably don't need to dynamically scale for this deployment, but I wouldn't be surprised if it's still set up to do so, since that's pretty common for web-based games. Having a couple of these games under their belt already, they probably design for scaling as a matter of standard procedure at this point... (it's a good feature to have available in case your player base suddenly explodes, or dwindles... which of course it does naturally over the course of each and every day)

Basically I was assuming they have storage/cache that is reliable and available for each player, and I'd say that's pretty reasonable. Considering you have an inventory, stamina and life counters, cards, and many other things stored/cached that are specific to you, just tack the results from the query onto the end of that. =)
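
For concreteness, here's a minimal sketch of the sharded, per-player cache the quote above describes (user hash -> virtual node -> physical node, with the query results tacked onto the existing per-player blob). Node names, counts, and data fields are all hypothetical:

[code]
import hashlib
import bisect

class ShardedPlayerCache:
    def __init__(self, physical_nodes, vnodes_per_node=64):
        # Many virtual nodes map onto each physical node, so a physical node
        # can be swapped out by reassigning only its virtual nodes
        # (the "rebalance/re-shard" step).
        self._ring = []  # sorted list of (vnode_hash, physical_node)
        self._stores = {node: {} for node in physical_nodes}
        for node in physical_nodes:
            for i in range(vnodes_per_node):
                bisect.insort(self._ring, (self._hash(f"{node}#vnode{i}"), node))

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def _node_for(self, user_id):
        # Walk clockwise around the ring to the first virtual node at or
        # after the user's hash; wrap to the start if we fall off the end.
        idx = bisect.bisect_left(self._ring, (self._hash(user_id), ""))
        if idx == len(self._ring):
            idx = 0
        return self._ring[idx][1]

    def get_player(self, user_id):
        return self._stores[self._node_for(user_id)].setdefault(user_id, {})

    def put_player(self, user_id, blob):
        self._stores[self._node_for(user_id)][user_id] = blob


# Usage: the per-player blob already holds inventory, stamina, cards, etc.;
# the matchmaking query result is just appended to the same entry.
cache = ShardedPlayerCache(["cache-a", "cache-b", "cache-c"])
player = cache.get_player("user-12345")
player.update({"inventory": ["potion", "sword"], "stamina": 80})
player["last_query_results"] = ["opponent-1", "opponent-2"]
cache.put_player("user-12345", player)
[/code]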
The problem is that for web services like this, the best approach is to move as far toward a REST architecture as possible, where you don't cache this kind of thing at all.
Caching, and sharing data between distributed parts of a server, are such complex problems that there are whole companies and professional groups dedicated to nothing else, and even they can't get it right well enough yet.
Believe me, caching trivial transactional data in a distributed web server is about as hellishly hard as it gets.
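
To show what I mean by the stateless alternative, here's a rough sketch (assuming SQLite as a stand-in for the authoritative store; table and column names are made up): every request recomputes its answer from persistent state, so there's no shared cache that has to be kept consistent across nodes.

[code]
import sqlite3

def get_match_candidates(db_path, user_id, limit=10):
    # Hit the database on every request; any web node can serve any user
    # because no per-user state lives on the node itself.
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT opponent_id, rating FROM match_pool "
            "WHERE rating BETWEEN "
            "  (SELECT rating - 100 FROM players WHERE id = ?) AND "
            "  (SELECT rating + 100 FROM players WHERE id = ?) "
            "ORDER BY RANDOM() LIMIT ?",
            (user_id, user_id, limit),
        ).fetchall()
    return [{"opponent_id": oid, "rating": r} for oid, r in rows]
[/code]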