Imagine this (the names are made up and don't apply to any real situation; they're used only for the purposes of explanation):
There's a server called daniweb.com; it's an API server. You can get quite a chunk of data out of it. But for some reason, whoever commissioned the project demands that we request data from the API through 26 different servers to spread the load. So both joule.blabla.com and even zulu.blabla.com have to contact the daniweb.com systems and request the required data. First from the Alpha server, then Beta, Charlie, Delta, etc. until Zulu, and then back to Alpha: a simple rotation through the letters.
So when Jack requests the data as the very first user, his query goes to the Alpha server. Then Elise requests the data right after Jack, and her request goes through Beta, and so on. After some time Jonathan requests data and his request goes to the Zulu server (all the others have been used), then Sally's goes back to Alpha, etc. It would be rather easy to store the current position in a file or a database record somewhere and just advance it each time someone requests data from the API.
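The rotation above can be sketched like this (a minimal sketch; the hostnames are the made-up ones from the example, and in a real app the position would be persisted in a file or database record rather than a module-level variable):

```python
# Hypothetical rotation servers, using the made-up blabla.com names.
SERVERS = [name + ".blabla.com" for name in (
    "alpha", "beta", "charlie", "delta", "echo", "foxtrot", "golf",
    "hotel", "india", "juliett", "kilo", "lima", "mike", "november",
    "oscar", "papa", "quebec", "romeo", "sierra", "tango", "uniform",
    "victor", "whiskey", "xray", "yankee", "zulu",
)]

_position = 0  # in reality this would live in a file or DB record


def next_server():
    """Return the next server in the rotation and advance the position."""
    global _position
    server = SERVERS[_position % len(SERVERS)]
    _position += 1
    return server
```

Each call picks a server by the position modulo the number of servers, so after Zulu the rotation wraps back to Alpha.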
BUT, let's say this company has 1 billion users (really crazy, but let's go with it) and there's an immense number of requests on the initial page. Then editing a file would sometimes result in an error (cannot access the file while other programs are accessing it, etc.). The same goes for a database: at some point there would be way too many writes, and the server would lag just to keep that counter up to date.
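One common way around this read-modify-write contention (my assumption, not part of the original setup) is to keep the counter somewhere that supports an atomic increment, e.g. Redis's INCR command, or, within a single process, a counter guarded by a lock:

```python
import itertools
import threading

# Shortened hypothetical server list, just for the sketch.
SERVERS = ["alpha.blabla.com", "beta.blabla.com", "zulu.blabla.com"]

_counter = itertools.count()   # ever-increasing sequence of integers
_lock = threading.Lock()       # serialises access across threads


def next_server():
    """Atomically take the next counter value and map it to a server."""
    with _lock:
        n = next(_counter)
    return SERVERS[n % len(SERVERS)]
```

Alternatively, a stateless pick (a random choice, or a hash of the user ID modulo the number of servers) needs no shared writes at all, at the cost of only approximately even load rather than a strict rotation.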
I personally don't have anything to show you; it isn't homework, and it isn't a job. This is a small example from a small project I'm working on. It will serve no more than 1000 selected users. But I would still like to know if there's a way to solve this without any potential problems.