I've written a piece of software that connects to a server's SQL database.
The software has a timer that fires every second; on every tick it connects to the database and pulls the current to-dos from a table (CurrentToDoList). The table holds at most 100 rows, so a single query is not a huge demand on the server.
But now this software has been installed on several computers, each polling every second.
On top of that, there are other events that hit other databases on the same server, and those are more taxing.
So far these are rhetorical questions, but I'm wondering how much of this the server can take: how many computers can poll simultaneously before the server runs low on resources and can no longer keep up with the timer events?
So now the real question: what will happen if the server is too slow to deliver the results before the next timer event fires?
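(For context on the failure mode I'm worried about: depending on the timer implementation, ticks can queue up or run concurrently, so slow queries pile on top of each other. A common defensive pattern is to make the tick re-entrancy-safe and simply skip a tick while the previous one is still in flight. A minimal Python sketch of that idea, where `fetch_todo_list` is a hypothetical stand-in for the real query against CurrentToDoList:)

```python
import threading
import time

def fetch_todo_list():
    # Hypothetical placeholder for the real database query;
    # the sleep simulates a slow server response.
    time.sleep(0.2)
    return ["example task"]

poll_lock = threading.Lock()

def on_timer_tick():
    """One timer tick. Returns True if it polled, False if skipped."""
    # Non-blocking acquire: if the previous tick is still running,
    # skip this one instead of letting queries pile up on the server.
    if not poll_lock.acquire(blocking=False):
        return False
    try:
        todos = fetch_todo_list()
        return True
    finally:
        poll_lock.release()
```

With this guard, a slow server just means fewer polls per second rather than an ever-growing backlog of connections.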