Let's assume a couple of things.
I created a nice website on a server-side stack (Node.js, ASP, Java, C#, whatever). It works nicely. But I'd like to estimate how much hardware I'd need. If I run 1 hosting instance, it results in "this" amount of impact on the server (memory, processor cycles, networking, etc.). But that's 1 person on 1 rented server. I'd like to test what it would be like if 10.000 people were on each of 10.000 servers.
Is there any way to find that out before releasing it?
Primarily, I'd like to check that there are no "bumps" in performance. For example, 1 user requesting an expensive operation won't be noticeable, maybe just a small bump in the server's graphs. But if 10.000 people request the same resource, that seemingly "nothing-to-see-here" imperfection will turn into a nightmare.
Like I said: how can I profile what could possibly happen if 10.000 people landed on 10.000 different servers?
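What I'm describing is basically load testing, and I know real tools (Apache JMeter, ab, Locust) exist for doing it at scale. To make the idea concrete, here is a rough self-contained sketch in Python: a local stand-in endpoint (hypothetical, just for illustration) gets hit by N concurrent clients, and I look for the "bump" by comparing median vs. worst-case latency. All the names and numbers here are assumptions, not my actual site.

```python
import http.server
import statistics
import threading
import time
from urllib.request import urlopen

# Hypothetical stand-in for the real site: every GET does a
# small amount of "work" (sleep) before answering.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(0.01)              # simulate an expensive request
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):     # keep the test output quiet
        pass

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

# Fire N concurrent requests and record each one's latency.
N = 50
latencies = []
lock = threading.Lock()

def hit():
    t0 = time.perf_counter()
    urlopen(url).read()
    with lock:
        latencies.append(time.perf_counter() - t0)

threads = [threading.Thread(target=hit) for _ in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
server.shutdown()

# A large gap between median and worst-case latency is the
# kind of "bump" that only shows up under concurrency.
print(f"requests: {len(latencies)}")
print(f"median:   {statistics.median(latencies) * 1000:.1f} ms")
print(f"worst:    {max(latencies) * 1000:.1f} ms")
```

Obviously 50 threads on one machine is nothing like 10.000 users on 10.000 servers, so I assume the real answer involves distributed load generators, but is the measurement itself (per-request latency percentiles under concurrency) the right thing to be looking at?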
Edited 1 Week Ago by Aeonix