We'll be launching a mobile app (iOS and Android) soon. The app is native with a local SQLite database, and it's supposed to do scheduled database syncs with a live online MySQL database behind a PHP-based backend hosted on a HostGator dedicated machine.

The app's audience is expected to be 10,000+. Now, as a worst-case scenario, if all of these apps/clients start syncing at once, I think it'll be a disaster. It would be like 10,000 visitors accessing a site at once, and there may be an outage, or the server may even go down.

A single sync request doesn't take much; the data returned is only a few KB.

Regardless of mobile app, API, PHP, and MySQL optimizations, what should we optimize on the server to sustain this spike in load? Bandwidth? Uplink? Simultaneous HTTP connections? Simultaneous database users? What do you suggest, and how much of each, for this scenario?

You are only using one server for 10,000+ users? It is doable - we use about 5,000 servers to handle approximately 100M mobile browser users (20,000 per server), but we run multiple processes on each server (8+ cores, 64GB RAM each) to handle the load. Also, of those 100M active users, only about 20% are active at any one time, since they are spread all over the world. So, 20M users on 5,000 servers == 4,000 concurrent users per server. At 8 cores per server, that works out to about 500 users per core.
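That back-of-the-envelope math can be checked with a quick calculation (the numbers are the rough figures from this post, not measurements):

```python
# Rough capacity arithmetic using the approximate figures above.
total_users = 100_000_000      # ~100M mobile browser users
servers = 5_000
cores_per_server = 8
active_fraction = 0.20         # ~20% active at any one time

users_per_server = total_users / servers                        # 20,000
concurrent_users = total_users * active_fraction                # 20M
concurrent_per_server = concurrent_users / servers              # 4,000
concurrent_per_core = concurrent_per_server / cores_per_server  # 500

print(users_per_server, concurrent_per_server, concurrent_per_core)
```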

So, this is NOT a simple problem. Architecture (and redundancy) is everything! In any case, assuming your client base is not concentrated in a single geographic location, it is highly unlikely that all of them would try to sync at the same time.
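You can also make sure they don't all sync at once by adding random jitter to each client's scheduled sync time. Your app is native, but here is a minimal sketch of the idea in Python - the hourly interval and 15-minute jitter window are assumptions for illustration, not figures from this thread:

```python
import random

def next_sync_delay(base_interval_s: int = 3600, jitter_s: int = 900) -> int:
    """Seconds until the next sync: a fixed base interval plus a random
    offset, so 10,000 clients spread their requests over a 15-minute
    window instead of all hitting the server at the same instant."""
    return base_interval_s + random.randint(0, jitter_s)

delay = next_sync_delay()  # somewhere between 3600 and 4500 seconds
```

Each client schedules its next sync `delay` seconds out; on average the load is spread evenly across the jitter window.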

Another critical factor is the data load each syncing user will consume - how big are your network pipes to your peering partner? My guess is that you will saturate your internet connection LONG before you overload your server(s).
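As a rough illustration of the bandwidth point (assumed numbers: a 10 KB response per sync, and the whole 10,000-client fleet syncing within one minute):

```python
clients = 10_000
payload_kb = 10   # assumed ~10 KB per sync response ("a few KBs")
window_s = 60     # assume every client syncs within the same minute

total_bits = clients * payload_kb * 1024 * 8
mbps = total_bits / window_s / 1_000_000
print(f"~{mbps:.1f} Mbps sustained for that minute")
```

That comes out to roughly 14 Mbps, trivial for a dedicated server's uplink, which suggests the real risk is concurrent connection/process limits (Apache workers, PHP processes, MySQL connections), not raw bandwidth, for payloads this small.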
