I have a website that gets a lot of visitors a month, and the server cannot handle the traffic. I have a dedicated server. How can I run the site on multiple dedicated servers?

This is what load balancers and cache servers are for.

While I have not set anything up myself, it is my understanding that you have a few options, from using a service like Akamai to having a load balancer act as your handling server and pass requests on to servers that are not currently under load.

Whatever it is, you're gonna have to invest in infrastructure.

Congrats on the success, and good luck! :)

Thanks. Should I ask my hosting provider to do it for me, or should I do it myself? I contacted them and they told me that I need to do it myself, but it seems like a job they are responsible for. Don't you think?

Nope. Not at all their problem. It will become their problem if your site crushes their bandwidth, but your server will likely struggle long before they notice your site as a blip, unless you're running multicast video or high-bandwidth data.

You know... an alternative to your issue might just be optimization...

What is it your site does that makes you notice slowdowns? (If you don't mind me asking.)

If it's just serving up web pages, the volume of traffic you're getting has to be insane to slow you down.

I built the site myself using PHP as the server-side language. I know that there is a lot of processing going on, so now I am trying to optimize my pages for less processing. But I am expecting a lot of traffic in the following year or so, so I have to be ready.

It's unlikely that PHP is your bottleneck. Are you using a database? If so, I would start there and see if you have any particularly long-running processes that you can optimize.

Sometimes just changing a query from select * from ... to specific columns (select a, b, c from ...) can offer a performance boost, depending on table size and what you are returning.
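
For example (table and column names are made up here, and MySQL's EXPLAIN will show you how each version is actually executed):

    -- Pulls every column, including any large text/blob fields the page never uses
    SELECT * FROM posts WHERE user_id = 42;

    -- Only the columns the page actually needs
    SELECT id, title, created_at FROM posts WHERE user_id = 42;

    -- EXPLAIN shows the execution plan (indexes used, estimated rows examined)
    EXPLAIN SELECT id, title, created_at FROM posts WHERE user_id = 42;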

If you are not using a database, then yes - whatever you are processing server-side would need to be optimized - but I am still interested in what you would be doing to cause that much load on just PHP alone.

Anyway, best of luck! I hope I helped.

The site has a MySQL database. The database does not have a lot of tables, though some tables have a lot of records. Thanks for the advice; I will optimize my database. I think that's where the problem arises.

The site is about memes, by the way. I guess you know how much traffic these sites get. I just discovered that my site cannot handle more than 150 users at a time. I usually get about 190+.

I am still learning about load balancers and cache. Thanks for that too :)

Is that 150 per second?

If not, then you have something seriously weird happening that is taking a long time to complete. Are these persistent connections (like sockets)?

You also have to consider your hardware and bandwidth (or if there are caps from your provider).

So many things to consider that all come with a cost/benefit analysis...

If you are using a service like GoDaddy, they put hard limits on their VMs, even "dedicated" VM servers. Other services may as well.

Ultimately, if you are going to have this sort of sustained traffic, you may want to see if an AWS or Azure cluster would be more worth your time. That way you get load balancing and global scalability in one (it just comes with a cost that you will have to determine if it's worth it).

If you own the hardware and are serving it yourself, that's where you want to look into load balancers in front of a series of web servers, or Akamai or similar, to basically do what AWS/Azure would do for you. Otherwise, it might be as simple as paying a little extra to bump your bandwidth pipe to handle the data.

It all comes down to your particular need and how much you want to invest...

I was just learning about load balancers, and I found that a load balancer can be a piece of hardware, and it could be software too. So there are two options. What do you think is better?

By the way, I am hosting the site on HostGator.

And thank you very much ryantroop :)

I'd pay more attention to the slow query log and simple page stats from your logs than hunting down select * queries.

The slow query log will highlight all queries that take more than x time (configurable by you). Your logging system or analytics should tell you which pages take longest to render.
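
If you haven't enabled it yet, it only takes a couple of server variables - a minimal example, where the threshold and log path are values you'd choose yourself:

    -- Run in the MySQL client (put the equivalent in my.cnf if you want it to survive a restart)
    SET GLOBAL slow_query_log = 'ON';
    SET GLOBAL long_query_time = 1;   -- log anything slower than 1 second
    SET GLOBAL slow_query_log_file = '/var/log/mysql/slow.log';

    -- Verify the settings
    SHOW VARIABLES LIKE 'slow_query%';
    SHOW VARIABLES LIKE 'long_query_time';

Let it collect for a day or so and the repeat offenders become obvious.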

From my experience, it's likely that there will be a couple of pages that hammer the database. Either building up a complex dataset server-side by hammering the database with n+1 queries or doing a few big selects then manually manipulating the data.
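
To make the n+1 case concrete, here's a rough sketch in PHP/PDO with made-up table and column names:

    <?php
    // Hypothetical example: list 50 posts along with each post's author name.
    $pdo = new PDO('mysql:host=localhost;dbname=example', 'user', 'pass');

    // The n+1 pattern: 1 query for the posts, then 1 more query per post for its author.
    $posts = $pdo->query('SELECT id, title, user_id FROM posts ORDER BY id DESC LIMIT 50')->fetchAll();
    foreach ($posts as &$post) {
        $stmt = $pdo->prepare('SELECT name FROM users WHERE id = ?');
        $stmt->execute([$post['user_id']]);
        $post['author'] = $stmt->fetchColumn();   // 50 extra round trips to MySQL
    }
    unset($post);

    // The same data in a single query with a JOIN - one round trip instead of 51.
    $rows = $pdo->query(
        'SELECT p.id, p.title, u.name AS author
           FROM posts p
           JOIN users u ON u.id = p.user_id
          ORDER BY p.id DESC
          LIMIT 50'
    )->fetchAll();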

So, in short, find the slowest pages/actions. Make them more efficient and faster. Rinse, repeat.

I'm not sure what a "software load balancer" would do other than put up a cache state for repeat requests... but that's just because I can't conceptualize it.

However, the suggestions from everyone else about looking at your SQL and finding slow queries are spot on, and that is likely a good place to start in terms of optimization. Before you HAVE to spend money, see if you can just fix the stuff yourself :)

+1 pty - never knew MySQL had that available. Been living in a T-SQL world (with a profiler) for so long I never really had to look that up :)

150 users at a time on a server? I understood from your answers that your code is a mess, but 150 users at a time? First understand what the server offers: how many gigabytes of RAM it has, how you use them, how many CPU cores, etc. Then learn some basic programming techniques like server-side caching. If you do all that and you have more than 10,000 concurrent users on a good server, it is time to scale up. But with 150 concurrent users (making the same request in the same millisecond), it is time to learn some basics and to reconsider your dedicated hosting environment.
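
Server-side caching can be as simple as writing the rendered output of an expensive page to disk for a few minutes instead of rebuilding it on every request. A rough sketch in plain PHP - the cache path, the 300-second TTL, and renderFrontPage() are placeholders for whatever your site actually does:

    <?php
    // Return a cached string while it is still fresh, otherwise rebuild and store it.
    function cached(string $key, int $ttl, callable $build): string
    {
        $file = sys_get_temp_dir() . '/cache_' . md5($key);
        if (is_file($file) && (time() - filemtime($file)) < $ttl) {
            return file_get_contents($file);    // cache hit: no PHP processing, no MySQL queries
        }
        $value = $build();                      // cache miss: do the expensive work once
        file_put_contents($file, $value, LOCK_EX);
        return $value;
    }

    // Usage: the front page is rebuilt at most once every 300 seconds.
    echo cached('front_page', 300, function () {
        return renderFrontPage();               // hypothetical function that runs your queries and builds the HTML
    });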

Okay guys, thank you all for your responses. I am working on my queries and code right now and minimizing database requests. I want to be ready in case I get a lot of visitors, that's it. I tried searching on Google, but I could not find an answer ...

Do you think I can hire someone to do this for me on Upwork or any other freelancing website?

Because I contacted my hosting provider and they have no clue on how to do it, software or hardware wise.

TL;DR: Focus first on where your code is stalling requests. Fix those spots. If you can't possibly change your code, then invest in hardware. Set it up yourself or have it set up by someone you trust / hire / contract (and after they do it, learn as much as you can to support it yourself, in case they leave, do something wrong, or you ever need to scale again).

Anything is possible. However, once you set up an infrastructure and investment, do you really want some random person you paid a minimum bid to be the one responsible for controlling your network? IMO that's asking for trouble - especially if your site does become very successful and you have to scale, and you are then faced with trying to find someone who has the same frame of mind and expertise as the person who set it up, as well as taking on all the security risks associated with random implementations.

A single VPS/dedicated server should not be limited to 150 - 200 concurrent requests/sec. The "theoretical" hard limit is ~65,000 open TCP connections (which is what HTTP requests are based on), and seeing how those should be opened and closed as fast as possible, you should be able to handle the kind of traffic you are seeing. If you aren't, there is likely a code bottleneck somewhere - and when you find it and alleviate it, you can then look at how to best start scaling upward and outward.

(NOTE / DISCLAIMER: I'm not saying you should be able to handle 65,000 requests per second. However, you have that many resources "available", and it all comes down to your hardware, your code, and how fast you process requests for data. Realistically, if you are processing 150 requests per second you should be "fine" with "modern" hardware, assuming your requests take milliseconds to complete. Of course, based on your server configs, you may limit yourself to something like 100 requests/sec, which is an artificial boundary (and queue) to keep resource consumption minimized and allow the machine to do other things than just handle requests. If your requests are taking longer than 1s to complete, I would question what you are doing and see if there is a way to optimize those requests, or create some sort of data cache to limit the impact of those long-running requests.)

If you have a process that is hogging the connection - such as a flow like: http req > process some long-running data crunch > output - then you may want to look into asynchronous calls on your front end and let the server manage the workload and return when it's done. In this scenario, if the server is locking up due to these requests, you need to let your code handle concurrent requests to shared resources (look into spawning new processes or threads to handle new requests, etc.). If, for any reason, that is not possible, then you have to start looking at hardware solutions to your problem.
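
One plain-PHP way to do that (no framework assumed) is a tiny job queue: the web request only records the job and returns, and a separate worker script does the heavy crunching. This is just a sketch - the jobs table, crunchData(), and the file names are made up, and it assumes a single worker (mark jobs as 'running' first if you ever run more than one):

    <?php
    // enqueue.php - the web request records the work and returns immediately.
    $pdo = new PDO('mysql:host=localhost;dbname=example', 'user', 'pass');
    $stmt = $pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')");
    $stmt->execute([json_encode($_POST)]);
    echo json_encode(['job_id' => $pdo->lastInsertId()]);   // the front end can poll a status page with this id

    <?php
    // worker.php - run from cron or a shell loop, never from a web request.
    $pdo = new PDO('mysql:host=localhost;dbname=example', 'user', 'pass');
    while (true) {
        $job = $pdo->query("SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch();
        if (!$job) { sleep(5); continue; }                  // nothing queued, wait and check again
        crunchData(json_decode($job['payload'], true));     // hypothetical long-running work
        $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
    }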

However, if you are making persistent connections (aka websockets), then you may need to consider how you are scaling out with those connections - instead of a single entry point, break processes apart by need/purpose, and consider whether you can handle certain requests differently. This is the most likely scenario for needing a hardware upgrade, as a persistent connection literally hogs a resource and holds on to it until released. Finite resources mean a requirement to expand.

In my opinion, throwing hardware at a problem should be the last resort. It's expensive in terms of up-front cost, and there is likely a programmatic way to optimize for the existing hardware - until there isn't (sometimes, crunching lots of data just takes time). Of course, YMMV, and others may want to chime in with their experience, as there are some people who have been at this stuff longer than I've been alive :-P

Hope that helps,

Ryan

Got it :)

Thank you guys for your responses, DaniWeb users are really helpful :)

Farris
