Everybody knows that running a forum is both easy and hard.

The easy part is installing the software and seeing it work in your browser. Your database starts small and grows slowly over time as you set up your board's categories and forums, gain more members, and they start to post.

The hard part is managing the board, your members, and their posts, and keeping it active by providing content and optimizing your source code, features, options, and settings.

The database can grow to 100 MB for a small site, or beyond 1000 MB for bigger sites. Once you hit a million posts, you can be pretty sure your database is getting quite big, especially because of the attachment table.

So, how are you doing? What is your database size, and how are you handling it? Are you still able to make a backup through the browser, or are you running into errors and need phpMyAdmin with gzip turned on, or even mysqldump to do it properly?
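For anyone in the same boat: a command-line dump piped through gzip is the usual fallback once browser backups start failing. A minimal sketch, assuming shell access; the user and database names below are placeholders:

# --single-transaction takes a consistent InnoDB snapshot without locking tables
mysqldump --single-transaction -u forumuser -p forumdb | gzip > forumdb_backup.sql.gz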

The vBulletin Fan website is nearing 100,000 posts, the database just hit 200 MB, and attachments are currently stored inside the database.

My site has around 400,000 posts and 1,620+ members. The database is 60 MB with gzip (220 MB without it, using Invision Power Board), and that is just the forum database; the CMS is around 50 MB. I used to back up the database through the admin CP, but now it's practically impossible, as more often than not Firefox crashes, so now I use phpMyAdmin. Attachments are stored inside the database. As far as speed is concerned, it is pretty decent a lot of the time, but goes down considerably when the number of online users at one time goes above 200.

Hey Scribbller. May I ask which site you run? The forum you have in your signature only has 65,000 posts. 400,000 is quite the big accomplishment! According to the vBulletin admin panel, the data usage is 320 megs, with 60 megs of attachments, 3 megs of profile pictures, and another 3 megs of avatars.

mysqldump (which I use religiously, set up in a cron job four times a day because you can never back up often enough) currently creates a 450 meg file. Web-based backups stopped working for me a long time ago.
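For reference, the crontab entry looks roughly like this; the paths, database name, and credentials file are placeholders rather than my real setup:

# runs at 00:00, 06:00, 12:00 and 18:00 every day (% has to be escaped inside crontab)
0 */6 * * * mysqldump --defaults-extra-file=/home/user/.my.cnf forumdb | gzip > /backups/forumdb_$(date +\%Y\%m\%d_\%H\%M).sql.gz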

Does your cron script only back up, or also archive?
I mean, there's no use in having a backup of a hacked forum or corrupted database.
The backups I make are done like this:

siteurl_timestamps_dbname.gz

The 5th dump (the oldest) gets removed if the latest dump is successful.

It backs up every 4 hours, and archives a day's worth of backups.
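In sketch form it works like this; the names and paths are placeholders, and credentials handling is left out for brevity:

#!/bin/bash
set -o pipefail    # so a failed mysqldump is not masked by gzip succeeding
DB=dbname; SITE=siteurl; DIR=/backups    # placeholders
STAMP=$(date +%Y%m%d_%H%M)
OUT="$DIR/${SITE}_${STAMP}_${DB}.gz"
if mysqldump "$DB" | gzip > "$OUT"; then
    # new dump succeeded: prune everything beyond the five newest
    ls -1t "$DIR"/${SITE}_*_"$DB".gz | tail -n +6 | xargs -r rm
else
    # dump failed: discard the bad file and keep the old dumps
    rm -f "$OUT"
fi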

Nice :)
Glad to hear you won't be in trouble.
Backups can be an issue a year from now, when your site has doubled!

Although I do not use the same software, I have similar solutions.

A cron job tars/gzips the whole web site and dumps the SQL database into a file, which also gets tar/gzipped, once a day.
I periodically download both files (the web tar.gz and the db tar.gz) to my local machine.

Once the site is big enough, I will increase the frequency of backups and FTP the files to a remote location right after the cron job runs. Hopefully this will help me recover after crashes, whatever the reason. I will use the grandfather/father/son rotation.
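The plan in sketch form, with every path and name a placeholder for illustration:

#!/bin/bash
STAMP=$(date +%Y%m%d)
tar czf /backups/site_$STAMP.tar.gz /var/www/site       # whole web site
mysqldump forumdb | gzip > /backups/db_$STAMP.sql.gz    # database dump
# grandfather/father/son: Sunday's copy becomes weekly, the 1st becomes monthly
[ "$(date +%u)" = 7 ] && cp /backups/db_$STAMP.sql.gz /backups/weekly/
[ "$(date +%d)" = 01 ] && cp /backups/db_$STAMP.sql.gz /backups/monthly/
find /backups -maxdepth 1 -name 'db_*.sql.gz' -mtime +7 -delete    # prune old dailies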

Sounds like a good plan to me :)
