Hi,

I'm working on an application similar to HotSaNIC, Munin, or Cacti (a system and network graphing tool).
I want to store the data in PostgreSQL.

I have a group of servers and a set of plugins. A different subset of plugins runs on each server.
Every plugin periodically returns two numbers: a time and a value. (For example, at 9:00:05 disk space is 80% free.)

The database will be very large, with lots of inserts:
number of servers: 200, number of plugins: 200
values for one server and one plugin over a few years: about 500,000
(for the last 14 days, values are stored every 5 seconds)
total: about 20,000,000,000 values

The data are read-only once written.
What is the best way to store this data? I am mainly asking about the data model.
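
Roughly, I am thinking of one flat table like this (the names are only a sketch):

    CREATE TABLE measurements (
        server_id integer          NOT NULL,  -- which of the ~200 servers
        plugin_id integer          NOT NULL,  -- which of the ~200 plugins
        ts        timestamptz      NOT NULL,  -- when the sample was taken
        value     double precision NOT NULL   -- the sampled value
    );

But I am not sure one table can reasonably hold 20,000,000,000 rows.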

Thank you for any help.
I'm sorry for my English.

Best regards
Marek Fiala

I recommend daily tables. For applications like these, I use a template table called something like lotsofdata_template and individual per-day tables called lotsofdata_YYYYMMDD, e.g. lotsofdata_20080729 for today. It has a cost in the complexity of the code that deals with the tables, since if you're concerned about more than one day at a time you have to query several tables. But it keeps individual table sizes at least somewhat manageable, and it makes the process of archiving data that you no longer need to have actively available infinitely simpler and less prone to catastrophe.
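
A minimal sketch of what that looks like in SQL (table and column names are just placeholders, adapt them to your data):

    -- Template table: holds no data itself; it only defines the shape.
    CREATE TABLE lotsofdata_template (
        server_id integer          NOT NULL,
        plugin_id integer          NOT NULL,
        ts        timestamptz      NOT NULL,
        value     double precision NOT NULL
    );

    -- One table per day, copying the template's column definitions.
    CREATE TABLE lotsofdata_20080729 (LIKE lotsofdata_template);
    CREATE INDEX lotsofdata_20080729_idx
        ON lotsofdata_20080729 (server_id, plugin_id, ts);

    -- A query spanning more than one day has to touch several tables:
    SELECT ts, value FROM lotsofdata_20080728
        WHERE server_id = 1 AND plugin_id = 7
    UNION ALL
    SELECT ts, value FROM lotsofdata_20080729
        WHERE server_id = 1 AND plugin_id = 7
    ORDER BY ts;

    -- Archiving a day is trivial: dump the table, then drop it.
    DROP TABLE lotsofdata_20080713;

Dropping a whole table is effectively instantaneous, whereas deleting a day's worth of rows out of one giant table would churn through millions of rows and leave bloat behind.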
