I am running a PHP-based intranet site where the main page has to refresh its view every minute in order to show the most up-to-date information from an MSSQL database. At any one time there are only about 20 entries that need to be shown, but the main table contains thousands of historical records. My question (and I apologise if it is a daft one) is: is it OK to keep querying the full table like this, or should I have a separate table that contains only the active entries, then transfer them to the main table when they are done with?

I currently use the latter solution, but I want to redesign the system using the most appropriate approach.

Thanks for your help.

I don't do MSSQL, but for other databases it is perfectly OK to keep all the data in one (set of) table(s) and simply query for the most recent N rows (or for all rows within a time range, or however you prefer), as long as there is an index on the timestamp column. This will work nicely up to about a million rows or so, after which you may want to partition the table(s) based on the timestamp.

I don't know whether MSSQL can do that partitioning on the fly (Postgres can, for instance); but if not, you can take the database down periodically (Sunday morning?) and move rows out of the current table into the history table. That should take only a few seconds if you do it often enough. Be careful to keep foreign keys correct if you are using them.
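As a sketch of the single-table approach, assuming a table named `entries` with a timestamp column `created_at` (both names are illustrative, not from the original post), the T-SQL would look roughly like:

```sql
-- Index the timestamp column so the per-minute query can seek
-- straight to the newest rows instead of scanning history.
CREATE INDEX IX_entries_created_at ON entries (created_at);

-- Fetch only the ~20 most recent entries; with the index above,
-- this stays cheap even as historical rows accumulate.
SELECT TOP 20 *
FROM entries
ORDER BY created_at DESC;
```

If "active" is a status rather than simply "most recent", the same idea applies: index the status (or status plus timestamp) column and filter on it, rather than maintaining a separate active table.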

Thank you very much for your reply. I think I will redesign it as you suggest.
Happy new year to you!