I have an app that reads data from a database, and that data is updated by a separate application. At the moment the first app has to poll every x seconds to see whether there is any new data. I seem to remember reading a while ago about subscribing to data changes, so that instead of the app pulling from SQL Server, SQL Server notifies the app that new data is available and the app can then go and fetch it, rather than having to check for itself.
Is that right, or did I dream that?
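(For context: what I think I'm remembering is SQL Server's Query Notifications feature, exposed in ADO.NET as `SqlDependency`. The gist of the pattern, as I understand it, is just "block until notified, then re-query" instead of "wake up and check". A toy, language-neutral sketch of that idea — all names here are made up for illustration, this is not the real SQL Server API:)

```python
import queue
import threading
import time

# Toy illustration of push vs. poll. In SQL Server the real mechanism is
# Query Notifications (SqlDependency in ADO.NET), which raises an OnChange
# event for you; here a plain queue stands in for that notification channel.
notifications = queue.Queue()

def writer():
    # Stands in for the second application updating the database.
    time.sleep(0.1)
    notifications.put("rows changed")  # the server-side "notify" step

def reader():
    # Instead of waking every x seconds to poll, block until notified,
    # then go and fetch the new data.
    event = notifications.get()        # wakes only when there is a change
    return f"re-querying because: {event}"

t = threading.Thread(target=writer)
t.start()
result = reader()
t.join()
print(result)
```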