We had an issue with our load balancer yesterday that caused one of our six web servers to return 404 responses for every request.

Obviously traffic was way down during the outage, but there was no significant recovery after the outage ended. I'm worried that googlebot came to crawl us and our servers essentially said, "Nope, sorry, those pages don't exist here," and googlebot, with no reason to think our servers were lying, deindexed us and/or stopped sending traffic to any page it got a 404 from.
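One way to confirm whether Googlebot actually received 404s during the window is to check the access logs directly rather than waiting on GWT. A minimal sketch, assuming the common Apache/nginx combined log format (the log path, field layout, and sample entries here are illustrative, not from the actual site):

```python
import re

# Sample lines in combined log format. Real logs would be read from a
# file such as the server's access log; these entries are made up.
SAMPLE_LOG = """\
66.249.66.1 - - [12/Mar/2013:06:25:24 +0000] "GET /products/widget HTTP/1.1" 404 162 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [12/Mar/2013:06:25:30 +0000] "GET /products/widget HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows NT 6.1)"
66.249.66.1 - - [12/Mar/2013:06:26:02 +0000] "GET /about HTTP/1.1" 404 162 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

# Combined format: the request in quotes, then status, then size, then
# referer and user-agent in quotes.
LINE_RE = re.compile(
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_404s(log_text):
    """Return the requests for which a Googlebot user-agent got a 404."""
    hits = []
    for line in log_text.splitlines():
        m = LINE_RE.search(line)
        if m and m.group("status") == "404" and "Googlebot" in m.group("agent"):
            hits.append(m.group("request"))
    return hits

for req in googlebot_404s(SAMPLE_LOG):
    print(req)
```

That at least tells you how many URLs Googlebot saw 404 during the outage, and which ones, so you know the scale of what needs re-crawling. (Strictly, the user-agent string can be spoofed; a reverse-DNS check on the IP is the thorough version.)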

What can I do to remedy the situation before it escalates further?

Are you not able to get this sort of information from Webmaster Tools? Perhaps request an increase to your crawl rate for a few days?

GWT is always a few days behind, so it hasn't shown up there yet.

The problem with crawl rate is that, like it or not, Googlebot received mixed signals from the site, so I suspect it will be less likely to "trust" the site moving forward.
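For what it's worth, the usual way to avoid sending the wrong signal in the first place is to have the load balancer return a 503 (Service Unavailable) rather than letting a dead backend produce 404s; Google documents 503 as "temporary, come back later," whereas 404 means "this page is gone." A rough sketch of the idea only (the function and its arguments are hypothetical, not a real load-balancer config):

```python
def outage_response(path, healthy_backends):
    """Decide what to serve when no backend can handle the request.

    A 503 with Retry-After tells crawlers the content still exists and
    the outage is temporary; a 404 would tell them the page is gone.
    """
    if not healthy_backends:
        return 503, {"Retry-After": "3600"}, "Temporarily unavailable"
    # Normal routing to a healthy backend would happen here.
    return 200, {}, "OK"
```

Whether you can wire that into your particular load balancer depends on the product, but most of them support a custom error page or status for the no-healthy-backends case.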

In GWT, if you wait for the page errors to show up, you can mark them all as fixed and Google will automatically re-crawl the site. This should clear the errors ^^