I am not always a fan of removing content from my sites once the pages have been spidered. If the content is thin, then improve it. Add layers of complexity to a page.
You can also start by investigating the Google PR of each of your pages. Notice which pages are PR0 or not even ranked, and start with those. What are some creative things you could do to improve each of those pages' value?
Have unique content, and if there are any spammy links pointing at your website, contact the webmaster of that particular site; if they don't reply, disavow the links. Don't do link building on low-quality sites.
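For anyone who hasn't used the disavow tool: the file you upload to Google Search Console is just plain text, one entry per line. A `domain:` prefix disavows a whole site, a bare URL disavows a single page, and lines starting with `#` are comments. The domains and URLs below are placeholders, obviously:

```
# Spammy directory; webmaster never replied to removal requests
domain:spamdirectory.example

# One specific bad page rather than the whole site
http://bad-blog.example/low-quality-guest-post.html
```

Save it as a .txt file and upload it through the disavow links tool in Search Console.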
You need to know that Google Panda is now a domain-level penalty. What I mean is, earlier if you had low-quality content, only that content was penalized; now, if you have low-quality content, your complete domain gets penalized. One of the simplest ways to get rid of Panda is to get rid of low-quality content, grow site traffic by legitimate means, and improve your SEO.
Google Panda has been a domain-level penalty ever since it came out in February 2011, so that's not really anything new with Panda 4.0.
But, yeah, like Albert says, get rid of low quality content. The problem, as it's been explained recently elseweb, is that Panda aims to kill off thin content and duplicate content. Duplicate content is easy to understand. Google's definition of thin content, unfortunately, not so much. For example, what specifically constitutes thin content? Where's the line drawn in the sand? How do you know if your web pages are culprits?
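Since Google doesn't publish a threshold, one rough first-pass audit is to crawl your own pages and flag anything with a low visible word count. This is just a heuristic sketch, not anything Google has confirmed: the 300-word cutoff and the sample pages below are my own assumptions, and word count alone won't catch every kind of thin page.

```python
# Rough thin-content audit: flag pages whose visible word count is low.
# The 300-word threshold is an illustrative assumption, not a Google rule.
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)


def visible_word_count(html: str) -> int:
    """Count whitespace-separated words in the page's visible text."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())


def flag_thin(pages: dict, threshold: int = 300) -> list:
    """Return URLs whose visible word count falls below the threshold."""
    return [url for url, html in pages.items()
            if visible_word_count(html) < threshold]


if __name__ == "__main__":
    sample = {
        "/meaty-page": "<html><body>" + "word " * 500 + "</body></html>",
        "/thin-page": "<html><script>var x = 1;</script>"
                      "<body>just a few words</body></html>",
    }
    print(flag_thin(sample))  # -> ['/thin-page']
```

In practice you'd feed it real HTML fetched from your sitemap URLs, then manually review whatever gets flagged rather than deleting it outright, per the earlier point about improving thin pages instead of removing them.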