
Measuring and Reporting

The news constantly bombards us with stories of security disasters: destructive viruses, armies of hackers (apologies to those who know the difference between hackers and crackers, but the popular press has settled on its definition), and worse. There was a recent "Cyber-War" between China and the USA (did you notice?). It would be easy to think that the Information Society is teetering on the brink of total failure under all these attacks.

However, to plan our information security correctly, we need something very different from sensationalised news reports. We have to quantify the various risks and threats in order to prioritise them for action. In many cases the consequences of an incident are easy to evaluate: a given vulnerability could expose this information; a given virus could cause that much damage.

The likelihoods are a lot harder to quantify. How likely is it that your organisation's web page will be defaced? If another "Cyber-War" is reported, will that change? Should you immediately put your security staff on overtime in response? Which will cost your organisation more: the latest big-name virus, or the one that has been spreading slowly but steadily, persistently causing small amounts of damage for months?
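To make that last question concrete, the standard annualised loss expectancy (ALE) calculation multiplies the expected cost of a single incident by the expected number of incidents per year. A minimal sketch, with all figures invented purely for illustration:

```python
# Annualised loss expectancy: ALE = SLE * ARO, where SLE is the
# single loss expectancy (cost of one incident) and ARO is the
# annualised rate of occurrence. All figures below are hypothetical.

def ale(single_loss_expectancy: float, annual_rate_of_occurrence: float) -> float:
    """Expected yearly loss from one class of incident."""
    return single_loss_expectancy * annual_rate_of_occurrence

# A headline virus: expensive to clean up, but expected rarely.
headline = ale(50_000, 0.2)   # roughly one hit every five years

# A quiet, persistent virus: cheap per incident, but hits monthly.
persistent = ale(1_500, 12)

print(f"headline virus ALE:   {headline:,.0f}")    # 10,000
print(f"persistent virus ALE: {persistent:,.0f}")  # 18,000
```

On these invented numbers, the unglamorous, persistent virus represents the larger annual loss, which is exactly the kind of result a headline-driven assessment would miss.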

I suspect that the "Cyber-War" between US and Chinese hackers was little more than a media event. Certainly, web pages were being defaced, but many carried no political message. Even when the message was political, did the culprit act because of the issue, or was the theme chosen simply because it was topical and likely to generate maximum publicity for the defacement? May was supposed to be the high point of the war, yet Attrition, a website specialising in statistics on website defacements, shows that the total for April was higher.

Similarly, W32.Leave.Worm gained recent publicity, but it is the older W32/Magistr@mm that I currently receive most often in email. W32/Magistr@mm also has destructive activation routines.

Obviously we cannot rely on the popular media for our risk assessments; we need hard data. Many Internet sites publish such information. I have already mentioned Attrition; for a more mainstream perspective, many CERT teams summarise their reports, including, of course, the CERT Coordination Center (http://www.cert.org/nav/index_red.html). Information on virus spread is available from the WildList Organisation.

These sites depend on reports from organisations like yours. To quote from the CERT/CC site: "Because users are our primary source of information, we encourage you to report any incidents you experience on your systems or any vulnerabilities you find. These reports will help us inform you and others about potential threats and ways to avoid or recover from them."

However, many information security incidents go unreported. A striking example was the virus W95/CIH (also known as Chernobyl), which activated on 26 April 1999, causing damage on hundreds of thousands of PCs, largely in Asia. After the activation, the author, Chen Ing-Hau, was quickly identified by a Taiwanese college and questioned by military authorities. However, he was then released because, incredibly, no complaints had been made to the police in Taiwan. It is hard to believe that absolutely no computers in Taiwan were affected when nearby Korea recorded 160,000 activations, according to the Korean Information Security Agency, and Mainland China suffered 250,000 according to some newspapers. Thankfully, the following year a Taiwanese college student filed charges when his machine was struck on the next anniversary, 26 April 2000. Chen Ing-Hau was arrested and, if found guilty, could face up to three years in jail on charges of destruction.

Therefore, good reporting from organisations like yours benefits information security in general. It lets us understand the size and nature of the problem, making realistic risk assessments possible, and, in some cases, it allows action to be taken against the culprits. The same applies within an organisation: what are your reporting mechanisms for information security incidents, and are they adequate? If you cannot measure it, you cannot manage it.

If reporting is so important, why do so many incidents go unreported? Some possible reasons are:

The last is the reason that some organisations have a policy against making reports; however, organisations such as the CERTs and the WildList Organisation keep victims' details confidential.
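Within an organisation, the measurement point above needs very little machinery: a structured incident log that can be counted and costed. A minimal sketch in Python; the fields, category labels, and figures are my own invented examples, not a standard:

```python
from collections import Counter
from dataclasses import dataclass
from datetime import date

@dataclass
class Incident:
    when: date
    category: str        # e.g. "virus", "defacement" (illustrative labels)
    estimated_cost: int  # clean-up cost, in whatever currency you use

# A hypothetical quarter's worth of internal reports.
log = [
    Incident(date(2001, 4, 3), "virus", 400),
    Incident(date(2001, 5, 12), "virus", 800),
    Incident(date(2001, 6, 20), "defacement", 5_000),
]

# Counting turns anecdotes into numbers you can prioritise by.
count_by_category = Counter(i.category for i in log)
cost_by_category: Counter = Counter()
for i in log:
    cost_by_category[i.category] += i.estimated_cost

print(dict(count_by_category))  # {'virus': 2, 'defacement': 1}
print(dict(cost_by_category))   # {'virus': 1200, 'defacement': 5000}
```

Even a log this simple answers the management questions: which incident types recur, and which actually cost the most.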

To be useful, the report must also be made to a relevant party - a police officer told me of a user reporting an email bomb to his local police station, and the desk sergeant called the bomb squad.

There are ways to collect these reports while avoiding these limitations. Two sites that give frequently updated statistics and do not depend on user reports are MessageLabs and Trend Micro's World Virus Tracking Center, both vendor sites (disclaimer: my company has a business relationship with both companies). Trend's site displays statistics for viruses found by their free on-line virus scanner and by their centrally managed anti-virus solution. In some ways MessageLabs' statistics have a wider base: they record viruses sent in email to or from their customers, which gives an indication of what is happening not just on their customers' systems but on the systems of anyone who emails those customers. However, this still does not give the full picture; there is an inherent bias towards viruses that have an email replication mechanism. A worm that spreads by other methods, such as Sadmind (which propagates on Solaris systems), will not be counted at all.

Sadmind also defaces websites on vulnerable IIS servers, so your web server running on NT is more likely to be attacked because of a worm spreading on Solaris. This illustrates how complex the factors influencing real-world risk can be, well beyond the neat assumptions of the typical risk assessment.

Our current position is less than ideal: incidents go unreported; the various data-collection methods introduce their own biases; and the resulting figures are not comparable between summaries. The sites do agree on one thing: the trend is up. But the only way we will get better information is by making better reports, more consistently.