The Computer Virus Continuum
A Brief History

As the personal computer gained prominence in the early 1980s, so did the computer virus. Several theories exist as to the exact date and origin of the first virus to appear in the wild.1 By the latter part of the decade, several had earned a place in virus history, most notably the Brain virus (1986) and Michelangelo (1991). Today, over 40,000 computer viruses are in existence. With such proliferation, the lines have begun to blur as to what constitutes a computer virus. To clarify the distinction, a brief description of the types of threats follows:
As with medical viruses, there is no "cure" for computer viruses; the key lies in prevention. Because the nature of viruses, their means of transmission, and their frequency are constantly changing, preventative methods must be equally dynamic. For example, it was once adequate protection simply to search for a specific sequence of code, called a signature, to detect a virus. Anti-virus firms routinely updated signature files, and users updated their systems to include the newest definitions. Virus engineers, however, began creating viruses that mutated when replicating, some of them encrypted, which created a need to ship decryption algorithms alongside the signature files. While the anti-virus vendors busied themselves researching and combating encrypted viruses, the virus engineers were equally busy creating a new breed of polymorphic viruses, which randomized the encryption routines themselves. Against today's sophisticated virus threat, anti-virus detection must therefore include the dynamic ability to detect even unknown behavior. No such parallel has been achieved in the medical community, where only viruses with known patterns of infection can be prevented. Indeed, even in the computer arena, few companies successfully achieve a high standard of detection against mutating viruses, much less defend against newly created ones.

The Threat Today
First, we must understand that we are entering a new era of computer viruses. Initially spread via floppy diskette, virus infections did not become widespread easily; a year might pass before a virus could be considered prevalent. In the 1990s, macro viruses were introduced. These exploited Microsoft Word and spread with relative ease through shared documents, yet still took a month or two to achieve significant prevalence. The next generation of viruses, as seen with the Melissa virus, exploits the connectivity of the Internet and needs only a few days to establish widespread infection. Anti-virus firms cannot possibly manage this threat using standard signature scanning for detection. With a vehicle as fast as the Internet, signature-based scanning is akin to placing a band-aid on a severe wound: it simply cannot stop the flow of infection. Worse, the method is only useful for viruses already known to the anti-virus industry, and Melissa was a brand-new virus. To counteract threats in this sophisticated arena, the anti-virus engine must be able to make intelligent decisions about the behavior of a file. This element, called heuristic scanning, is key to successful detection. Not all heuristic scanning is created equal: both Network Associates' McAfee VirusScan and Symantec's Norton AntiVirus provide heuristic scanning, yet both products failed to detect the Melissa virus. Even with signature files updated to detect Melissa after her initial occurrence, subsequent variants of the same virus went undetected by these scanners. In fact, only one commercially available scanner detected the Melissa virus based solely on powerful heuristics - Command AntiVirus.
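The contrast between signature scanning and heuristic scanning can be sketched in a few lines of Python. Everything below is invented for illustration: the byte signature, the "Example.Virus.A" name, and the macro-behavior markers are hypothetical stand-ins, not real virus data or any vendor's actual engine.

```python
# Toy contrast between signature scanning and heuristic scanning.
# All signatures, rules, and samples here are hypothetical.

KNOWN_SIGNATURES = {
    b"\xde\xad\xbe\xef": "Example.Virus.A",  # invented byte pattern
}

# Heuristic rules: behaviors commonly abused by macro viruses,
# detectable even when no known signature matches.
SUSPICIOUS_MARKERS = [b"AutoOpen", b"CreateObject", b"SendMail"]

def signature_scan(data: bytes):
    """Return the name of a known virus whose signature appears, else None."""
    for sig, name in KNOWN_SIGNATURES.items():
        if sig in data:
            return name
    return None

def heuristic_scan(data: bytes, threshold: int = 2) -> bool:
    """Flag a file as suspicious if enough risky markers co-occur."""
    hits = [m for m in SUSPICIOUS_MARKERS if m in data]
    return len(hits) >= threshold

# A brand-new macro virus: no signature exists yet, but behavior gives it away.
new_variant = b"Sub AutoOpen() ... CreateObject(...) ... SendMail ..."
print(signature_scan(new_variant))   # None - signature scanning misses it
print(heuristic_scan(new_variant))   # True - the heuristic still flags it
```

The point of the sketch is the asymmetry: the signature scanner can only ever recognize what is already in its table, while the behavior rules catch the unknown variant the moment it exhibits the suspicious combination.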
Prevention is the Key

The Future

According to the ICSA survey,2 all indications point to a worsening of the virus epidemic. The current 40,000-plus viruses are projected not only to double to 80,000 within a year, but more sophisticated virus engineering techniques and expedient delivery via the Internet could spell catastrophe for improperly protected systems. Assuming infection incidents increase proportionately, the median infection rate would rise to 176 per 1,000 systems per month; stated differently, administrators could face 18% of their systems infected monthly within the next year. Without proper detection, within two years the rate would rise to a crippling 36%. Even at the lower projected cost of $1,750 per incident, the expense is so prohibitive that it could force a highly infected enterprise out of business entirely. Current market estimates place 89 million computers worldwide. Of these, fewer than 40% use anti-virus protection, despite the fact that, historically, anti-virus software places in the top five retail best-seller lists. Thus, 60% of the population will continue to unknowingly host and transmit viruses well into the future. Indeed, this exemplifies another parallel with the medical counterpart: as long as a host is available, viruses will continue to proliferate and exploit the first weakness. Richard Pethia, director of the CERT Coordination Center, summarized his testimony before Congress by stating: "Melissa represents a new form of virus that demonstrates how quickly an infection can spread across a network and hints at the kind of damage that could be done. Incident response organizations were able to limit Melissa's damage by working effectively together to analyze the problem, synthesize solutions, and alert the community to the need to take corrective action. With possible future viruses, it may not be possible to act as quickly or effectively.
Response organizations will always have a role to play in identifying new threats and dealing with unprecedented problems, but response methods will not be able to react at Internet speeds with complicated viruses or with multiple, simultaneous attacks of different types."

"The long-term solutions to the problems represented by Melissa will require fundamental changes to the way technology is developed, packaged, and used. It is critical that system operators and product developers recognize that their systems and products are now operating in hostile environments. Operators must demand, and developers must produce, products that are fit for use in this environment. As new forms of attack are identified and understood, developers must change their designs to protect systems and networks from these kinds of attack."

1 An "in the wild" virus is one that has been reported by multiple parties to have unknowingly infected their systems. It is differentiated from a zoo or laboratory virus, which is contained for research purposes and may, for that purpose, be used to deliberately infect a particular system.

2 ICSA Labs 1999 Computer Virus Prevalence Survey
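The one-year projection in "The Future" above can be checked with a few lines of arithmetic. This is a sketch of the article's own figures, assuming the implied current median is half the projected 176 infections per 1,000 systems; the 1,000-machine fleet is a hypothetical example, not a survey number.

```python
# Checking the arithmetic behind the survey projection quoted above:
# viruses doubling from 40,000 to 80,000, with infection incidents
# assumed to scale proportionately.

current_rate = 88                # implied current median: 176 / 2,
                                 # per 1,000 systems per month
projected_rate = current_rate * 2          # 176 per 1,000 systems per month
monthly_fraction = projected_rate / 1000   # 0.176, roughly the quoted 18%

cost_per_incident = 1750         # the lower projected cost figure in the text
fleet = 1000                     # hypothetical 1,000-machine enterprise
monthly_cost = fleet * monthly_fraction * cost_per_incident

print(f"{monthly_fraction:.1%} of systems infected monthly")  # 17.6%
print(f"${monthly_cost:,.0f} per month for 1,000 machines")   # $308,000
```

At roughly $308,000 per month for a 1,000-machine site, the text's claim that infection costs could drive an unprotected enterprise out of business follows directly from its own numbers.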