In the past decade, computer and networking technology has seen enormous growth. This growth, however, has not come without a price. With the advent of the so-called "Information Highway," a new methodology of crime has been created. Electronic crime has been responsible for some of the most financially devastating victimizations in society.

In the recent past, society has seen the malicious editing of the Justice Department web page (1), unauthorized access to classified government computer files, phone card and credit card fraud, and electronic embezzlement. All of these crimes are committed in the name of "free speech." This new breed of criminals claims that information should not be suppressed or protected, and that the crimes they commit are not really crimes at all. What they choose to deny is that their actions are slowly consuming the fabric of our country's moral and ethical trust in the information age. Federal law enforcement agencies, as well as commercial computer companies, have been scrambling to "educate" the public on how to prevent computer crime from happening to them. They inform us whenever there is an attack and provide us with largely ineffective anti-virus software, and we are left feeling isolated and vulnerable.

I do not feel that this defensive posture is effective, because it is not proactive. Society is still being attacked by highly skilled computer criminals about whom we know very little: who they are, their motives, and their tools of the trade. Therefore, to be effective in defense, we must understand how these attacks take place from a technical standpoint. To some degree, we must learn to become a computer criminal.

Then we will be in a better position to defend against the victimizations that affect us on both a financial and an emotional level. In this paper, we will explore these areas about which we know so little, and we will also see that computers are really extensions of people. An attack on a computer's vulnerabilities is really an attack on people's vulnerabilities. Today, computer systems are under attack from a multitude of sources.

These range from malicious code, such as viruses and worms, to human threats, such as hackers and phone "phreaks." These attacks target different characteristics of a system, which means that a particular system may be more susceptible to certain kinds of attacks than to others. Malicious code, such as viruses and worms, attacks a system in one of two ways: internally or externally.

Traditionally, the virus has been an internal threat (an attack from within the company), while the worm, to a large extent, has been a threat from an external source (a person attacking from the outside via modem or connecting network). Human threats are perpetrated by individuals or groups of individuals who attempt to penetrate systems through computer networks, the public switched telephone network, or other sources. These attacks generally target known security vulnerabilities of systems. Many of these vulnerabilities are simply due to configuration errors.
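Because so many of the exploited vulnerabilities are plain configuration errors, even a short audit script can surface them before an attacker does. The sketch below is a hypothetical example written for this paper, not taken from any product: it walks a Unix directory tree and flags world-writable regular files, one of the most common misconfigurations.

    import os
    import stat
    import sys

    def find_world_writable(root):
        """Walk a directory tree and report files any user can modify.

        World-writable files are a classic configuration error: an
        attacker who gains any account on the system can alter them.
        """
        findings = []
        for dirpath, dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    mode = os.lstat(path).st_mode
                except OSError:
                    continue  # unreadable entry; skip it
                # Flag regular files whose "other write" bit is set.
                if stat.S_ISREG(mode) and (mode & stat.S_IWOTH):
                    findings.append(path)
        return findings

    if __name__ == "__main__":
        root = sys.argv[1] if len(sys.argv) > 1 else "/etc"
        for path in find_world_writable(root):
            print("world-writable:", path)

Run against a configuration directory, anything it prints is a file an ordinary user, or a program acting on their behalf, could quietly rewrite.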

Malicious Code

Viruses and worms are related classes of malicious code; as a result, they are often confused. Both share the primary objective of replication. However, they are distinctly different with respect to the techniques they use and their host system requirements. This distinction is due to the disjoint sets of host systems they attack.

Viruses have been almost exclusively restricted to personal computers, while worms have attacked only multi-user systems. A careful examination of the histories of viruses and worms can highlight the differences and similarities between these classes of malicious code. The characteristics shown by these histories can be used to explain the differences between the environments in which they are found. Viruses and worms have very different functional requirements; currently no class of systems simultaneously meets the needs of both.

A review of the development of personal computers and multi-tasking workstations will show that the gap in functionality between these classes of systems is narrowing rapidly. In the future, a single system may meet all of the requirements necessary to support both worms and viruses. This implies that worms and viruses may begin to appear in new classes of systems. A knowledge of the histories of viruses and worms may make it possible to predict how malicious code will cause problems in the future.

Basic Definitions

To provide a basis for further discussion, the following definitions will be used throughout the report:

Trojan Horse - a program which performs a useful function, but also performs an unexpected action as well.
Virus - a code segment which replicates by attaching copies of itself to existing executables.
Worm - a program which replicates itself and causes execution of the new copy.
Network Worm - a worm which copies itself to another system by using common network facilities, and causes execution of the copy on that system.

In essence, a computer program which has been infected by a virus has been converted into a "trojan horse". The program is expected to perform a useful function, but has the unintended side effect of viral code execution. In addition to performing the unintended task, the virus also performs the function of replication.
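The distinction these definitions draw, that a virus needs a host program while a worm needs only a host system, can be made concrete with a toy model. The following sketch is a purely abstract simulation written for illustration; the class names and infection rules are assumptions of this paper, not code from any real specimen, and it touches nothing outside its own objects.

    import random

    class Program:
        """A host executable; a virus rides along inside it."""
        def __init__(self, name):
            self.name = name
            self.infected = False

        def run(self, system):
            # A virus executes only when its infected host runs, and it
            # replicates by attaching to another program on the same system.
            if self.infected:
                random.choice(system.programs).infected = True

    class System:
        """A host machine on a network; a worm targets this directly."""
        def __init__(self, name, programs):
            self.name = name
            self.programs = programs
            self.worm = False

    def worm_step(network):
        # A worm needs no host program: each infected system copies the
        # worm to a randomly chosen system and causes it to run there.
        for system in [s for s in network if s.worm]:
            random.choice(network).worm = True

    # Toy run: one infected program and one worm-carrying system.
    net = [System(f"host{i}", [Program(f"app{j}") for j in range(3)])
           for i in range(4)]
    net[0].programs[0].infected = True
    net[0].worm = True
    for _ in range(5):
        for s in net:
            for p in s.programs:
                p.run(s)      # viruses spread only within a system
        worm_step(net)        # worms spread between systems
    print(sum(p.infected for s in net for p in s.programs), "infected programs")
    print(sum(s.worm for s in net), "worm-infected systems")

Running it for a few steps shows the two propagation patterns side by side: infected programs multiply only within a single system, while the worm flag hops from system to system.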

Upon execution, the virus attempts to replicate and "attach" itself to another program. It is this unexpected and uncontrollable replication that makes viruses so dangerous. As a result, the host or victim computer can suffer an unlimited amount of damage from the virus before anyone realizes what has happened. Viruses are currently designed to attack single platforms. A platform is defined as the combination of hardware and the most prevalent operating system for that hardware. For example, a virus can be referred to as an IBM-PC virus, referring to the hardware, or a DOS virus, referring to the operating system.

"Clones" of systems are also included with the original platform.History of Viruses The term "computer virus" was formally defined byFred Cohen in 1983, while he performed academic experiments on a DigitalEquipment Corporation VAX system. Viruses are classified as being one of twotypes: research or "in the wild." A research virus is one that hasbeen written for research or study purposes and has received almost nodistribution to the public. On the other hand, viruses which have been seen withany regularity are termed "in the wild." The first computer viruseswere developed in the early 1980s.

The first viruses found in the wild were Apple II viruses, such as Elk Cloner, which was reported in 1981 [Den90]. Viruses were found on the following platforms: Apple II, IBM PC, Macintosh, Atari, and Amiga. These computers made up a large percentage of the computers sold to the public at that time. As a result, many people fell prey to Elk Cloner and viruses similar in nature, suffering losses of data ranging from personal documents to financial business records, with little or no protection or recourse. Viruses have "evolved" over the years due to efforts by their authors to make the code more difficult to detect, disassemble, and eradicate.

This evolution has been especially apparent in the IBM PC viruses, since there are more distinct viruses known for the DOS operating system than for any other. The first IBM-PC virus appeared in 1986 [Den90]; this was the Brain virus. Brain was a boot sector virus and remained resident in the computer until "cleaned out". In 1987, Brain was followed by Alameda (Yale), Cascade, Jerusalem, Lehigh, and Miami (South African Friday the 13th). These viruses expanded the target executables to include COM and EXE files. Cascade was encrypted to deter disassembly and detection.

Variable encryption appeared in 1989 with the 1260 virus. Stealth viruses, which employ various techniques to avoid detection, also first appeared in 1989; examples include Zero Bug, Dark Avenger, and Frodo (4096 or 4K). In 1990, self-modifying viruses, such as Whale, were introduced. The year 1991 brought the GP1 virus, which is "network-sensitive" and attempts to steal Novell NetWare passwords. Since their inception, viruses have become increasingly complex and equally destructive. Examples from the IBM-PC family of viruses indicate that the most commonly detected viruses vary according to continent, but Stoned, Brain, Cascade, and members of the Jerusalem family have spread widely and continue to appear.

This implies that highly survivable viruses tend to be benign, replicate many times before activation, or are somewhat innovative, utilizing some technique never before used in a virus. Personal computer viruses exploit the lack of effective access controls in these systems. The viruses modify files and even the operating system itself. These are "legal" actions within the context of the operating system.
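The lack of access controls can be demonstrated directly. On a DOS-era personal computer, every program ran with full rights to every file, so a virus rewriting the operating system was a "legal" action; a multi-user kernel checks the caller's privileges first. The fragment below is a minimal sketch, assuming a Unix-like system and using /etc/hosts purely as an example of a root-owned file:

    import os

    SYSTEM_FILE = "/etc/hosts"  # example only: any root-owned file illustrates the point

    # A multi-user kernel consults ownership and permission bits before
    # allowing a write; an unprivileged process is simply refused. DOS-era
    # personal computers performed no such check, so any program (or virus)
    # could rewrite any file, including the operating system itself.
    if os.access(SYSTEM_FILE, os.W_OK):
        print("writable: this process could modify the system file")
    else:
        print("write would be refused: access control is in effect")

Run as an ordinary user, the check fails; the same request made by a DOS program of the era would simply have succeeded.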

While more stringent controls are in place on multi-tasking, multi-user operating systems (networked LANs or Unix), configuration errors and security holes (security bugs) make viruses on these systems more than theoretically possible. This leads to the following initial conclusions:

- Viruses exploit weaknesses in operating system controls and in human patterns of system use and misuse.
- Destructive viruses are more likely to be eradicated.
- An innovative virus may have a larger initial window in which to propagate before it is discovered and the "average" anti-viral product is modified to detect or eradicate it.

If we reject the hypothesis that viruses do not exist on multi-user systems because they are too difficult to write, what reasons could exist? Perhaps the explosion of PC viruses (as opposed to viruses for other personal computer systems) can provide a clue. The population of PCs and PC compatibles is by far the largest.

Additionally, personal computer users exchange disks frequently. Exchanging disks is not required if the systems are all connected to a network; in that case, large numbers of systems may be infected through the use of shared network resources. One of the primary reasons that viruses have not been observed on multi-user systems is that administrators of these systems are more likely to exchange source code than executables. They tend to be more protective of copyrighted materials, so they exchange locally developed or public domain software.

It is more convenient to exchange source code, since differences in hardware architecture may preclude exchanging executables. It is this type of attitude toward network security that could be viewed as victim precipitation: the network administrators place themselves in a position to be attacked, even though they are unaware of the activity. The following additional conclusions can be made:

- To spread, viruses require a large population of similar systems and the exchange of executable software.
- Destructive viruses are more likely to be eradicated.
- An innovative virus may have a larger initial window in which to propagate before it is discovered and the "average" anti-viral product is modified to detect or eradicate it.