Losses due to computer break-ins by malicious outsiders or disgruntled employees are estimated to cost companies billions of dollars each year (Taylor 1999). Yet the detection and prosecution of these intruders is difficult: many companies subjected to security break-ins remain oblivious to the damage, while others that do detect breaches of their information systems choose not to disclose them, fearing the commercial harm to consumer confidence once weaknesses in their computer systems are revealed.
Behar (1997) notes that security issues have come into sharper focus as greater corporate dependence on e-mail and networks has been matched by increasing amounts of economic espionage from ‘crackers’, that is, malicious intruders. This increased vulnerability to attack has been paralleled by government requirements that companies take responsibility for keeping their own data secure, and be held liable for losses incurred by ‘downstream’ companies damaged by the host company’s lax computer security.
Some former hackers from the heyday of thrill-seeking computer break-ins are now assisting system operators to establish and maintain sound security practices by testing system vulnerability with their own specialised knowledge, thus helping to foil the activities of the malicious, criminal ‘crackers’ (Sprenger 2000).
Taylor (1999) believes this hacker expertise is needed because a knowledge gap has developed in the security industry as computer programming has moved from a craft-based practice (typically that of the early creative, if less-than-disciplined, hackers) to a scientific reliance on standardised procedures. Deficiencies in computer security reveal problems caused by a similar change in computing education, reflected in an under-valuing of the practical knowledge of system vulnerability gained by experimental, hands-on experience. Taylor argues that crackers take advantage of this situation and simply exploit trivial holes left by inexperienced programmers who have not been taught the limitations of system security.
But the use of hackers by the security industry raises the ethical question of whether hackers who have developed their skill by breaking into organizational systems should now be employed for the rightful purpose of strengthening computer security.
However, it could be argued that these former hackers display an ethos that is not imposed by professional codes of conduct, but is based instead on an intrinsic set of values and beliefs: an inherent respect for computers and the information they contain, accompanied by an abhorrence of those who do not share this respect. The practical application of this ethos is displayed when, for no apparent pecuniary gain, hackers have spent considerable time obsessively tracking down malicious intruders and bringing them to account for the damage they have caused, not only to organizations, but to the ethos of the former hacking fraternity (Stoll 1991; Shimomura 1996).
Developing ethical sensitivity in a community of practice
Roush (1995) cites the hacker ‘Knightmare’, who in his book Secrets of a Super Hacker defines an ethos of mature, ‘responsible’ hacking, explained as ‘never harming any computer, software, system or person, nor profiting from a “hack”, but instead informing computer managers of their systems’ weaknesses’. Thus a ‘true hacker’ has ‘the ability to steal money, information, software, and hardware and to commit sabotage and espionage, but chooses to do none of these things’ (Roush 1995: 35).
This passion for their computing craft in a hacker community of elitist computing skill was accompanied by a contempt for government and corporate computer systems which, in the hackers’ view, constituted a misuse of information technology by contradicting the constitutional rights of citizens with respect to the freedom of information. Thus the hacker ethos has reflected both an intrinsic desire to preserve and extend its own conduct and expertise, and an external motivation to attack and expose the vulnerability of institutionalised computer data systems.
Blum (1994: 146) discusses the relationship between virtue and community in the writings of MacIntyre (1984: 194), who states: ‘the essential function of the virtues is clear. Without them, without justice, courage and truthfulness, practices could not resist the corrupting power of institutions’. Blum argues that virtues can only be learned and sustained in a community of practice. Thus, the ethics of virtue are not an alternative but are complementary to the ethics of universality; that is, while based upon universal ethical principles, the commonly held views of the practice community are indigenous to, and characteristic of, its particular activities.
Furthermore, Blum argues, ‘a practice, like a profession, is characterized as much by the way its participants conduct themselves as in the skills they develop and the purposes to which they are committed’. Therefore, if hacking is perceived by hackers as an elite practice, with internal goals and standards which are pursued in a moral way for their own sake, then a member of the hacking practice-community would be expected only ever to apply their elite technical expertise to responsible hacking, and never to malicious cracking. Members would also be required to track down and expose deviants (crackers) who, by their behaviour, were damaging both the integrity of the practice-community and the wider society.
Such virtuous hacking stems from an earlier craft-like bricoleur approach to computer programming. This holistic approach to computing explains hacking’s lasting appeal to subsequent generations and is the strongest reason why hacking is likely to survive in some form or another, even as programming develops towards more science-based methods (Taylor 1999:88).
There are parallels here in the training of artisans in the guilds of pre-industrial times, a training embedded in the craftsman/apprentice relationship which encouraged not only a transfer of skills but also the development and then guardianship of the ethos of the craft for future generations.
Thus, contemporary computing ‘apprentices’ should be assisted in developing a moral sensitivity for the systems they use and for the safety of the information those systems store, at the same time as their computing expertise develops. Bridging this skills/ethics dichotomy presents a strong argument for an integrated approach to the teaching of computing ethics in mainstream computing subjects (Roberts 1994; Roberts & Webber 1999), so that computing skill development is matched by the development of a moral sensitivity to the social responsibility that skill entails.
Behar, R. (1997) ‘Who’s reading your e-mail’. Time, February 3, 64-67.
Blum, L.A. (1994) Moral Perception and Particularity. Cambridge: Cambridge University Press.
MacIntyre, A.C. (1984) After Virtue: A Study in Moral Theory, 2nd Edn. Notre Dame: Notre Dame University Press.
Roberts, P.M. (1994) ‘The place and pedagogy of teaching ethics in the computing curriculum’. Australian Educational Computing, April.
Roberts, P.M. & Webber, J. (1999) ‘Visual truth in the digital age: Towards a protocol for image ethics’. Australian Computer Journal, 31, 3: 78-82.
Roush, W. (1995) ‘Hackers: Taking a byte out of computer crime’. Technology Review, April: 32-40.
Shimomura, T. (with Markoff, J.) (1996) Take-Down: The Pursuit and Capture of Kevin Mitnick, America’s Most Wanted Computer Outlaw – By the Man Who Did It. New York: Hyperion.
Sprenger, P. (2000) ‘Tiger teammates, hacking bright’. Information Age, August/September: 32.
Stoll, C. (1991) The Cuckoo’s Egg. New York: Doubleday.
Taylor, P.A. (1999) Hackers: Crime in the Digital Sublime. London & New York: Routledge.