This article appeared in Business and Technology, January 1999, pages 22-27. It is republished here with the permission of Reed Business Information.
© Reed Business Information, 1999 (hypertext links added by CCSR)
Illustration by David Newton
We live in an age where decisions are made to the discreet whirr of a CD-ROM drive, and where human beings are fired at the flick of a wrist on a mouse mat. Technology, perhaps more egalitarian and democratic than ever before, also carries with it unprecedented means for the (literally) systematic abuse of human rights.
Just the stroke of a key alerts employers to our diligence; genetic fingerprinting contains information about us so intimate that even we ourselves are not aware of it, while the desire of supermarkets to stack the shelves of their data warehouses means that we really are what we eat, wear and sleep in - thanks to our loyalty cards.
But who cares? Simon Rogerson of Leicester’s De Montfort University does. With the aspect of Rodin’s Thinker, he looks down on the technology-led workings of British business and muses on their ethical implications.
Rogerson, who describes himself as “positive, entrepreneurial, socially aware, inclusive and humorous”, is director of the Centre for Computing and Social Responsibility and the UK’s first professor in computer ethics. Now 47, he is also a graduate in computational science from the University of Dundee.
“People might be surprised that a professor in computer ethics has a non-philosophical background and began his working life writing Fortran and Assembler programs,” he says. Whether they’re surprised or not, Rogerson clearly has a credible foundation for what cynics might see as an airy discipline.
According to his CV, he followed a “successful industrial career” - with Thorn EMI, among others - by “combining research, lecturing and consultancy in the management, organisational and ethical aspects of information and communication technologies” (ICT). Which is what he does today.
“As computer technology advanced, people started to be aware of the pitfalls that threatened to undermine the benefits of this powerful resource,” says Rogerson in his East Anglian burr (he was born in Norwich, home to Alan Partridge, a second-rate football team and the odd historical monument).
But how can classical ethical models be applied to a 20th-century phenomenon?
Rogerson is studying an entirely new set of problems, he explains, such as fraud and computer-generated human disasters, and the ways in which they present new versions of standard ethical dilemmas. In academic terms, this means embracing concepts, theories and procedures from philosophy, sociology, law and psychology as well as computer science and information systems.
“The overall goal is to integrate computing technology and human values,” he says, “in such a way that the technology advances and protects human values, rather than damages them.”
And technology advances at an astonishing rate: from back room to living room, via the desktop and TV. Five years ago, he points out, if someone had said they could shake hands with somebody 1,000 miles away, people would have laughed. Today the technology exists to do exactly that, but legislation has yet to come to terms with how people can interact from opposite sides of the world.
Does this suggest there is something about technology that tempts businesses away from a shared ethical foundation? Rogerson disagrees: “It’s not that ethics are being put aside when it comes to ICT, it’s just that they are only now being recognised by business as an issue.
“Computing crosses cultural, religious, political and economic boundaries and, as such, challenges our social norms. But there are several core values that are common, such as knowledge, freedom and impartiality, on which a universal ethical code for ICT can be founded.
“There are so-called ‘policy vacuums’ created by technology which lead to ethical dilemmas,” admits Rogerson. “Some may arise from new twists on old problems.”
Such as? “Privacy is a good example. Ever since civilisation began, the right to privacy has been a philosophical issue.
“In a recent speech at the Business Link annual conference, Peter Mandelson said: ‘The key to our future competitive success lies more and more in the exploitation of knowledge for commercial, profitable ends.’ The concern is how this will be translated into organisational strategy and action.”
Another, surely, is that capitalism relies on competition to supply people’s needs. But market forces are reductive, and if the market merely regards us, say, as ABC1s who shop at Sainsbury’s, then the commercial exploitation of that knowledge could mean nothing less than the in-depth exploitation of people. It’s as simple as ABC.
“Knowledge resides with people, and I can imagine an organisational world where people are cherished, encouraged and justly rewarded so they remain content and loyal.
“But I can also imagine one where people are sucked dry of their knowledge so it can be retained in an ICT system, while they are discarded as another spent resource.”
As businesses are discovering, fraud detection lends itself to data-matching systems that need little or no human intervention, and commercial pressures mean that the use of such systems will grow. (As is the case with most technologies: witness biotechnologists’ pressure on the Government to decide its ethical stance on human cloning so the industry can plough ahead and make money.)
Today, many financial systems are wholly automated, and tens of thousands of people are now being laid off in the City as a consequence. What, then, are the ethical repercussions of business systems that ‘mine’ for knowledge or ‘predict’ patterns of human behaviour from a database of existing information?
“There is a difference between the methods of detection used in the past and today’s data matching. Traditional investigation is triggered by some evidence of wrong-doing by an individual, such as tax evasion or bogus benefit claims. Data matching, however, isn’t targeted at individuals, but at entire categories of people. It isn’t initiated by any suspicion about an individual, but because the profile of a particular group is of interest. The data-matching process reverses the assumption of innocence.
“Data held legitimately in the public domain should be allowed to be traded. But bringing together private, personal data using, for example, an automated inference model [such as software that predicts you might commit fraud] may create false knowledge which, when traded, results in harm to the individual. That mustn’t be permitted.”
The professor claims that, when it comes to computing and business, morals and ethics are interchangeable. Morals originate from the group in which a person matures, and business ethics, by implication, from the group in which your business matures. This is why, he says, organisations must promote ethical practices to compensate for the ‘policy vacuums’ he talks about. In other words, to create a shared ethical culture among their peers.
In the business world, of course, employers often argue that electronic monitoring deters fraud, industrial espionage and other illegal activities, but this does not, says Rogerson, give them a universal right to monitor their employees. The civil liberties of innocent people should not be suppressed because a few rogue employees might abuse their positions.
In the light of all this, it’s easy to see why De Montfort set up the Centre for Computing and Social Responsibility in 1995. It has generated enormous interest: the centre’s Web site [www.ccsr.cse.dmu.ac.uk] notched up nearly 20,000 hits on the day we spoke in early December, and recorded 275,000 visits for the whole of November. But what persuaded the University that Rogerson was the right man for the chair?
“Computer ethics must be practically relevant,” he says. “So my background in computing makes me credible in the eyes of the practising IT user.”
With the groves of academe themselves subject to ever-greater market forces since the Thatcher era, who is sponsoring Rogerson’s research?
“Our work involves many of the leading organisations in the UK including the Royal Mail, BT, IBM, Transco, the Institute of Business Ethics, the British Computer Society and the office of the Information Commissioner.
“Any organisation using ICT should be interested in our work and, more importantly, should be committed to the concept of computer ethics. I would like to see a cross section of organisations working with us, representing various sectors and coming from different cultures and countries.”
Many of these organisations, he says, participate in workshops and research agendas towards social responsibility in the electronic age, the idea being to create something of practical benefit to the world at large, rather than mere theorising.
To this end, the Economic and Social Research Council has funded a research project with Rogerson’s department. Is IT Ethical? The 1998 Ethicomp Survey of Professional Practice, published next month, looks at the ethical perceptions of information systems managers. It reveals an IT community split along age lines in its response to ethical questions, and the results will be profiled in the next issue of B&T.
One area of particular interest to Rogerson is the subject of our very identities in the digital age. We have become, he says, composite beings with electronic personas, which present new problems of identification. “We run the risk of creating a two-tier society: the ‘citizens’, and a second group [non-citizens?] who don’t have access to the digital society or any of its services.
“As I see it, there are now three elements to us as individuals: the physical, the philosophical and the digital. In order to gain access to services, we need an electronic identity, which includes our smartcards, loyalty cards, fingerprints, iris scans and digital signatures. All this verifies our ‘authenticity’, if you like; that we are who we say we are.”
So do the innocent, to paraphrase former home secretary Michael Howard, have “nothing to fear”? Surely a national ID smartcard is the logical conclusion of technological progress? [A Government-backed private firm is introducing voluntary ID photocards for teenagers this year.]
“I remember Howard saying that if the last Government had its way, it would have enforced multifaceted, machine-readable cards on every citizen,” sighs Rogerson. “But part of the ethical code of a digital society ought to be the right not to participate.
“I have grave concerns about ID cards. They may be technically and economically prudent, but the social costs are too high a price to pay. People will come to accept that their card is their identity. It removes the right to anonymity.
“I believe that, far from reducing crime, ID cards would actually give criminals the opportunity to falsify them or use them in bogus situations. People would accept their ID without question.”
But Rogerson still believes it’s important for IT to enable effective interaction between Government, citizens and business. “A unified approach that alleviates duplication, contradiction and ambiguity while protecting individual rights is a laudable concept,” he maintains.
“Government would become more aware of local opinion, while at the same time individuals would be made more aware of their rights and the activities and requirements of Government.
“But the concept of online government implies literacy, and an awareness and acceptance of technology. That isn’t the case in practice. The ethical principles of education and awareness should be introduced to recognise this.”
Rogerson foresees many benefits in online government, provided the pitfalls are avoided: the vast investment it would entail could starve more traditional forms of government of funding, for example, and online government could reduce social interaction.
So, what advice does the professor have, not just for the Blairs and Mandelsons of the world, but B&T’s readers too: the UK’s IT directors and strategists?
“The impact of IT is usually judged in terms of whether the planned gains in efficiency and effectiveness are realised,” he says. “But that isn’t everything.”
“My advice would be to consider who is affected by your work; examine if others are being treated with respect; consider how the public would view your decisions and actions; analyse how the least empowered will be affected by your decisions, and consider if your actions are worthy of a model ICT professional.
“Good ethics mean good business.”
The writing on the wall
Some of what Professor Rogerson says is echoed in the writings of a number of economists and management theorists, from Will Hutton, author of The State We’re In, to Charles Handy, who wrote The Empty Raincoat.
New Labour’s post-election talk of the ‘stakeholder economy’, and the public pronouncements of some of our more prominent corporations suggest that, as a PR exercise at least, today’s business world recognises the need for a more humanitarian form of capitalism via technology.
But the domino effect of collapsing economies in the Far East - felt here in the tens of thousands of City redundancies, among others - has ironically prompted financier George Soros to identify a deep-seated malaise in the capitalist system itself.
“There is worse to come,” Soros predicts in his book, The Crisis of Global Capitalism. “It’s difficult to escape the fact that the system itself was the main reason for the crisis.
“I can already discern the makings of the final crisis,” he continues. “Indigenous political movements are likely to arise that will seek to expropriate multinational companies and recapture the ‘national’ wealth. Some of them may succeed in the manner of the Boxer Rebellion or the Zapata Revolution.”