The Nature, Importance, and Difficulty of Machine Ethics
James H. Moor, Dartmouth College

Implementations of machine ethics might be possible in situations ranging from maintaining hospital records to overseeing disaster relief. But what is machine ethics, and how good can it be?

The question of whether machine ethics exists or might exist in the future is difficult to answer if we can't agree on what counts as machine ethics. Some might argue that machine ethics obviously exists because humans are machines and humans have ethics. Others could argue that machine ethics obviously doesn't exist because ethics is simply emotional expression and machines can't have emotions.

A wide range of positions on machine ethics are possible, and a discussion of the issue could rapidly propel us into deep and unsettled philosophical issues. Perhaps understandably, few in the scientific arena pursue the issue of machine ethics. You're unlikely to find easily testable hypotheses in the murky waters of philosophy. But we can't, and shouldn't, avoid consideration of machine ethics in today's technological world.

As we expand computers' decision-making roles in practical matters, such as computers driving cars, ethical considerations are inevitable. Computer scientists and engineers must examine the possibilities for machine ethics because, knowingly or not, they've already engaged, or will soon engage, in some form of it. Before we can discuss possible implementations of machine ethics, however, we need to be clear about what we're asserting or denying.

Varieties of machine ethics

When people speak of technology and values, they're often thinking of ethical values. But not all values are ethical. For example, practical, economic, and aesthetic values don't necessarily draw on ethical considerations. A product of technology, such as a new sailboat, might be practically durable, economically expensive, and aesthetically pleasing, absent consideration of any ethical values. We routinely evaluate technology from these nonethical normative viewpoints. Tool makers and users regularly evaluate how well tools accomplish the purposes for which they were designed. With technology, all of us, ethicists and engineers included, are involved in evaluation processes requiring the selection and application of standards. In none of our professional activities can we retreat to a world of pure facts, devoid of subjective normative assessment.

By its nature, computing technology is normative. We expect programs, when executed, to proceed toward some objective, for example, to correctly compute our income taxes or keep an airplane on course. Their intended purpose serves as a norm for evaluation; that is, we assess how well the computer program calculates the tax or guides the airplane. Viewing computers as technological agents is reasonable because they do jobs on our behalf. They're normative agents in the limited sense that we can assess their performance in terms of how well they do their assigned jobs.

After we've worked with a technology for a while, the norms become second nature. But even after they've become widely accepted as the way of doing the activity properly, we can have moments of realization and see a need to establish different kinds of norms. For instance, in the early days of computing, using double digits to designate years was the standard and worked well. But, when the year 2000 approached, programmers realized that this norm needed reassessment.
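To see why the double-digit norm broke down, here is a minimal sketch (illustrative only, not from the article); the function names and the windowing pivot are assumptions of mine.

```python
# Two-digit-year arithmetic, the norm described above, applied naively.

def age_in_years(birth_yy: int, current_yy: int) -> int:
    """Subtract two-digit years; correct only while both fall in 1900-1999."""
    return current_yy - birth_yy

print(age_in_years(65, 99))   # born '65, computed in '99 -> 34 (correct)
print(age_in_years(65, 0))    # born '65, computed in '00 -> -65 (nonsense)

# One common Y2K remediation, "windowing": interpret small YY values as 20YY.
def expand_year(yy: int, pivot: int = 30) -> int:
    """Map a two-digit year to a four-digit year around an assumed pivot."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(99))   # 1999
print(expand_year(0))    # 2000
```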
Or consider a distinction involving AI. In a November 1999 correspondence between Herbert Simon and Jacques Berleur [1], Berleur was asking Simon for his reflections on the 1956 Dartmouth Summer Research Project on Artificial Intelligence, which Simon attended. Simon expressed some puzzlement as to why Trenchard More, a conference attendee, had so strongly emphasized modal logics in his thesis. Simon thought about it and then wrote back to Berleur:

    My reply to you last evening left my mind nagged by the question of why Trench Moore [sic], in his thesis, placed so much emphasis on modal logics. The answer, which I thought might interest you, came to me when I awoke this morning. Viewed from a computing standpoint (that is, discovery of proofs rather than verification), a standard logic is an indeterminate algorithm: it tells you what you MAY legally do, but not what you OUGHT to do to find a proof. Moore [sic] viewed his task as building a modal logic of "oughts", a strategy for search, on top of the standard logic of verification.

Simon was articulating what he already knew as one of the designers of the Logic Theorist, an early AI program. A theorem prover must not only generate a list of well-formed formulas but must also find a sequence of well-formed formulas constituting a proof. So, we need a procedure for doing this.
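Simon's distinction can be made concrete with a small sketch (mine, not the article's; the Logic Theorist itself worked differently). Formulas are written as atoms or as ('->', antecedent, consequent) tuples, and modus ponens is the only rule: check_proof captures what the logic says you MAY do, verifying a finished proof, while find_proof supplies the OUGHT, a breadth-first search strategy for choosing which legal step to take next.

```python
from collections import deque

def check_proof(premises, goal, proof):
    """Verification: each line must be a premise or follow by modus ponens
    from earlier lines, and the last line must be the goal."""
    seen = []
    for line in proof:
        by_mp = any(('->', a, line) in seen for a in seen)
        if line not in premises and not by_mp:
            return False
        seen.append(line)
    return bool(proof) and proof[-1] == goal

def find_proof(premises, goal):
    """Discovery: forward-chain with modus ponens, breadth first, recording
    each formula's derivation so an explicit proof sequence can be returned."""
    proofs = {p: [p] for p in premises}      # formula -> proof ending in it
    queue = deque(premises)
    while queue:
        f = queue.popleft()
        if f == goal:
            return proofs[f]
        # Pair f with every known formula, trying each as the implication.
        for g in list(proofs):
            for impl, ante in ((f, g), (g, f)):
                if isinstance(impl, tuple) and impl[0] == '->' and impl[1] == ante:
                    concl = impl[2]
                    if concl not in proofs:
                        proofs[concl] = proofs[impl] + proofs[ante] + [concl]
                        queue.append(concl)
    return None   # no proof reachable from these premises

# Example: from A -> B, B -> C, and A, find a proof of C, then verify it.
premises = [('->', 'A', 'B'), ('->', 'B', 'C'), 'A']
proof = find_proof(premises, 'C')
print(proof)                              # a legal sequence ending in 'C'
print(check_proof(premises, 'C', proof))  # True
```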
Modal logic distinguishes between what's permitted and what's required. Of course, both are norms for the subject matter. But norms can have different levels of obligation, as Simon stresses through capitalization. Moreover, the norms he's suggesting aren't ethical norms. A typical theorem prover is a normative agent but not an ethical one.

Ethical-impact agents

You can evaluate computing technology in terms of not only design norms (that is, whether it's doing its job appropriately) but also ethical norms.

For example, Wired magazine reported an interesting example of applied computer technology [2]. Qatar is an oil-rich country in the Persian Gulf that's friendly to and influenced by the West while remaining steeped in Islamic tradition. In Qatar, these cultural traditions sometimes mix without incident; for example, women may wear Western clothing or a full veil. And sometimes the cultures conflict, as illustrated by camel racing, a pastime of the region's rich for centuries. Camel jockeys must be light: the lighter the jockey, the faster the camel. Camel owners enslave very young boys from poorer countries to ride the camels. Owners have historically mistreated the young slaves, including limiting their food to keep them lightweight. The United Nations and the US State Department have objected to this human trafficking, leaving Qatar vulnerable to economic sanctions.

The machine solution has been to develop robotic camel jockeys. The camel jockeys are about two feet high and weigh 35 pounds. The robotic jockey's right hand handles the whip, and its left