Johns Hopkins CS 600 647 - Collaboration of Untrusting Peers with Changing Interests


Collaboration of Untrusting Peers with Changing Interests
Extended Abstract

Baruch Awerbuch*   Boaz Patt-Shamir†   David Peleg‡   Mark Tuttle§

Abstract

Electronic commerce engines like eBay depend heavily on reputation systems to improve customer confidence that electronic transactions will be successful, and to limit the economic damage done by disreputable peers defrauding others. A reputation system allows participants to post information about any particular transaction, and participants routinely check the reputation system before any transaction to avoid other participants with a bad history. In this paper, we introduce a framework for optimizing reputation systems for objects. We study reputation systems in an asynchronous setting, and in the context of restricted access to the objects. Specifically, we study the cases where access may be restricted in time (objects arrive and depart from the system) and in space (each peer has access to only a subset of the objects).

*Computer Science Department, Johns Hopkins University, 3400 N. Charles Street, Baltimore, MD 21218. Research conducted while visiting CRL. Email: [email protected].
†Cambridge Research Laboratory, HP Labs, One Cambridge Center, Cambridge, MA 02142. On leave from Tel Aviv University. Email: [email protected].
‡Department of Computer Science and Applied Mathematics, The Weizmann Institute of Science, Rehovot 76100, Israel. Research conducted while visiting CRL. Email: [email protected].
§Cambridge Research Laboratory, HP Labs, One Cambridge Center, Cambridge, MA 02142. Email: [email protected].

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page.
To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. EC'04, May 17–20, 2004, New York, New York, USA. Copyright 2004 ACM 1-58113-711-0/04/0005 ...$5.00

1 Introduction

The Internet clearly has a dark side. The junk mail filling our email folders is compelling evidence of the number of scam artists ready to take advantage of us, and the evening news shows discuss predators in chat rooms so frequently it is a wonder we communicate with any strangers electronically. On the other hand, most people feel quite comfortable buying and selling things over the web at eBay. This level of comfort with eBay is quite remarkable, and flows directly from eBay's elaborate reputation system [12]: after every transaction, the system invites each party to post its rating of the transaction on a public billboard the system maintains. Consulting the billboard is a key step before making a transaction.

Nonetheless, the possibility of fraud is everywhere with on-line commerce. A frequently cited scenario involves a group of sellers engaging in phony transactions, and rating these transactions highly to generate an appearance of reliability while ripping off other people. Another scenario involves a single seller behaving in a responsible manner long enough to entice an unsuspecting buyer into a single large transaction, and then vanishing. Reputation systems are valuable, but not infallible.

In this paper, we introduce the following simple model for reputation systems. In our model, there are players and there are objects. Some of the objects are good and some are bad, and a player can probe an object at some cost to determine whether the object is good. The goal of the players is to find a good object with as few probes as possible. Players collaborate by posting the results of their probes on a public billboard, and consulting the board when choosing an object to probe.
The problem is that some of the players are dishonest, and can behave in an arbitrary fashion, including colluding and posting false reports on the billboard to entice honest players to probe bad objects.

We assume that players are asynchronous, and may probe objects at completely different rates. The objects may enter and leave the system over time, or they may be physically separated and inaccessible to some of the peers. In other words, the players' access to objects may be restricted in both time and space.

For example, imagine a manufacturing company buying parts from a set of suppliers day after day. Some suppliers sell reliable parts, and others sell defective parts. Some suppliers go out of business, and others enter the market. The manufacturer has to buy parts from suppliers without knowing the actual quality of the parts until buying a few and examining them. The supplier might offer refunds for defective parts, but there is still a cost to buying defective parts and dealing with unreliable suppliers. Manufacturers might even form an association — a sort of better business bureau — to keep track of suppliers and experience with supplier quality and supplier recommendations, but even such associations are vulnerable to manipulation via malicious slander by members. The manufacturer wants to buy its parts with as few headaches as possible.

As another example, consider manufacturing depending on "just-in-time" delivery of parts. In this case, physical proximity of the supplier could be important. Suppliers between Boston and New York might be able to sell into both markets, and similarly for suppliers between New York and Philadelphia, but suppliers near Boston are of no use to manufacturers near Philadelphia. New York might be willing to take recommendations from Boston and Philadelphia, but Boston might not find recommendations from Philadelphia to be helpful.

Contributions.
In this paper, we make the following contributions.

We propose several algorithms based on a simple rule we call the Balanced rule. There are several approaches a player might take to finding a good object. One strategy is to follow an "exploration rule," which says that a player should choose an object at random (uniformly) and probe it. This might be a good idea if there are a lot of good objects, or if there are a lot of dishonest players posting inaccurate reports to the billboard. Another strategy is to follow an "exploitation rule," which says that a player should choose another player at random and probe whichever object it recommends (if any), thereby exploiting or benefiting from the effort the other player has already put in to finding a good object.
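The exploration and exploitation rules above can be sketched in a few lines of code. This is a minimal illustration only: the function names, the billboard representation as a dictionary of recommendations, and the even coin flip between the two rules are assumptions for the sketch, not the paper's actual specification of the Balanced rule.

```python
import random

def probe(obj, good_objects):
    """Probe an object (at unit cost in the model); True iff it is good."""
    return obj in good_objects

def balanced_step(objects, billboard, good_objects, rng=random):
    """One step of a hypothetical balanced strategy.

    With probability 1/2 apply the exploration rule (probe a uniformly
    random object); otherwise apply the exploitation rule (probe the
    object recommended by a uniformly random player, if any recommendation
    exists). The 1/2 bias is an illustrative assumption.
    """
    if rng.random() < 0.5 or not billboard:
        choice = rng.choice(objects)            # exploration rule
    else:
        recommender = rng.choice(list(billboard))
        choice = billboard[recommender]         # exploitation rule
    return choice, probe(choice, good_objects)

# Toy instance: 10 objects, 3 of them good, and a billboard carrying one
# honest and one dishonest recommendation.
objects = list(range(10))
good = {1, 4, 7}
billboard = {"honest_player": 4, "dishonest_player": 9}
choice, is_good = balanced_step(objects, billboard, good, random.Random(0))
```

Note that a dishonest recommendation can steer an exploiting player toward a bad object, which is exactly the adversarial behavior the paper's analysis must bound.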

