November 01, 2003

Online Reputation Systems

Esther recently invited me to review a draft of the upcoming issue of the $795-a-year Release 1.0 newsletter, which this time around is all about online reputation systems. I noticed the issue just went out in the mail, so it's public now.

Here's the writeup I sent back to Esther earlier this month after reading the draft (some of the ideas here made it into the published article in one form or another):

Thanks, got the file, and have read it all the way through. Excellent overview of reputation systems and good selection of examples. I fear there's not much I can add that's not already been said and said well, but here are some random thoughts that came to mind while reading:

1) eBay

Another dimension to eBay's feedback forum is revealed by examining what they *don't* provide to their users. For instance:

a. Lots of users have feedback profiles containing, say, 98% or 99% positive ratings --- not quite perfect. People see that and immediately want to zero in on the imperfections --- the 1% or 2% that were negative. It's a natural reaction. Maybe you want to buy from this seller, but that 2% is scaring you. Were the negatives recent? How were the issues resolved? The problem is, it's often hard to find out, because eBay does not let you search or sort feedback ratings. You can't tell eBay, "show me only this user's negative feedback." This restriction is intentional on eBay's part, and a source of eternal grumbling on the community's part. All you can do is page through the feedback ratings sequentially. For buyers or sellers with hundreds or thousands of ratings, that means many, many pages. One consequence: after slogging through page after page of positive ratings looking for those curious negatives, all that positivity erodes your desire to keep looking. I bet a lot of users give up. (I believe this once again relates to eBay's architectural design based on the "people are basically good" philosophy you've also noted.)

b. Seller/buyer distinction. The community told eBay it wanted to distinguish between feedback given to a user who was buying something and feedback given to a user who was selling something. The idea being, a particular eBay user may be a great buyer but a lousy seller, or the other way around. Since that data was available, it was possible to add it to the feedback pages of the site. I advocated that users ought to be able to tell eBay, "only show me feedback for when this user was selling something", but the engineering powers-that-be said it was too computationally expensive to implement (an excuse I've never bought, btw). So all I was able to add was the S/B column (one of my dubious claims to fame while at eBay), but alas it's not sortable. (A quick sketch of the kind of filtering both of these items are asking for follows below.)
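To make concrete what the two items above are asking for, here's a minimal sketch in Python of that kind of feedback filtering. The record fields (rating, role, date, comment) are my own invention for illustration, not eBay's actual data model:

```python
from datetime import date

# Hypothetical feedback records for one user (not eBay's real schema).
feedback = [
    {"rating": "positive", "role": "seller", "date": date(2003, 9, 14), "comment": "Fast shipping, great item."},
    {"rating": "negative", "role": "seller", "date": date(2003, 8, 2),  "comment": "Item never arrived."},
    {"rating": "positive", "role": "buyer",  "date": date(2003, 7, 21), "comment": "Prompt payment."},
    {"rating": "neutral",  "role": "seller", "date": date(2003, 6, 5),  "comment": "Slow to respond to questions."},
]

def filter_feedback(entries, rating=None, role=None):
    """Return only the entries matching the given rating and/or role,
    newest first, i.e., "show me only this user's negative feedback
    from when they were selling something"."""
    matches = [e for e in entries
               if (rating is None or e["rating"] == rating)
               and (role is None or e["role"] == role)]
    return sorted(matches, key=lambda e: e["date"], reverse=True)

# The query a wary bidder actually wants to run:
for entry in filter_feedback(feedback, rating="negative", role="seller"):
    print(entry["date"], entry["comment"])
```

The point of the sketch is simply that the query is trivial to express; whether it scales to eBay's volume is the part the engineers objected to.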

2) Microsoft's Netscan and parallels on The WELL

Data-mining Usenet was inevitable, and Microsoft's results are, imho, just about what anyone who's used conferencing and newsgroup systems for years would expect to find. For example, similar data would probably be found in The WELL community if you mined their archives.

The WELL doesn't have a reputation system for its users, but that doesn't stop its users from coming up with one. On The WELL, there's been a feature for years called "bozo filtering", what you might call a "kill file" --- if your name is in someone's ".blist", that person never sees your postings in the online conferences. At some point years ago, users started making their ".blist" files readable by all --- and sure enough, someone wrote a tool to monitor all the open ".blist" files and generate statistics: who are the most bozoed users on The WELL, and which users are bozoing the most people. Whenever a user updates their publicly-readable bozo list, the updates (the names added or removed) are published as well, so people can see when someone falls out of favor with someone else, or vice versa. All this political activity naturally generates a running commentary among members of the community. (A rough sketch of what such a monitoring tool might look like follows below.)
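I don't know what the actual WELL tool was written in, and the file layout here is my own assumption (each user with a public bozo list keeps a world-readable ".blist" with one screen name per line), but this is roughly the tallying such a tool has to do:

```python
import glob
from collections import Counter

# Assumed layout: a world-readable bozo list at ~/.blist for each
# participating user, one screen name per line. Purely illustrative.
BLIST_GLOB = "/home/*/.blist"

bozoed = Counter()    # how many people have bozoed each user
bozoing = Counter()   # how many people each user has bozoed

for path in glob.glob(BLIST_GLOB):
    owner = path.split("/")[2]          # /home/<owner>/.blist
    try:
        with open(path) as f:
            names = {line.strip() for line in f if line.strip()}
    except OSError:
        continue                        # skip lists we can't read
    bozoing[owner] = len(names)
    for name in names:
        bozoed[name] += 1

print("Most-bozoed users:")
for name, count in bozoed.most_common(10):
    print(f"  {name}: bozoed by {count} people")

print("Users bozoing the most people:")
for owner, count in bozoing.most_common(10):
    print(f"  {owner}: has bozoed {count} people")
```

Publishing who's been added or removed since the last run is then just a diff of these snapshots over time.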

3) Some thoughts re Future Implications/Applications

a. Given the Teacher and Professor sites, I'm wondering when we'll see RateTheBoss, where current and former, gruntled and disgruntled, underlings tell all about their managers. While we're at it, where's RateTheVC? :-)

b. Conferences: Every conference strives to elicit some sort of evaluation from participants. But the data is rarely if ever shared back with the participants. Perhaps a RateTheSpeaker or RateThePanelist (RateThePundit?) service is needed. It'd be great to see what the attendee-generated reputations of speakers and panelists looked like.

c. At some point people are going to start (if they haven't already) including online reputation summaries in their resumes, just as companies already brag about their J.D. Power ratings in advertisements.

d. I wouldn't be surprised if someone eventually aggregates the disparate online reputations of individuals and offers a Fair Isaac-style profile rating. A summary might note that the person in question is a great eBay seller but a slow-to-respond buyer, a so-so reviewer of products, an excellent teacher, a popular blogger. (A rough sketch of what such an aggregation might boil down to follows this list.) Who would value such information? What's to prevent such aggregation? Which outside communities would value the reputations individuals have earned within other communities?

e. Workplace whuffie, performance evaluations, etc.: same sort of thing as "d" above, but internal to the enterprise. Systems giving everyone in the company the means of identifying who the "go-to" person is for any given piece of information, expertise, or tight-deadline deliverable (i.e., who the miracle workers are).
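Since I'm already speculating about aggregation, here's a minimal sketch of what such a cross-community profile might boil down to. The community names, scores, and weights are entirely invented for illustration; the hard part in practice would be getting the data and agreeing on the weights:

```python
# Hypothetical per-community reputation scores, normalized to 0..1,
# with made-up weights for how much the aggregator trusts each source.
profiles = {
    "ebay_as_seller":   {"score": 0.98, "weight": 3.0},  # great seller
    "ebay_as_buyer":    {"score": 0.60, "weight": 1.0},  # slow to respond
    "product_reviews":  {"score": 0.55, "weight": 1.0},  # so-so reviewer
    "rate_the_teacher": {"score": 0.92, "weight": 2.0},  # excellent teacher
    "blog_readership":  {"score": 0.85, "weight": 1.5},  # popular blogger
}

def aggregate(profiles):
    """Weighted average across communities: a crude FICO-style composite,
    plus the per-community breakdown that actually tells the story."""
    total_weight = sum(p["weight"] for p in profiles.values())
    composite = sum(p["score"] * p["weight"] for p in profiles.values()) / total_weight
    return composite, {name: p["score"] for name, p in profiles.items()}

composite, breakdown = aggregate(profiles)
print(f"Composite reputation: {composite:.2f}")
for community, score in sorted(breakdown.items(), key=lambda kv: -kv[1]):
    print(f"  {community}: {score:.2f}")
```

The arithmetic is the easy part; the interesting questions are the ones above about who sets the weights and whether the source communities consent to being aggregated.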

4) Recommendations for Online Companies

One thing I was kind of looking for but didn't get a strong sense of in the article was a set of strategic recommendations to current online businesses about what they ought to do regarding reputation systems. Perhaps that's beyond the scope of the article; I guess it depends on the audience. It'd also be nice to know if there are any case studies of online reputation systems that didn't work out, or backfired, or were shut down because of community reaction. I don't know of any, but I (and I bet Release 1.0 readers) would be curious to hear about such instances.

5) Link: Paul Resnick's site has a link to the interesting "Reputations Research Network" (http://databases.si.umich.edu/reputations). Might be worthy of direct mention in the Resources section.

Posted by brian at November 1, 2003 03:09 PM
