RACHEL MARTIN, HOST:
OK, so you have just hopped out of an Uber or a Lyft, for example. Let's say the ride has not really gone that well. You pop open the ride-sharing app to give a rating to the driver. You can choose anything from one to five stars. What you do next turns out to be subject to the laws of psychology.
New social science research reveals a bias among many raters that produces pernicious effects. To explain, we are joined by NPR's social science correspondent Shankar Vedantam. Hey, Shankar - pernicious effects.
SHANKAR VEDANTAM, BYLINE: (Laughter) Indeed, Rachel.
MARTIN: OK, so explain. What's the bias here?
VEDANTAM: Many people are hesitant to give drivers a bad rating. So the driver might have been rude. The car might have been dirty. You might have nearly gotten into a crash. Objectively, the ride might deserve a rating of one star out of five.
MARTIN: Right.
VEDANTAM: But I was talking with John Horton. He's a business school professor at New York University. He told me what goes through many riders' minds as they decide what rating to give.
JOHN HORTON: There's quite a bit of evidence that people don't rate people harshly in these systems, even if they maybe had a bad experience or a not-great experience, just because they don't want to harm the other person. And you can kind of think, if you have a bad Uber ride, you know, you may be unhappy as a passenger. But you don't want to ruin a person's livelihood.
MARTIN: That's so interesting because we've just had, like, a personal experience with an Uber or Lyft driver, and because of that, we're less likely to give them a bad rating.
VEDANTAM: Exactly. So people understand that drivers who get poor ratings might get kicked off the platform. And you don't want one bad ride to lead to that. So instead of one or two stars, maybe you give three or four. Maybe you even give five. Any platform where you are rating people - from your professors, to someone on eBay, to Airbnb - potentially has this problem.
Horton and his colleagues, Joe Golden and Apostolos Filippas, asked the question: What are the consequences of this kind of biased grading? They analyzed data from a large online platform with over a billion dollars in transactions. They found that when the platform tells users that their feedback will be kept private and won't be used to punish providers, users start to provide much more critical feedback. As Horton crunched the data, he realized he was tempted to do the very same thing with his own students: give them higher grades than they deserve.
HORTON: I have to give grades as a professor. And, you know, no one in the history of teaching, I think, has ever come in to complain about an A. When you give low grades, that's when it sort of becomes personally costly to you, where people want to meet and talk. And so kind of giving the high grades is in some ways the easier thing to do in most cases.
MARTIN: Fascinating. So it's just more of a burden, so they go the easier route. But Shankar, getting back to that example of the Uber or Lyft driver, what does it mean about me if I'm the person who's like, yeah, you almost killed me, you get two stars, if not zero stars? Does that mean I'm just a bad person? That I'm not being empathetic?
VEDANTAM: Well, here's the thing. I think many people believe they are being empathetic when they give the driver a very good rating for bad service. But Horton and his colleagues would actually argue, Rachel, that you are the one who's the altruist. When you give five stars...
MARTIN: Do tell. Do tell.
VEDANTAM: ...When you give five stars to a bad driver who nearly gets you into a crash, you're exposing the next rider to an elevated risk. When you look at it this way, the people who give accurate ratings deal with the immediate discomfort of giving a low rating to a bad driver. But they provide better information to the next rider or the next professor. It's easier to pass the bad apple along. You, Rachel, are the altruist.
MARTIN: That's the bottom line I wanted to get to. Shankar Vedantam is NPR's social science correspondent. He's also the host of Hidden Brain, a podcast exploring the unseen patterns in human behavior. Shankar, thanks so much.
VEDANTAM: Thank you, Rachel.
MARTIN: I'll give you five stars, by the way, five stars.
VEDANTAM: Oh, thank you.
(SOUNDBITE OF WALTER WANDERLEY'S "CALL ME")