In the movie Her, Joaquin Phoenix’s character falls in love with his computer’s operating system, which through the magic of machine learning — and Hollywood — comes to know and understand him better than anyone else. It’s a futuristic critique of human reliance on technology. But according to one new study, it’s a future that may not be all that far away.
This week, researchers from the University of Cambridge and Stanford University released a study indicating that Facebook may be better at judging people’s personalities than their closest friends, their spouses, and in some cases, even themselves. The study compared people’s Facebook “Likes” to their own answers in a personality questionnaire, as well as the answers provided by their friends and family, and found that Facebook outperformed any human, no matter their relation to the subjects.
That’s a substantial finding, the researchers say, particularly given that human beings are evolutionarily wired to be good judges of personality. It’s what keeps us out of danger and shapes our relationships. But the realization that computers may be better equipped to make these judgments than humans are could help cut through the natural bias that pervades human interactions. Never mind what this says about how much power Facebook wields.
“We’re walking personality prediction machines,” says Michal Kosinski, a computer science professor at Stanford, “but computers beat us at our own game.”
The researchers began with a 100-item personality questionnaire that went viral after David Stillwell, a psychometrics professor at Cambridge, posted it on Facebook back in 2007. Respondents answered questions that were meant to root out five key personality traits: openness, conscientiousness, extraversion, agreeableness, and neuroticism. Based on that survey, the researchers scored each respondent in all five traits.
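The scoring step is simple in principle: each respondent gets a numeric score on each of the five traits. Here is a minimal sketch; the actual 100-item instrument and its scoring key aren't described in the article, so the 1-to-5 item ratings and the mean-of-items scoring below are assumptions for illustration only.

```python
# Toy sketch of Big Five scoring. Assumptions (not from the article):
# each trait has several items rated 1-5, and a trait score is the
# mean of its item ratings.

TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

def score_big_five(responses):
    """responses maps each trait name to a list of 1-5 item ratings;
    returns the mean rating per trait."""
    return {trait: sum(items) / len(items)
            for trait, items in responses.items()}

# Hypothetical respondent: the same five ratings for every trait.
respondent = {trait: [3, 4, 2, 5, 4] for trait in TRAITS}
print(score_big_five(respondent)["extraversion"])  # prints 3.6
```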
Then the researchers built a statistical model, feeding it each respondent’s personality scores along with their “Likes,” to which subjects voluntarily gave the researchers access. Only “Likes” shared with at least 20 other respondents were included. That let the model connect particular “Likes” to particular personality traits. If, for instance, several people who liked Snooki on Facebook also scored high in extraversion, the system would learn that Snooki fans tend to be more outgoing. The more “Likes” the system saw, the better its judgment became.
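The Snooki example above can be sketched with a deliberately simple averaging scheme: associate each “Like” with the average trait score of the respondents who share it, then score a new user as the mean over their “Likes.” This is illustrative only; the study’s published model was more sophisticated, and everything here except the 20-respondent cutoff is an assumption.

```python
# Illustrative sketch: learn a per-Like trait score by averaging, then
# predict a user's trait as the mean over their Likes. Only the
# 20-respondent cutoff comes from the article; the rest is assumed.
from collections import defaultdict

MIN_SHARED = 20  # only keep Likes shared by at least 20 respondents

def train(users):
    """users: iterable of (set_of_likes, trait_score) pairs for one trait."""
    totals, counts = defaultdict(float), defaultdict(int)
    for likes, score in users:
        for like in likes:
            totals[like] += score
            counts[like] += 1
    # Mean trait score among respondents sharing each common Like
    return {like: totals[like] / counts[like]
            for like in counts if counts[like] >= MIN_SHARED}

def predict(model, likes):
    """Mean learned score over the user's known Likes, or None."""
    known = [model[like] for like in likes if like in model]
    return sum(known) / len(known) if known else None
```

If a “Like” is shared mostly by high scorers on extraversion, it pulls predictions for anyone who shares it upward, which is the intuition the paragraph above describes.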
In the end, the researchers found that with just ten Facebook “Likes,” the algorithm judged a person’s personality more accurately than a typical work colleague. With 150 “Likes,” it outperformed family members, and with 300 “Likes,” it could best a person’s spouse.
What’s more, at times the Facebook model could beat the subjects’ own answers. As part of the survey, the researchers also asked respondents concrete questions, such as how many drinks they have in a week or what career path they’ve chosen. They then tested whether personality scores, derived either from the questionnaire or from Facebook “Likes,” could predict those real-life outcomes, such as how many drinks someone was likely to have in a week.
Once again, they found that Facebook Likes were a better indicator of people’s substance use than even their own questionnaires were. “When people take the questionnaire, they present themselves in a slightly more positive way than they really are,” Kosinski says. “This tendency to self-enhance makes computers slightly more objective.”
While the researchers admit the results were surprising, they say there’s good reason for it. For starters, computers don’t forget. While our judgment of people may change based on our most recent — or most dramatic — interactions with them, computers give a person’s entire history equal weight. Computers also don’t have experiences or opinions of their own. They’re not limited by their own cultural references, and they don’t find certain personality traits, likes, or interests good or bad. “Computers don’t understand that certain personalities are more socially desirable,” Kosinski says. “Computers don’t like any of us.”
That said, there are limitations to what computers can understand, too. They can’t read facial expressions or take subtle cues like humans can. And Kosinski admits that a study like this is likely to be much more effective among a younger subset of people, who are more likely to share their personal information on Facebook.
Still, Kosinski rejects the notion that Facebook “Likes” reveal only the most superficial aspects of someone’s personality. “I think it’s the other way around,” he says. “I think the computer can see through the prejudice we all have.”
That, he says, could have implications far beyond Facebook. Sure, this trove of personality data could turn Facebook into even more of an advertising powerhouse than it already is. But more importantly, Kosinski says, it could help keep all of us from being stereotyped or categorized based on other people’s biases. “Computers don’t care if you’re a man, woman, old, young, black, or white,” Kosinski says. “This gives us a cheap, massive, fake-proof algorithm to judge the personality of millions of people at once.”
Zachary T. Brown