Article posted on Mashable
The algorithms that surface content for us on Facebook and Google are miracles of modern programming. But Eli Pariser, author and chairman of the board at MoveOn.org, has concerns.
In March, Pariser gave a popular TED talk about “filter bubbles” — the idea that when search and social networks serve us only content that we “like,” we’re not seeing content we need. He cited examples in which liberal-leaning Facebook users see only fellow liberals in their “Top Stories,” or a frequent traveler gets only tourism results when Googling “Egypt” in the midst of the Arab Spring.
As users increasingly get their news from curated social channels, this trend has the potential to isolate us and damage our world view.
At Friday’s Mashable Media Summit, Pariser offered some solutions, and focused on how human editors and algorithms can work together to get users clicking on content that matters.
7 Things That Personalization Algorithms Do Poorly
Pariser pointed out the critical things that social personalization gets wrong when it comes to content.
- Anticipation: If there’s a small story about a meeting of the Greek parliament today, a human editor could anticipate that stocks might tumble tomorrow. Algorithms are rarely good at making this kind of abstract correlation.
- Risk Taking: For an algorithm to be successful, it needs to be right most of the time. Suggestion engines almost always offer up “safe” content within a very narrow spectrum. Human editors are willing to take risks on content that might be wildly successful (or fail miserably).
- Big Picture: Algorithms seldom connect the dots between pieces of content to form a big picture of current events. An editor can create a front page (today, a homepage) that shows the news of the day in context, arranged by importance.
- Pairing: Human editors can draw you in with something “clicky” and get you to stick around by pairing that item with something of substance. This can be an art more than a science, which is why algorithms come up short.
- Social Importance: Algorithms are good at surfacing what’s popular but not necessarily what’s important. The war in Afghanistan may not be “likeable” or “clickable,” but a human editor can ensure that stories about it get seen.
- Mind-Blowingness: Pariser spoke about the Napoleon Dynamite problem on Netflix. Users either loved the movie (rated it five stars) or hated it (one star). Because the Netflix algorithm doesn’t like making risky recommendations, it often omitted Napoleon Dynamite from suggestion lists — even though people who like the movie really like the movie.
- Trust: People learn to trust good editors. If something seems boring or irrelevant but a trusted editor says it’s important, you’ll heed the recommendation. Algorithms may never be so trustworthy.
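The risk-aversion and Napoleon Dynamite points above describe the same underlying mechanism: a recommender that must be right most of the time penalizes items its audience disagrees about. A toy sketch of that behavior (this is not Netflix’s actual algorithm; the titles, ratings, and penalty weight are invented for illustration):

```python
from statistics import mean, pstdev

# Hypothetical ratings illustrating the "Napoleon Dynamite problem":
# a divisive title splits viewers into 5-star lovers and 1-star haters,
# while a "safe" title earns uniformly mild ratings.
ratings = {
    "Napoleon Dynamite": [5, 1, 5, 1, 5, 1, 5, 1],  # polarizing
    "Safe Comedy":       [3, 4, 3, 3, 4, 3, 3, 4],  # inoffensive
}

def risk_averse_score(rs, penalty=0.5):
    """Expected rating minus a penalty for disagreement (spread).

    A recommender that needs to be right most of the time behaves
    roughly like this: it demotes items with high rating variance,
    even when the passionate fans exist.
    """
    return mean(rs) - penalty * pstdev(rs)

for title, rs in ratings.items():
    print(f"{title}: mean={mean(rs):.2f}, score={risk_averse_score(rs):.2f}")
```

Here the divisive title’s average (3.0) is close to the safe one’s (3.38), but its penalized score drops to 2.0, so it never surfaces — the “safe, narrow spectrum” behavior Pariser describes.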
How Do We Fix It?
In his talk, Pariser noted that nearly every major online media company and platform is moving toward some level of personalization. And why not? It drives clicks and engagement, which drives revenue.
But how can we create balance? For his book The Filter Bubble, Pariser asked the big platforms (Facebook, Google and Netflix, among others) about the difference between stated and behavioral intent.
“In this era where we have data about everyone, do you trust behavioral data, or what people actually say they want?” he posed from the stage of the Media Summit. “If you don’t trust what users say they want, then users lose agency. You’re just sending them things they will click on.”
It’s clear the current platforms don’t get us there on their own. But by striking a balance between editors and code, Pariser thinks we can get the best of both worlds.
“The great thing about the Internet is that it’s a very malleable thing,” he said. “It’s not a medium, it’s a meta-medium.”
By hooking people with content users like and pairing it with content users need, editors can drive traffic and value simultaneously.
“How do we make hard news as irresistible as LOLcats? That is what news is competing with. We need to find new ways of packaging it,” Pariser asserted.
“The Internet can go either way. It can encapsulate us in a little bubble of our narrow interests, or it can connect us to new people and ways of thinking.” The latter is what we all hoped for, Pariser said. And his hybrid media strategy might be one way to save us from creating “a bubble of one.”