Filtering Your World Is Understandable—But It's Not Helpful
What if you could cut out all the parts of a movie you didn't want to see? Showing The Martian to your niece? Farewell, f-bombs and Matt Damon's butt. You're a discerning Star Wars fan? See ya later, Jar Jar Binks. That's the genius of VidAngel, a Utah startup that launched in 2014 with the idea of giving consumers the ability to filter films. By selecting tags to scrub out graphic violence, nudity, and even "blasphemy," viewers could watch a version of, say, Sausage Party or Bad Grandpa without offending their sensibilities.
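To make the mechanics concrete, here's a minimal sketch, in Python, of how tag-based scene filtering might work under the hood. The Scene structure, tag names, and filter_scenes function are hypothetical illustrations for this article, not VidAngel's actual schema or code.

```python
# A minimal, hypothetical sketch of tag-based scene filtering.
# Scene boundaries, tag names, and data shapes are illustrative,
# not VidAngel's actual implementation.
from dataclasses import dataclass, field


@dataclass
class Scene:
    start: float                                  # seconds into the film
    end: float
    tags: set[str] = field(default_factory=set)   # e.g. {"profanity", "nudity"}


def filter_scenes(scenes: list[Scene], blocked: set[str]) -> list[Scene]:
    """Keep only the scenes whose tags don't intersect the viewer's blocklist."""
    return [s for s in scenes if not (s.tags & blocked)]


# Example: a viewer opts to scrub profanity and graphic violence.
movie = [
    Scene(0.0, 95.0),
    Scene(95.0, 102.5, {"profanity"}),
    Scene(102.5, 300.0),
    Scene(300.0, 318.0, {"graphic_violence", "profanity"}),
]
playable = filter_scenes(movie, blocked={"profanity", "graphic_violence"})
print([(s.start, s.end) for s in playable])       # [(0.0, 95.0), (102.5, 300.0)]
```

The real service would also have to stitch the surviving scenes back together seamlessly during playback; this sketch covers only the selection step.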
Movie studios, as you might imagine, weren't keen on someone editing their content without permission, and they sued. A federal judge sided with the studios, and the service went dark. Now it's back, ready to let you sanitize Netflix, Amazon, or HBO titles to your heart's content. That's not to say VidAngel's new strategy of limiting its filtering to streaming services will banish copyright lawsuits. But its persistence highlights a core tension in the cultural conversation: Giving people tools to protect themselves from objectionable content sounds like a no-brainer—but can act a lot like censorship.
This isn't a niche issue. VidAngel CEO Neal Harmon says the service saw more than 3 million unique visitors the day it relaunched. "When we did a survey, just over 40 percent of Americans said they would filter," Harmon says. Nor is wanting a social media feed free of beheading videos or harassment uncommon: It's a problem that makes headlines just about every week, and something we at WIRED have written about extensively.
Why people want those controls varies. VidAngel's survey found that people who consider themselves highly religious wanted filters for themselves, while less religious folks wanted them only for their kids. Conservatives wanted to avoid sex and "taking the Lord's name in vain," while liberals favored excising violence and racial slurs. Social media users have similarly multifaceted wants and needs, from people who want to be sheltered from the internet liberal outrage machine to people who want to be insulated from triggering material.
No wonder competitors like ClearPlay and movie studios like Sony are getting into the clean-up-your-show business. No wonder every social media platform is drowning in content moderation drama. There's so much stuff out there, and so many people. Which ushers in the first problem with these services: The filterees far outnumber the filterers.
The People Behind the Curtain…
Filtering objectionable content out of a tidal wave of posts requires a legion of humans or an algorithm trained by a legion of humans. Either way, how those humans see the world dictates how they interpret "objectionable." "It's difficult to apply global standards to subjective information," says Kate Klonick, a lawyer at Yale who studies private platform moderation of online speech. Hence the outrage over YouTube marking innocuous videos by LGBTQ content creators as not family friendly, or Facebook and Instagram removing photos of mothers breastfeeding. And the equal but opposite backlash against Facebook for (eventually) not removing photos of gay people kissing.
Because social media users are so politically polarized, the perspectives and biases of filterers—who, ultimately, are deciding what your world looks like—matter even more. When I asked Harmon about the workers who apply the filtering tags VidAngel customers can choose from, he said, "To be candid, Christians are probably overrepresented, but I couldn’t really even say. We have not done proper research on who our taggers are, other than they're in the United States and they’re geographically diverse."
That opens the door for highly subjective application of tags—a move that makes filtering look a whole lot like censorship. "VidAngel isn't unusual in this. It's been a latent problem for years," says Sarah T. Roberts, a media studies scholar at UCLA¹ who studies commercial content moderation. Roberts notes that Harmon's nonanswer highlights a shadowy problem with online content moderation: a disregard for the impact watching hour after hour of potentially objectionable content has on the moderators.
…and the People in Your Living Room
Fortunately for VidAngel's taggers, most movies don't contain the kind of vile, extremist content that unfiltered social media feeds do. But this kind of content moderation isn't only harmful to the moderators. Whittling down your worldview is not only limiting, it’s also probably unnecessary. "People tend to overestimate how much what they think of as 'undesirable' content influences others," says Albert Gunther, a media and communications scholar who studies information processing and its effect on public opinion. "The more they overestimate, the more they favor restrictions."
Regardless of why parents might embrace filtering, and regardless of their moral leanings or political motivations, the upshot may be the same. "It won't help kids understand the world if you 'cleanse' gay people out of their existence," Roberts says. "The people who don't want to hear the violence of the N-word in 42 are also distorting history."
Smoothing the edges of your own world isn't any better. "Academics call it selective exposure," Gunther says. "It's the idea that people, when they have a choice, will actively avoid unpleasant or disagreeable content." The easier that becomes, the more limits it places on all kinds of provocative expression—including artistic. Which is why writers and directors are so angry about Sony offering clean versions of their movies.
People grow and evolve by being challenged. Letting them tailor the digital world to their exact specifications may hold everyone back at a time when people are trying to learn to live with one another—online and IRL. "Groups that aren’t able to interact with each other become more polarized," says Nicole Ellison, a professor of information at the University of Michigan who studies social media. "And I wonder about the psychological implications of wondering about what’s going on on the other side."
Relying on the kind of keyword-based content moderation offered by VidAngel and Sony and Instagram, and even alt-right social media hub Gab, means that you'll never know—because you've decided not to. That's not to say that there shouldn't be safe spaces, or that all speech is 100 percent dandy. But blinkering yourself is no substitute for the thinking person's conscientious digital objection: choosing not to stream that movie—or clicking Unfollow.
¹ Correction (July 6, 2017, 6:30 PM ET): This piece has been updated to accurately reflect Sarah Roberts' professional affiliation.