One word: Humans.
Snap has proven it can get a scoop without sacrificing reliability. During the white nationalist rally in Charlottesville, Va., in August, Nick Bell’s team assigned a producer in New York to create dispatches for Our Stories. That meant scanning public Snapchat posts from within a few blocks of the protests, and gathering video and interviews from Snapchat-using journalists on the scene. Around 3 p.m. on Aug. 12, a short video clip posted on Snapchat appeared to show police arresting James Alex Fields Jr., the man who allegedly drove a car into a crowd of counterprotesters, killing 32-year-old Heather Heyer and injuring 19 other people. On Facebook, Twitter, or YouTube, the footage would have gone viral before it could be confirmed; in fact, screenshots of the Snapchat video appeared on other social networks almost immediately. But rather than post the clip widely, a Snapchat producer spent hours comparing its time and location data with other users’ footage of the attack, and repeatedly called and texted Charlottesville Police Department officers in an attempt to verify the arrest.
The clip appeared in Our Stories at about 7 p.m., after the Snapchat producer spoke to police. Even then, the producer replaced the user’s caption (“got em”) with a more cautious statement that the video “appears to show an arrest” of the suspected attacker. Snapchat anchor Peter Hamby and the head of original content, Sean Mills, both signed off on the post. “It’s not just humans making judgment calls,” Hamby says. “We make phone calls.”
That this qualifies as a boast is a testament to how poorly other tech companies have acquitted themselves in presenting news. In the days after the Aug. 12 attack, Facebook, Google, and Twitter were flooded with evidence-free stories suggesting that Charlottesville was a “false flag” attack perpetrated by left-wing extremists, Jews, and/or extreme left-wing Jews. Later that week, Facebook began deleting links to an article published by the neo-Nazi website the Daily Stormer that circulated widely on the social network. The article called Heyer a “fat, childless … slut.”
In October, after the murder of 58 concertgoers at a Las Vegas country music festival, Google News featured a story that identified an innocent man as the shooter. The publisher of that story: 4chan, an anarchic online forum known for allowing racism, misogyny, conspiracy theories, and trolling. Google apologized, promising it would “make algorithmic improvements to prevent this from happening in the future.”
That kind of material wouldn’t make it far on Snapchat, Hamby says, because “we’re in essence a walled garden.” As an example, he says, if somebody tried to post a phony video of a shark swimming through the streets of a hurricane-devastated city, Snapchat’s editors would catch it and make sure it didn’t find a wider audience on the service. “You can’t introduce a shady article without hitting a layer of editors,” he says. The urban shark example wasn’t hypothetical. In September a video that purportedly showed sharks flagrantly violating Miami traffic laws after Hurricane Irma racked up thousands of mentions in Facebook’s News Feed, despite being repeatedly debunked by Snopes.com and others. It has been viewed more than half a million times on YouTube.