Why better technology won't stop violent videos going viral


[Image: Close-up of an eye with the Facebook logo reflected in it]

On Easter Sunday, a man chose 74-year-old Robert Godwin Sr at random on a street in Cleveland, Ohio, shot him, and put the video on social media. In the video, Godwin is instructed to name the killer's ex-girlfriend and to tell the camera that she is the reason he is about to die.

Sadly, this is only the latest nefarious use of social video. As increasingly easy-to-use video applications have lowered the barrier to entry, we have seen it spread from beheadings by Islamic State to gang rapes, suicides and now a murderer simultaneously terrorising a woman from afar for ending a relationship with him.

Predictably, the event has unleashed a wave of criticism of Facebook. What responsibility should the company bear for preventing these kinds of videos? It's easy to demand technological fixes from the platforms that host them, but that both evinces naivety about what technology is actually capable of and elides some uncomfortable truths about who else is to blame.

To be sure, Facebook has not covered itself in glory. Its initial response to the video tellingly referred to the murder as “content”. It was reportedly at least two hours before the video was taken down. Not good, in light of the company’s history of profiting handsomely from video while not taking any of the responsibility for vetting the footage – it is not subject to the same restrictions as traditional broadcasters.

This is made worse by emerging research that suggests such footage could lead to more murders. “It would not surprise me that murder would be contagious,” says Sherry Towers at Arizona State University, who has found that there is a relationship between how widely a gruesome act gets shared and the likelihood of copycat violence. “People who are otherwise mentally distressed could get the idea that [this] would be something they would want to do.”

AI censorship

Most of the solutions being demanded of Facebook are technical. For example, people point to the 3-second delays used by broadcasters to avoid the unexpected. Or better algorithms to flag and remove content before it is shared.

That would certainly be helpful. If violent imagery makes it into a broadcast, its prompt removal can help prevent it gaining popularity, says Todd Helmus, a behavioural scientist at the Rand Corporation who has studied how ISIS propaganda videos spread online. For example, Helmus says the number of people sharing posts supporting ISIS on Twitter dropped dramatically in 2014 and 2015, around the time the platform began removing users who were the source of violent ISIS propaganda.

It’s harder to police the accounts of all Facebook’s 1.86 billion users for a broad range of violent content, which has led to calls for AI to step in and censor problematic videos before they can enter the network. “You can develop machine learning capabilities to detect words or phrases on videos and images, like an ISIS flag, for example, and pin those for removal,” Helmus says.

Leaving aside the fact that, right now, asking for AI to police Facebook for such material is the equivalent of asking for a magic spell, do we want all violent video to be pre-emptively culled by an algorithm? "Some kinds of violence shouldn't be removed," he says.

Last year, Diamond Reynolds uploaded a video of her boyfriend Philando Castile being fatally shot by a police officer while reaching for his ID in his car. The video set off protests and may have been instrumental in the decision to bring charges against the officer. In the case of witnessing police violence or, for example, providing testimony to the effects of chemical warfare, video is key to raising awareness and holding those in power accountable. "The images we've seen this week are graphic and heartbreaking, and they shine a light on the fear that millions of members of our community live with every day," Mark Zuckerberg posted after the Philando Castile video.

The bystander effect

At whose discretion do some violent videos stay while others go? AI is certainly not capable of making such fine distinctions yet; Facebook's moderation process still struggles to distinguish photos of breastfeeding mothers or home births from pornography, or iconic war photography from child exploitation. And this is an easy task compared with deciphering intent – a task with which even a human curator might struggle.

Whatever sophisticated algorithms we develop in the future, the question of censorship might depend more on human nature than on technological possibility. "The reason media stories about mass killings or things like this murder are so popular is that the public has a desire to read about it. Especially when they go viral on Facebook, that means people are actively sharing it. Why do we do this?" says Towers.

Even after Facebook removed the video, the footage circulated on other platforms. In one of those posts, the video had 1.6 million views. "We are drawn to this imagery," says Towers.

Facebook says the delay in removing the footage was because it took that long for the first users to report it as problematic content – something that may implicate us again: the bystander effect is well known to dampen our motivation to intervene in an emergency. Our desire to watch this kind of imagery overrides our instinct to do the right thing and report it.

That may offer a clue to how Facebook should respond. Eventually, the company will need to take responsibility for the content on its site, and the way to do that is by paying more human employees to take on that role, instead of offloading the problem either to as-yet-notional AI or onto the unpaid volunteers it counts on to police each other's content. It's not letting the company off the hook to say Facebook is only part of the problem. We bear some of the blame for circulating the content, and there's no app for that.

[Source: New Scientist]
