Social Media Tries to Stop Spread of New Zealand Shooting Video

Brenton Tarrant, an Australian who carried out the deadly attack in the city of Christchurch, live-streamed the assault and was later taken into custody after he was identified in the video, AFP reported.

"A person involved with the attacks also appeared to post regularly to the "/pol/ - Politically Incorrect" forum on 8chan, a online discussion site known for allowing virtually any content, including hate speech. We're also removing any praise or support for the crime and the shooter or shooters as soon as we're aware. "We are working to have any footage removed".

YouTube said: "Please know we are working vigilantly to remove any violent footage".

"The responsibility for content of the stream lies completely and exclusively on the person who initiated the stream".

"Our hearts are broken over today's bad tragedy in New Zealand", the video portal said in a Twitter posting. "More than anything, social media has provided a platform for sharing extremist views".

The incident has once again called into question major social media platforms' ability to police harmful content.

"We have certain companies that have built systems that have inadvertently served the cause of violent hatred around the world", Vaidhyanathan said.

Shares of Facebook closed down 2.5 per cent on Friday.

Researchers and entrepreneurs specializing in detection systems said they were surprised that users were able to circumvent Facebook's tools in the initial hours after the attack.

Man who scared away Christchurch mosque gunman hailed a hero
Mr Aziz said the gunman got into his car and drove away, yelling that he would kill them all. Aziz chased the car down the street to a red light, before it made a U-turn and sped away.

He then walked outside, shooting at people on a sidewalk. Multiple people were killed during shootings at two mosques full of people attending Friday prayers.

He walked back outside, shot a woman, got back in his car, and drove away.

Because it is 2019, and livestreaming has had five years or so to grow into a mainstream activity, horrific acts of violence and terror around the world now carry a greater-than-zero chance of having some video component attached to them.

Twitter has also been battling to remove shared videos.

Once a video is posted online, people who want to spread the material race into action.

"Remember, lads, subscribe to PewDiePie", the gunman said. Seems that most agree on that. "My heart and thoughts go out to the victims, families and everyone affected", he said.

But Jennifer Grygiel, a Syracuse University communications professor who follows social media, said the companies were doing far too little to prevent the spread of violent content.

"This is a case where you're giving a platform for hate", he said. That same year, a video of a man shooting and killing another in Cleveland, Ohio, also shocked viewers. "Subreddits that fail to adhere to those site-wide rules will be banned".

U.S. President Donald Trump posted a tweet condemning the "horrible massacre", as did former president Barack Obama. Facebook says it does not want to act as a censor, since videos of violence, such as those documenting police brutality or the horrors of war, can serve an important purpose. This makes for a tricky balancing act for the company.
