Facebook says no one flagged NZ mosque shooting livestream

Facebook removed the video ‘within minutes’ of being notified by police, said Chris Sonderby

Facebook says none of the 200 or so people who watched live video of the New Zealand mosque shooting flagged it to moderators, underlining the challenge tech companies face in policing violent or disturbing content in real time.

The social media giant released new details about its response to the video in a blog post. It said the gunman’s live 17-minute broadcast was viewed fewer than 200 times and the first user report didn’t come in until 12 minutes after it ended. Fifty people were killed at two mosques in Christchurch.

Facebook removed the video “within minutes” of being notified by police, said Chris Sonderby, Facebook’s deputy general counsel.

“No users reported the video during the live broadcast,” and it was watched about 4,000 times in total before being taken down, Sonderby said. “We continue to work around the clock to prevent this content from appearing on our site, using a combination of technology and people.”

Facebook has previously said that in the first 24 hours after the massacre, it removed 1.5 million videos of the attacks, “of which over 1.2 million were blocked at upload,” implying 300,000 copies successfully made it on to the site before being taken down.

The video’s rapid spread online puts renewed pressure on Facebook and other social media sites such as YouTube and Twitter over their content moderation efforts. Many question why Facebook in particular wasn’t able to more quickly detect the video and take it down.

On Tuesday, New Zealand Prime Minister Jacinda Ardern expressed frustration that the footage remained online four days after the killings. She said she had received “some communication” from Facebook’s Chief Operating Officer Sheryl Sandberg on the issue. “It is horrendous and while they’ve given us those assurances, ultimately the responsibility does sit with them.”


Facebook uses artificial intelligence and machine learning to detect objectionable material, while also relying on the public to flag content that violates its standards. Those reports are then sent to human reviewers, who decide what action to take, the company said in a video in November, which also outlined how it uses “computer vision” to detect 97 per cent of graphic violence before anyone reports it. However, it’s less clear how these systems apply to Facebook’s live streaming.
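
In rough terms, such a pipeline can be thought of as an automated classifier score and user reports both feeding a queue of posts for human reviewers. The short Python sketch below is purely illustrative; the threshold, score field and queue are assumptions for the sake of the example, not Facebook’s actual system.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Post:
        post_id: str
        violence_score: float = 0.0  # hypothetical classifier output, 0.0 to 1.0
        user_reports: int = 0        # count of user flags on this post

    review_queue: List[Post] = []    # posts awaiting a human reviewer

    AUTO_FLAG_THRESHOLD = 0.8        # assumed cut-off for automated detection

    def route_for_review(post: Post) -> None:
        """Queue a post for human review if the classifier or any user flags it."""
        if post.violence_score >= AUTO_FLAG_THRESHOLD or post.user_reports > 0:
            review_queue.append(post)

In this sketch, a live broadcast that no user reports and that the classifier does not score highly would never reach a reviewer until after the fact, which is the gap the company described.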

To report live video, a user must know to click on a small set of three grey dots on the right side of the post. When you click on “report live video,” you’re given a choice of objectionable content types to select from, including violence, bullying and harassment. You’re also told to contact law enforcement in your area if someone is in immediate danger.

Before the company was alerted to the video, a user on 8chan had already posted a link to a copy of it on a file-sharing site, Sonderby said. 8chan is a dark corner of the web where those disaffected by mainstream social media sites often post extremist, racist and violent views.

In another indication of the video’s spread by those intent on sharing it, the Global Internet Forum to Counter Terrorism, a group of global internet companies led by Facebook, YouTube, Microsoft and Twitter, said it added more than 800 different versions to a shared database used to block violent terrorist images and videos.

The group said it added “digital fingerprints” for visually distinct versions of the video to its database. The move came in response to attempts by internet users to share the video by editing or repackaging versions with different digital fingerprints to avoid detection.
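
The reason each edited copy needs its own entry can be illustrated with a simple byte-level hash, used here only as a stand-in for whatever fingerprinting the shared industry database actually employs: any change to the file produces a different digest, so every re-encoded or repackaged variant has to be added separately. The names below are illustrative assumptions, not the group’s real system.

    import hashlib

    shared_blocklist = set()  # hypothetical stand-in for the shared hash database

    def fingerprint(video_bytes: bytes) -> str:
        """Return a hex digest acting as the video's 'digital fingerprint'."""
        return hashlib.sha256(video_bytes).hexdigest()

    def block_known_version(video_bytes: bytes) -> None:
        """Add one known variant's fingerprint to the shared blocklist."""
        shared_blocklist.add(fingerprint(video_bytes))

    def is_blocked(video_bytes: bytes) -> bool:
        """Check an upload against every fingerprint collected so far."""
        return fingerprint(video_bytes) in shared_blocklist

Trimming, watermarking or re-encoding the same footage changes the bytes, so the check fails until that new variant’s fingerprint is also added, which is why the group ended up logging more than 800 versions.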

“The incident highlights the importance of industry co-operation regarding the range of terrorists and violent extremists operating online,” said the group, which was formed in 2017 in response to official pressure to do more to fight online extremism.

In a series of tweets a day after the shootings, Facebook’s former chief security officer, Alex Stamos, laid out the challenge for tech companies as they raced to keep up with new versions of the video.

“Each time this happens, the companies have to spot it and create a new fingerprint,” said Stamos. “What you are seeing on the major platforms is the water leaking around thousands of fingers poked in a dam,” he said.

Stamos estimated the big tech companies are blocking more than 99 per cent of the videos from being uploaded, “which is not enough to make it impossible to find.”

Kelvin Chan, The Associated Press
