The original video is reportedly about 17 minutes long, and edited versions of roughly one minute are now circulating and can be viewed by anyone. In one example, Facebook did not delete a video showing the terrorist shooting and murdering innocent civilians from a first-person perspective, but simply marked the clip as likely to contain "violent or graphic content." Videos with this tag require Facebook users to click to confirm that they want to view the material.
The news highlights Facebook's continued failure to remove one of the most notorious white-supremacist terrorism videos from its platform.
"More than a month after the attack, these horrific videos still appear on Facebook and Instagram; Facebook needs to rethink its artificial intelligence and its moderation," Eric Feinberg, founder of the Global Intellectual Property Enforcement Center, wrote in an email to Motherboard.
One version of the Christchurch shooting video on Facebook is not the original footage itself, but a recording of the livestream. Another version on Facebook is a screen recording showing someone watching a portion of the attack on Twitter.
The persistence of this footage raises questions for Facebook. The company told Motherboard that because users are uploading variants of these videos, Facebook is also using audio technology to try to detect fragments of the attack. (Such tactics are common among uploaders of terrorist content, for example adding a black border to a clip to bypass a social media company's detection systems.) Once a video is deleted, the company adds each variant to an internal list so that matching content is automatically blocked. Facebook told Motherboard that it is investing in technology and research to identify edited versions of these video clips.
One of the clips shows the terrorist walking to the first mosque and opening fire. The video does not show the full attack and stops at the 01:15 mark, but it still shows many civilians being murdered. Other clips on Facebook and Instagram show similar scenes from the attack.
One clip carries the warning: "This video has been automatically covered so you can decide if you want to view it." Notably, Guy Rosen, Facebook's vice president of product management, said in a post after the attack that Facebook's automated detection systems did not flag the original video.
"This video violates our policies and has been removed. We have designated both of these shootings as terrorist attacks, which means that any praise, support, or representation of the events violates our Community Standards and is not permitted on Facebook," a Facebook spokesperson told Motherboard in an email.
It's worth noting that all the clips Feinberg found were from Arabic-language pages.