Facebook's AI couldn't spot mass murder

Facebook has given another update on the measures it took, and what more it's doing, in the wake of the livestreamed video of a gun massacre by a far-right terrorist who killed 50 people in two mosques in Christchurch, New Zealand.

Earlier this week the company said the video of the slayings had been viewed fewer than 200 times during the livestream broadcast itself, and about 4,000 times before it was removed from Facebook -- with the stream not reported to Facebook until 12 minutes after it had ended. It also previously said it removed 1.5 million versions of the video from its site in the first 24 hours after the livestream, with 1.2 million of those caught at the point of upload -- meaning it failed to stop 300,000 uploads at that point.