Over the past weeks, the number of violent videos posted on Facebook has risen dramatically. In response, Facebook is determined to stop violent videos, and CEO Mark Zuckerberg has explained the changes the company is implementing to its review process.
Revamped review process and expanded staff
The review process has been revamped so that reporting and removing these videos will be quicker and more efficient overall. Mark Zuckerberg is aware that Facebook needs to act fast, and the company is already working to hire 3,000 additional employees to review flagged videos, on top of the 4,500 people it already employs.
The reviewers analyze flagged videos and any other content that doesn't comply with Facebook's terms. They're also responsible for contacting law enforcement and regional community groups if they learn someone is in danger or planning to harm themselves.
New and improved tools
Obviously, Facebook can't rely on manual work alone to remove dangerous, offensive, or violent content from its network, and according to Zuckerberg, the company is also developing new and improved tools for this purpose.
The goal is to simplify reporting options so reviewers can act faster when content is flagged. One of the technologies Facebook employs to curb graphic content is artificial intelligence that prevents such videos from being shared in the first place.
Last month, a man in Cleveland posted a video of himself shooting and killing another man, and the video remained public on Facebook for over two hours. Safe to say, the company received a lot of negative feedback over the incident, but it did acknowledge the problem and appears to be taking steps to prevent such events from happening again.
Another issue Facebook faces is monitoring Live videos. Since these events are streamed in real time, graphic content may be viewable (even if only briefly) before Facebook's staff can react and take it down.
While it's encouraging that Facebook is working to rid its social network of violent videos, it remains to be seen whether these efforts will actually be effective. Considering the network has about two billion users, a staff of 7,500 reviewers seems insufficient to remove flagged videos at a decent speed.