Facebook may lay down new norms or impose restrictions to prevent blatant misuse of its live video-streaming feature, Facebook Live, in the aftermath of the New Zealand mosque attack. Facebook faced global condemnation after 28-year-old Brenton Harrison Tarrant live-streamed the massacre of 50 people at two mosques in Christchurch.

The video of the terror attack was viewed more than 4,000 times before Facebook removed it. Facebook Chief Operating Officer Sheryl Sandberg said the social media platform will take three major steps: implementing tougher rules, identifying Facebook Live users, and stopping the spread of hate speech on its platforms.

Facebook will soon ban people who have previously violated its community standards from using the live-streaming service. The social network is also investing heavily in improving its technology so it can more quickly identify and flag videos depicting violence. Sheryl Sandberg said that although the New Zealand mosque attack was streamed live, most people shared re-edited versions of the footage, which made it harder for Facebook's systems to erase the video from the platform completely. Facebook identified more than 900 altered videos showing parts of the streamed violence.

Facebook will also use artificial intelligence (AI) tools to recognise and remove hateful nationalism in New Zealand and Australia. This week, the company announced that it will ban white separatism, with Facebook officials saying that such concepts of hateful nationalism have no place on its platforms. People who search for hate-related terms on Facebook will be shown results such as Life After Hate and other resources intended to encourage users to leave these hate groups.
