Washington: Facebook has decided to give its users more information about the types of content that may be banned from the social networking site.
Its revamped community standards now include a separate section on “dangerous organizations” and give more detail about the kinds of posts that are generally allowed on the site.
The US-based social networking site said the new guidelines would provide “clarity” on the matter.
One of its safety advisers praised the move, describing it as “encouraging”.
About 1.4 billion people use Facebook at least once a month.
The new guidelines will replace the old ones currently on the firm’s website, and will be sent to users who complain about posts made by other users.
Facebook’s global head of content policy, Monika Bickert, said the rewrite was intended to address confusion among people whose take-down requests had been rejected.
She added that Facebook would send such users a message explaining that a post had not been removed because it did not violate the site’s standards.
Under the new guidelines, users must warn their audience about graphic violence contained in the videos or images they are about to see.
Users do not need to add cover pages to clips to prevent them from auto-playing.
However, Facebook only adds such alerts after it has received a complaint about a post.