The basic principle is that users agree to respect the "rules" of the social network, what Facebook calls its "Community Standards".
Users who break them risk having their posts deleted on Facebook, being asked on Twitter to "delete banned content to be able to publish again", or seeing their account temporarily or permanently suspended.
Prohibited violent content
On Twitter, as on Facebook, violence is prohibited. That said, both networks are well aware that what one user perceives as violent may not be perceived that way by another, and that local context (cultural, political, or linguistic) can play a role. Both say they take this into account.
Facebook speaks of the "credibility" of violence or threats. "When we believe that content represents a real risk of physical harm or a direct threat to public safety," Facebook says, "we remove the content, disable the relevant accounts and work with the authorities."
Both networks also prohibit content related to dangerous individuals or organizations, the promotion of crime, calls to organized violence, and hate speech.
Rules that have their limits
These rules have their limits: they must apply to millions, even billions, of people; content spreads very quickly; and moderators are not necessarily based in the countries where the content is posted. In addition, automatic detection tools for problematic messages, and software that flags unauthorized images or videos, do not take context into account: for example, that users relay certain messages precisely in order to denounce them, or that a nude scene has artistic value.
It is also very difficult to fight content that skirts the edge of the illicit, circumventing detection systems while being widely shared. All the more so since, in general, the more controversial the content, the more engagement it generates (comments, reactions, and so on).
Moderation based on user reports
Above all, Facebook and Twitter encourage users to quickly report unwanted content or false information. Both networks have recently shown their willingness to comply with European rules.
Facebook, which has been particularly criticized on these issues, says it regularly updates its moderation rules to adapt to new problems as they appear, and that it has greatly increased its number of moderators.
Facebook also announced last month the creation of a kind of "court of appeal" that will rule on controversial content: whether or not it can stay online. Starting next year, the social network will also publish, every three months, a report on the content that has been removed from the site.