Have you ever hit the ‘report abuse’ button on Facebook when something has upset or annoyed you in your timeline?  Do you know what happens when you do?

Facebook’s process for dealing with this kind of report is necessarily quite complex, and this is compounded by the sheer volume of content the site processes daily.  Facebook does not release data about the number of reports received, but it is sufficient to support a team of ‘hundreds’ handling reports in 24 languages, aiming for a 72-hour response time, with bases in Ireland and India as well as the US for timezone coverage.

Naturally their first priority is to determine the kind of violation being reported – because there is, of course, a huge difference between how an allegation of spam or an inappropriate account name is processed, compared with threatening behaviour or hate speech.

One of the first things Facebook clarifies if you hit the Report button is whether the complaint pertains to you and your own account, or to something/someone else.

If your account has been hacked and you cannot log in to it, you are routed to http://fb.com/hacked, where you can take steps to restore it.  The same applies if you are the victim of an impostor account – one set up in your name and purporting to be you, perhaps to exploit your friends’ personal information or generosity.  (If you receive a friend request from someone you know is already on your friends list, check it carefully first: it’s very easy for someone to nick an image online and create an impostor account.)

Facebook has set processes for dealing with both of these occurrences, but they will not help you if a rogue app has been posting on your behalf – to fix that, you need to review your app permissions.  If you can still log in to your account, you have not been hacked, and this route won’t sort it.  But if you have been hacked, the Facebook Access Team will assist you in getting your account back in your name and under your control.

“I don’t like this post”

Personally I find some of the language used in Facebook’s reporting process rather unhelpful. “I don’t like this post” is a woolly option that is pretty much meaningless until you clarify WHY you dislike it – and there are many shades of grey between ‘this is a bit spammy’ and ‘I am worried someone is going to harm themselves imminently’.

You can find a route through the menus to the more serious options, which are handled by the Facebook Safety Team, but I sometimes wonder why there isn’t a clearer and more prominent route to the most serious and urgent issues – such as reporting threats of suicide or violence.  Facebook does indeed have close partnerships and mechanisms for rapidly escalating ‘credible’ threats of this nature to local law enforcement and emergency services around the world, but those flagging the issue have to wade through a number of options to get there.  I would prefer to see a clear option on the first screen after hitting the report button, asking ‘is someone in danger right now?’, that cuts straight to this part of the process.

Facebook’s help screens are very clear on one point, however: if you are aware of someone at risk of direct harm, whether from themselves or a third party, your responsibility anywhere in the world remains to contact local emergency services fast.  This makes perfect sense – no Facebook call centre is going to be able to respond as quickly as a call to 112, and if the emergency services require Facebook’s help to identify or locate the people involved, you can be sure they have legal mechanisms to get that assistance quickly.  No one in a Facebook call centre is going to go round to the house of the friend you believe is about to hurt themselves or is threatening to hurt someone else; law enforcement and emergency medical services are geared up for the fastest response.

There are also a lot of details hidden away within the Facebook Safety Centre about specific kinds of support for different threats and dangers, such as anti-bullying resources, support for veterans, and resources for oppressed groups.

I used the word ‘hidden’ because most of Facebook’s very extensive support resources remain just that: they play no part in most people’s daily use of the site, and you can easily be unaware they are there.  You can access all of this content either through the Report button or via the drop-down menu in the top right corner of every Facebook page – but a valuable resource to bookmark is the Facebook Safety Centre at https://www.facebook.com/safety, which connects directly to all the key resources in this area.  This is something else I think should be more prominently available on the site.

In fact, if you are trying to find the right Facebook support resources, whether for safety or anything else, Googling is often the fastest route to what you need – the content is very well indexed.

But going back to that Report button: besides the Safety and Account Recovery teams, there are two other Facebook teams involved in the report process to whom you may be routed – the Abusive Content team, which handles scams, spam and sexually explicit content, and the Hate and Harassment team, which deals with issues such as hate speech.  Depending on the responses you give in the question tree on the report screen, your report will be escalated to one of these groups, who will review the content and then feed back their response.

We will take a closer look at their roles and responsibilities next week.

Costa Connected, for Costa Blanca News, September 5th 2014

©Maya Middlemiss,

Casslar Consulting SL
