PRIVACY SHREDDING Facebook's third-party moderators can see far more user information than the social networking company has claimed, according to UK media reports.
Despite Facebook's claim that it shares only the flagged content itself and the source of the report with its third-party moderators, The Daily Telegraph reports that they are able to see much more.
"However, new evidence seen by The Telegraph shows that these moderators, who have to deal with the distressing images and messages which are reported every day, are clearly able to see the names of the person who uploaded the 'offensive' content, the subject of the image or person tagged in a photo - in addition to the person who has reported the content," the newspaper reported.
"Moreover, there are currently no security measures in place stopping these moderators taking screen shots of people's personal photos, videos and posts."
The report's claims are based on information received from a former Facebook moderator, Amine Derkaoui from Morocco, who showed the newspaper "several screenshots of what these outsourced workers see" when assessing whether a post should be allowed.
Derkaoui claimed there was "no decent security at all" to maintain user privacy, adding that looking at a report offers the moderators as much user information as looking at a friend's Facebook page.
Facebook responded that the third parties provide preliminary classification of a small proportion of reported content.
"These contractors are subject to rigorous quality controls and we have implemented several layers of safeguards to protect the data of those using our service," the company claimed.
"Additionally, no user information beyond the content in question and the source of the report is shared. We have, and will continue, to escalate the most serious reports internally, and all decisions made by contractors are subject to extensive audits."