By Juliano Andrade Spyer, on 28 June 2013
Do you think that the company that owns the social networking site is using your data for purposes that harm you?
This is the one question from a demographic survey our team is currently piloting that makes my informants stop and think. Only one person – out of five, so far – has answered “yes” to it, and it was an emphatic “yes”. But the others were convinced that no such thing could happen. The rationale behind their answers seems to be: why would a company want to harm its clients?
These optimistic informants also add that everything they do or say on Facebook, the most popular social networking site in Brazil, is both true and already known by others, so there is no way their posts could cause any harm.
This particular topic came back to my mind earlier this month as Facebook was accused of political censorship here in Brazil.
“Dilma Bolada” is the name of a fictional character inspired by the personality of Brazil’s current president, Dilma Rousseff. The author of the character, a 23-year-old college student, has attracted 350,000 followers on Facebook alone.
His character mixes the (strong) personality of the President with the relaxed tone typically used here on social networking sites to comment on political news stories. As a result, following Dilma Bolada is like being part of a circle of close friends with access to what could be Dilma’s Facebook posts.
At the end of May, Facebook erased a post published on Dilma Bolada’s page mentioning the alleged involvement of Aécio Neves in a corruption scandal. Neves will likely be one of her strongest opponents in the next presidential race.
The post was based on a news report published by a well-known magazine and, as the author of Dilma Bolada wrote in a later post, its content did not break Facebook’s user policy.
Folha de São Paulo, a major national newspaper, picked up the story of an apparent act of censorship by Facebook to protect a politician. Other news outlets followed.
At first Facebook refused to comment on the decision, but the repercussions both online and in the press pressured the company into revoking the decision and issuing an explanation.
According to Facebook, an automated feature that detects harmful content was responsible for the deletion. The system bases its decisions on feedback sent by users. The company explained that, in this case, the system did not operate correctly.
The erased post was also restored – here, in Portuguese – on Dilma Bolada’s timeline, but the question that remains is: would Facebook have changed its decision if the problem had not drawn so much attention and noise?