Facebook appears to know it is in trouble over the experiment it conducted (see previous post). On CNN, I read this morning that a spokesperson said it was research “to improve our services”.
It looks like Facebook is trying to jump through hoops. But Facebook doesn’t fit through them.
When users consented to their data being used to improve Facebook’s services, most will have assumed that this referred to services provided to the users, not to services Facebook provides to advertisers. (When you’re happy, you are more optimistic, hence more likely to click on advertisements. Pessimists have a more realistic view of the world than optimists, but optimists likely see themselves as more successful than pessimists.)
And when Facebook users consented to their data being used to improve the services, they sure as hell did not consent to psychological experiments being conducted on them.
They may have expected Facebook to analyse the data and make use of the results of those analyses, yes, but they were likely thinking in terms of technology or something along those lines. Upgrading server x that delivers Facebook to country y. They may also have expected to see baby products being advertised to those who clicked on such ads and posted baby pictures, and office products being shown to people who stated that they are self-employed.
Facebook tweaking the streams of users to bring them the items it thought they wanted to see is one thing. I can be annoyed that Facebook does not show my friends’ posts in my timeline, no matter how many boxes I tick to try and get them to show. I can be annoyed about the commercial posts I am shown, no matter how many boxes I tick in an attempt to get rid of posts about products I cannot even buy because I am many miles away on the other side of the world. But that is an entirely different ballpark compared with Facebook deliberately tweaking the streams of users to make them feel happy or make them feel miserable, or even attempting to see whether it can.
Facebook – and the two university researchers along with it – has crossed a line, again. This time, Facebook has made an unforgivable mistake.
It is true that other media manipulate us all the time. But we expect that. We know that the BBC only reports what it wants to report and does not present an objective overview of society. We know that commercials feed us bullshit, that buying that car or buying that dress or perfume won’t make glamorous models suddenly find us irresistible. And I know that when CNN – CNN Money, that is – writes that “it does not appear that Facebook faces any legal implications”, CNN is trying to manipulate its audience too.
That does not apply when it comes to messages from our friends. We may still have one or two friends – or children – who consciously or subconsciously try to manipulate us, but when it comes to the combined stream of messages our friends post, we do not expect those messages to be manipulated by a third party in such a way that we become happier. And we certainly don’t expect our Facebook streams to be manipulated to make us miserable.
Facebook could have conducted this experiment equally well after explaining what it wanted to do and obtaining users’ informed consent. It chose not to.
The US Army provided some of the funding for this experiment. That does not help.
I have meanwhile realised how Facebook may be able to get away with this in a court of law. Facebook could claim that it was carrying out this experiment because it was concerned about the number of suicides and other problems precipitated by bullying on Facebook. It could say that it was trying to figure out how it could tweak the streams of its users to prevent such problems for its users. Unless some whistleblower provides evidence to refute this, that might very well work.
I just learned that Facebook made the blunder of conducting a massive psychological experiment on the users of its English-language version without their explicit consent. This is extremely unethical.
This is bound to have legal consequences.
I hope to see class actions in every country that uses the English version of Facebook, because this is most definitely not right. No amount of word-twisting by Facebook (or the researchers) can obscure the fact that users never consented to this kind of experiment being carried out on them.
In addition, the university researchers involved in the study should be investigated and disciplined. If they were in my employ, I would sack them instantly.
They have damaged the scientific reputation of their universities and, in my view, do not belong in academia. I trust that Cornell University and the University of California will take the appropriate steps.
On the other hand, these researchers have highlighted a serious danger that lurks behind social media – but it does not appear that this was the motivation for their unforgivable conduct.