Facebook has come under fire in recent weeks over an internal report, leaked by The Australian, which details how it has tracked the emotions of roughly 6.4 million young people. Facebook conducted the research by monitoring posts made by high schoolers, college students and young workers in Australia and New Zealand to determine when they feel “useless”, “silly”, “stressed”, “defeated”, a “failure”, “anxious” and “overwhelmed”.
According to The Australian, the report said, “Anticipatory emotions are more likely to be expressed early in the week, while reflective emotions increase on the weekend” and that “Monday-Thursday is about building confidence; the weekend is for broadcasting achievements”.
Facebook combined this emotional data with other information, including relationship status, number of friends and how often users access the site, as well as photos, posts and how users ‘react’ to certain content, to create a profile for each person. Major concerns have been raised that Facebook could potentially use this information to allow advertisers to market directly to people in certain emotional states, much like how advertisers can already target users based on race, gender and age.
Since the leak, Facebook has issued a couple of statements. The first said the company had “opened an investigation to understand the process failure and improve our oversight”, and a second called the Australian article “misleading” and claimed that Facebook does not “offer tools to target people based on their emotional state”. Regardless, many are still calling this research a major invasion of privacy.
When we agree to the terms and conditions of a Facebook account, we allow our posts to be monitored; however, we often expect this to relate to preventing the spread of hateful or harmful content, rather than to being unknowingly psychoanalysed. While the information Facebook appears to be sharing may be of little consequence, many feel that users should give further consent before Facebook shares it with a third party, particularly as much of the data relates to children aged 14-17.
There are, however, some potential benefits to this information being shared with advertisers. It could help charities and companies that offer counselling or want to spread mental health awareness to reach the people who may need help most. Others, though, worry about the system being exploited, with opportunistic companies pursuing those in society at their most vulnerable.
While Facebook has said it is not currently offering the ability to target users based on emotions, this doesn’t mean it’s entirely off the cards, especially given the company’s past misdemeanours. After all, Facebook declined to rule out that similar research had been carried out in territories other than Australia and New Zealand. This might mean there’s no need to voice our concern: they may already know exactly how we feel.