We all have bad days now and then. What is not normal is for a social network like Facebook to experiment with human emotions and, in the process, deliberately give us a negative day. It may surprise you, but only a few days ago a singular experiment by the famous company came to light: it deliberately altered the content of the News Feeds of about 690,000 users in order to study the psychological reasons why we, the users, tend to update our statuses with emotional content.
Surprised? Let's take a closer look at the subject.
Facebook's experiment on our emotions
How are you today? What are you thinking? Every time we open the famous social network, we find these persistent questions on our wall. And yes, for many it is hard not to answer while skimming the updates in the news section, reading comments from friends and acquaintances, and scrolling past the various advertisements that Facebook itself places in its advertising section.
But what would happen if one day you found only negative news: posts with tragic, desperate, or fatalistic connotations? Surprising as it may seem, this happened in early 2012, when about 690,000 users began receiving mostly negative posts. Many of them, however, saw only positive news, since the experiment sought to gather data from both sides.
It was an experiment conducted by researchers from Cornell, the University of California, San Francisco, and Facebook itself, intended to delve into how users' behavior changes according to the emotions shown to them. The data from this covert experiment by Mark Zuckerberg's famous company were recently published in the academic journal Proceedings of the National Academy of Sciences. This is how those responsible for it explained themselves:
[quote_box_center] The reason we conduct this research is that we care about the emotional impact of a network as important as Facebook on the people who use our product. We felt it was important to investigate whether seeing friends post updates with positive or negative content could lead some people to feel bad or left out. At the same time, we were worried about the possibility that negative messages from friends might push people to stop visiting Facebook.[/quote_box_center]
And what were the results of the study? It was discovered that the users exposed to more negative content reduced the number of their own publications on the network. That is, they used Facebook less, and when they did post, their content was indeed not very encouraging. Meanwhile, those who received only positive information were, of course, more optimistic.
So, Facebook achieved what it wanted: to change the emotional state of its users. It is true that it was on a small scale, but it had significant implications for the study. What this social network usually does is apply an algorithm to determine which of the approximately 1,500 posts available will be displayed on your news wall; that is, it deliberately mixes ads, personal stories, and news from other users according to each user's profile. Really remarkable.
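To make the idea concrete, here is a minimal toy sketch of what "filtering a feed by emotional polarity" could look like. The word lists, scoring function, and `filter_feed` helper are all illustrative assumptions for this article; Facebook's actual ranking algorithm is proprietary and far more complex.

```python
# Toy sketch of sentiment-based feed filtering. The lexicon and the
# scoring rule are invented for illustration, not Facebook's method.

POSITIVE = {"great", "happy", "love", "win"}
NEGATIVE = {"sad", "bad", "loss", "angry"}

def sentiment_score(post: str) -> int:
    """Count positive words minus negative words in a post."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def filter_feed(posts, suppress="negative"):
    """Return only the posts that do NOT match the suppressed polarity."""
    if suppress == "negative":
        return [p for p in posts if sentiment_score(p) >= 0]
    return [p for p in posts if sentiment_score(p) <= 0]

feed = [
    "What a great day, I love it",
    "Feeling sad about the loss",
    "Lunch at noon",
]
print(filter_feed(feed, suppress="negative"))
# The clearly negative post is dropped; the positive and neutral ones remain.
```

In the 2012 study, one group's feed effectively had `suppress="negative"` applied and another's `suppress="positive"`, which is what allowed the researchers to compare the two emotional directions.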
After learning about this Facebook experiment and how it managed to manipulate the emotions of a small group of users, countless critical voices have been raised, demanding that the most powerful social network follow the standards of scientific ethics when conducting its research: at the very least, asking each of us whether or not we want to be part of any study.
The response and defense from Mark Zuckerberg's company was not long in coming: "Every time users access these programs, the company is free to apply any product improvement schemes."
Do you think this is ethical? What is your opinion?
Image: mkhmarketing, Robert Scoble