Facebook’s News Feed (the main list of status updates, messages, and photos you see when you open Facebook on your computer or phone) is not a perfect mirror of the world. But few users expect that Facebook would change its News Feed in order to manipulate their emotional state.

We now know that’s exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into the service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post especially positive or negative words themselves.

This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.

The experiment is almost certainly legal: in the company’s current terms of service, Facebook users relinquish the use of their data for “data analysis, testing, research.” Is it ethical, though? Since news of the study first emerged, I’ve seen and heard both privacy advocates and casual users express surprise at the audacity of the experiment.

Facebook defended the research in a statement: “This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

And on Sunday afternoon, Adam D.I. Kramer, one of the study’s authors and a Facebook employee, commented on the experiment in a public Facebook post. “And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it,” he writes. “Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone … In hindsight, the research benefits of the paper may not have justified all of this anxiety.” Kramer adds that Facebook’s internal review practices have “come a long way” since 2012, when the experiment was run.

The study found that by manipulating the News Feeds displayed to 689,003 Facebook users, it could affect the content that those users posted to Facebook. More negative News Feeds led to more negative status messages, just as more positive News Feeds led to more positive statuses.

As far as the study was concerned, this meant it had shown “that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” It touts that this emotional contagion can be achieved without “direct interaction between people” (because the unwitting subjects were only seeing each other’s News Feeds). The researchers add that never during the experiment could they read individual users’ posts.

Two interesting things stuck out to me in the study. The first? The effect the study documents is very small, as little as one-tenth of a percent of an observed change. That doesn’t mean it’s unimportant, though; as the authors note, given the massive scale of a social network like Facebook, even small effects can have large aggregated consequences. Let’s look at two hypothetical examples of why this is important.
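A concrete way to picture the measurement: the paper scored status updates by counting emotion words against the LIWC word lists, and its outcome was the percentage of positive and negative words that users subsequently posted. The sketch below is a toy version of that word-count scoring, not the study’s actual code; the mini-lexicons and example posts are invented for illustration, and the real LIWC dictionaries contain thousands of terms.

```python
# Toy LIWC-style scorer: classify words against fixed positive and negative
# lexicons and report what percentage of all words are emotional.
# These tiny word sets and sample posts are invented for illustration only.

POSITIVE = {"happy", "great", "love", "awesome", "fun"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "lonely"}

def emotion_word_rates(posts):
    """Return (positive %, negative %) across every word in the given posts."""
    total = pos = neg = 0
    for post in posts:
        for raw in post.lower().split():
            word = raw.strip(".,!?")
            total += 1
            pos += word in POSITIVE
            neg += word in NEGATIVE
    return 100 * pos / total, 100 * neg / total

# Hypothetical status updates from two feed conditions.
positive_feed_posts = ["Had a great day, love this weather!",
                       "That concert was awesome fun."]
negative_feed_posts = ["Feeling sad and lonely today.",
                       "That movie was terrible."]

print(emotion_word_rates(positive_feed_posts))  # -> (33.33..., 0.0)
print(emotion_word_rates(negative_feed_posts))  # -> (0.0, 33.33...)
```

Because the scoring is fully automated, no person has to read any post to compute the aggregate rates, which squares with the researchers’ claim that they never saw individual users’ posts. It also makes the effect size legible: a one-tenth-of-a-percent shift in these percentages is tiny for any one user, but aggregated over the volume of posts Facebook handles, it corresponds to a very large absolute number of words.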