As the world buckles under the weight of forced economic shutdowns, major civil unrest, and massive social dissonance, there seems to be a glaring cause behind all these wild fluctuations in society’s collective adaptability – Social Media.
Yes, we already somewhat know of the problem, but most forget which force over the past 10 years has constructed our current cultural derangement. It took shape around the time the phenomenon known as Social Media was being built and tested. The idea of social media was still new 10 years ago. In 2009, when Facebook started to gain the largest userbase on the planet, the concept and inevitable side effects of Social Media had not been discussed much as a society, even though in the days of Myspace, online stranger danger was a common concern for regular people. But Facebook found itself at the most opportune time in technological history, helping usher in the age of Big Data, a new phenomenon surrounding the ever-increasing size of centralized user data. Operating legally, according to its terms of service, the company was able to monetize its invention and grew at a rate that went unnoticed until recent years.
At first the public wasn’t very aware of the kinds of data the newly dubbed Social Media site was collecting. Funnily enough, around 2014, the Guardian published an article outlining something Facebook had done unbeknownst to its users. What should have raised serious concerns at the time is almost the tolerated norm in 2020, and no one seems to remember what exactly sparked a lot of this collective unease. We know of the algorithms that feed us curated content and the shadow profiles of people who have never signed up to the site, but the answer is more subversive than what we remember.
In 2014, the Guardian wrote:
Facebook says the huge psychological experiment it secretly conducted on its users should have been “done differently” and announced a new set of guidelines for how it will approach future research studies.
In a blogpost on Thursday, Mike Schroepfer, chief technology officer, said the company had been “unprepared” for the negative reactions it received when it published the results of an experiment in June.
Facebook published the results of a 2012 study in the Proceedings of the National Academy of Sciences. Unbeknown to users, Facebook had tampered with the news feeds of nearly 700,000 people, showing them an abnormally low number of either positive or negative posts. The experiment aimed to determine whether the company could alter the emotional state of its users.
“The experiment aimed to determine whether the company could alter the emotional state of its users.”
When I first read this article, the point of the experiment wasn’t obvious. While nefarious-sounding, it was still a new concept, and the full extent of this new technology’s influence on society was unknown.
Six years later, however, I can assume the tech giant kept experimenting with the technology to the point where it could manipulate people into an emotional frenzy. The billion-dollar company has the power to alter a user’s emotional state, unknown to the user, by using an algorithmically served News Feed to manipulate what content they see and in what order they see it. One amazing aspect of Big Data is that it pairs directly with a newly matured branch of artificial intelligence – machine learning.
This wasn’t the only experiment, however. Facebook conducted other experiments that were made public by Forbes in 2014 as well. As reported, the researchers, led by data scientist Adam Kramer, found that emotions were contagious. “When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred,” according to the paper published by the Facebook research team in the PNAS. “These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
The experiment ran for a week from January 11th to the 18th in 2012, during which the hundreds of thousands of Facebook users unknowingly participating may have felt either happier or more depressed than usual, as they saw either more of their friends posting ’15 Photos That Restore Our Faith In Humanity’ articles or despondent status updates about losing jobs, getting screwed over by X airline, and already failing to live up to New Year’s resolutions. “*Probably* nobody was driven to suicide,” tweeted one professor linking to the study, adding a “#jokingnotjoking” hashtag.
They also note that when they took all of the emotional posts out of a person’s News Feed, that person became “less expressive,” i.e. wrote fewer status updates. So prepare to have Facebook curate your feed with the most emotional of your friends’ posts if they feel you’re not posting often enough.
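The design described above – withholding emotionally loaded posts from a feed, then measuring the tone of users’ own subsequent posts – can be sketched in a few lines. Everything here is illustrative: the function names, labels, and probabilities are my assumptions, not Facebook’s actual code.

```python
import random

# Illustrative sketch of the feed-filtering design: label posts
# positive/negative, withhold a fraction of one label from the
# treatment group's feed, then compare the tone of users' own posts.
# All names and numbers are assumptions, not Facebook's actual code.

def filter_feed(feed, withhold_label="positive", withhold_prob=0.5, rng=None):
    """Return a copy of the feed with some posts of one label withheld."""
    rng = rng or random.Random(0)  # fixed seed for a reproducible sketch
    return [
        post for post in feed
        if post["label"] != withhold_label or rng.random() >= withhold_prob
    ]

def mean_positivity(posts):
    """Fraction of posts labeled positive."""
    return sum(p["label"] == "positive" for p in posts) / len(posts) if posts else 0.0

feed = [{"label": "positive"}] * 10 + [{"label": "negative"}] * 10
treated = filter_feed(feed)
# The treated feed now skews negative; the study then compared the
# positivity of subsequent posts between treatment and control users.
```

The point of the sketch is how cheaply this is done at feed-ranking time: the user never knows anything was withheld, only that their feed feels different.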
Another experiment involved researchers looking at over 200,000 photo comments posted to the site containing Snopes.com links. Snopes is an inconsistent rumor-debunking site, so a Snopes link would indicate the shared photo was a case of someone being duped and then erroneously passing it on to their friends – such as people claiming a photo showed Trayvon Martin at 17, or that Obamacare would tax non-medical items like clothes and rifles (even though the debunk was itself debunked). The researchers then looked at how viral those photos went.
They also studied users who claimed “Facebook Offers” — such as an offer for free lace panties from Victoria’s Secret — and were put into two groups. One group had the offers they claimed auto-shared so that friends would see it in their News Feeds. People in the other group were graciously given a button to click if they wanted to broadcast the offer claim to their friends.
And in 2010 they offered test subjects an ‘I Voted’ button at the top of their News Feeds and information on how to find their polling place. Some users also saw the names of their friends who had clicked the button. The control group got no prompt to vote. Then the researchers checked public voting records to see which of the millions actually voted.
This experiment found that peer pressure works. People were more likely to click the “I Voted” button if their friends’ names appeared alongside it. When researchers checked actual voting records, they found that people who got the “I Voted” message in their News Feed were 0.39% more likely to have actually voted, and more likely still if their friends’ names appeared. Those are minuscule percentages, but the researchers think their experiment resulted in 340,000 votes that wouldn’t otherwise have happened.
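The arithmetic behind a figure like that 0.39% lift is simple to sketch. The group sizes below are invented for illustration; only the 0.39-percentage-point lift comes from the reporting above.

```python
# Back-of-the-envelope sketch of the turnout experiment's arithmetic.
# The group sizes are hypothetical; only the 0.39-point lift is from
# the reporting above.

def turnout_lift(voted_treatment, n_treatment, voted_control, n_control):
    """Absolute difference in turnout rate, in percentage points."""
    rate_treatment = voted_treatment / n_treatment
    rate_control = voted_control / n_control
    return (rate_treatment - rate_control) * 100

def extra_votes(lift_pct, exposed_users):
    """Votes attributable to the treatment at a given lift."""
    return lift_pct / 100 * exposed_users

# Hypothetical groups of one million users each:
lift = turnout_lift(604_390, 1_000_000, 600_490, 1_000_000)  # ~0.39 points

# Ten million exposed users at that lift:
votes = extra_votes(0.39, 10_000_000)  # ~39,000 extra votes
```

Scaled up to the tens of millions of users who saw the message, a lift this small can plausibly account for hundreds of thousands of votes, which is the scale of the 340,000 figure cited above.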
The product of this manipulation over the past few years has, in my opinion, led to unchecked, radical activism brought on by conspiracies and toxic ideologies such as Critical Race Theory. The extremism we’ve seen in recent years has led to radicals organizing on these platforms without any moderation. It has led to once-coherent social media activists losing it on election night. The PNW Youth Liberation Front, a far-left activist group associated with ANTIFA, sounds like it has had a psychotic break. And it isn’t the only one acting out.
Even AOC, a sitting member of the House, is actually calling for lists to be made of Trump’s supporters – something you only see in communist China or Stalin’s Russia. There is even a website dedicated to making that list.
The experiment was designed to assess whether more positive or negative comments in a Facebook newsfeed would impact how the user updated their own page. Facebook used an algorithm to filter content. Researchers found those shown more negative comments posted more negative comments and vice versa.
“It is clear now that there are things we should have done differently,” Schroepfer writes. “For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it.”
So while the country moves on from the election and the left continues to spiral into a psychosis, we need to work hard at informing them of how they’ve been manipulated and influenced by these terrifying technologies. Years of emotional manipulation, with the mainstream media bolstering the narrative, have swept half the world into a psychotic, conspiracy-filled fervor that won’t stop until the system itself collapses.
Sources of original stories: