
"Everything We Know About Facebook’s Secret Mood Manipulation"

Christopher O'Brien

Back in the Saddle Aginn
Staff member
[As Facebook gobbles up more & more of folks' time, this article hints at the possible scope of widespread cultural manipulation bought & paid for by your friendly PTB — chris]

By Robinson Meyer/The Atlantic

The Atlantic Article HERE:

Facebook’s News Feed—the main list of status updates, messages, and photos you see when you open Facebook on your computer or phone—is not a perfect mirror of the world.

But few users expect that Facebook would change their News Feed in order to manipulate their emotional state.

We now know that’s exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves.

This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.

The experiment is almost certainly legal. In the company’s current terms of service, Facebook users relinquish their data for “data analysis, testing, [and] research.” Is it ethical, though? Since news of the study first emerged, I’ve seen and heard both privacy advocates and casual users express surprise at the audacity of the experiment.

We’re tracking the ethical, legal, and philosophical response to this Facebook experiment here. We’ve also asked the authors of the study for comment.

Author Jamie Guillory replied and referred us to a Facebook spokesman. Early Sunday morning, a Facebook spokesman sent this comment in an email:

“This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

What did the paper itself find? The study found that by manipulating the News Feeds displayed to 689,003 Facebook users, it could affect the content that those users posted to Facebook. More negative News Feeds led to more negative status messages, and more positive News Feeds led to more positive statuses.

As far as the study was concerned, this meant that it had shown “that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” It touts that this emotional contagion can be achieved without “direct interaction between people” (because the unwitting subjects were only seeing each others’ News Feeds). The researchers add that never during the experiment could they read individual users’ posts. [Yeah, sure—whatever you say...]
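For readers curious what this kind of manipulation looks like mechanically: the researchers scored posts by counting emotional words and then omitted a fraction of positive or negative posts from each subject's feed. Here is a rough illustrative sketch of that style of word-count filtering. The word lists, the scoring rule, and the function names are invented for illustration and are not taken from the paper (which used the LIWC word-category system):

```python
# Illustrative sketch only: score posts by counting positive vs. negative
# words, then build a feed that suppresses one emotional polarity.
# Word lists and rules here are made up, not the study's actual method.

POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "angry", "terrible", "hate", "awful"}

def classify(post: str) -> str:
    """Label a post by comparing counts of positive and negative words."""
    words = post.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress):
    """Return a feed with posts of one emotional polarity omitted,
    analogous to how the experimental feeds withheld some posts."""
    return [p for p in posts if classify(p) != suppress]

feed = [
    "So happy today, what a wonderful morning!",
    "This is terrible news, I am so sad.",
    "Meeting at noon.",
]
print(filter_feed(feed, suppress="negative"))
```

The point of the sketch is how little machinery is needed: no human ever reads an individual post, yet the aggregate emotional tone of what each user sees shifts measurably.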

Two interesting things stuck out to me in the study.

REST COVERAGE OF FB PSYOP HERE:
 
I don't trust the almighty technomasters who increasingly run our world. The only protection against them is even smarter technomasters, whom we in turn cannot always trust. And it's that tip-of-the-iceberg thing again.

I recall having a friendly conversation with a stranger in a part of the American Southwest infamous for its spooky activity. The guy was apparently retired from doing IT work for the government. He was vehemently down on social media and its long-term implications for individual freedom. So much so that we sort of wrote him off as just a grumpy eccentric.

But lately, I'm not so sure.
 

Been saying this for years: "Get Away From Facebook."

10 Reasons To Delete Your Facebook Account | Business Insider

Just for starters. Redpill yourselves, people, by reading more... I've never had a Facebook account and never will, as I could see this shit coming a mile away.
 
I used facebook once for a total of two weeks at the insistence of old friends. When at least two dead people, people I knew had died, tried to "friend" me, I had enough, never looked back, haven't been interested since. Obvious NSA op is obvious.

This Is Why You Should Delete Facebook Permanently


FB Passive Listening: Passive listening will soon be a feature for Facebook app during status updates | Ars Technica
FB Reads Your Texts: Facebook Reading Android Users’ Texts? Well, Hold On | TechCrunch
NSA FB Servers: Snowden Docs Expose How the NSA "Infects" Millions of Computers, Impersonates Facebook Server | Democracy Now!
FB Privacy Listening Issues: Facebook Wants To Listen In On What You're Doing - Forbes
FB Stores Your Recordings: Facebook Microphone Update To Store Data: Social Media Giant Confirms New Feature Will Aggregate Information
FB Silently Updates TOS: Facebook to rip search opt-out from under those who were using it | Ars Technica
FB Organic Reach Throttled: Facebook Organic Reach Plummeting | Social Media Today
FB Fraud: https://www.youtube.com/watch?v=oVfHe...
FB Leaks Millions of Private Messages: How Facebook Explains User Data Bug That Leaked 6 Million People's Information
FB and FBI: Cloudup
Reddit Thread on FB Listening: Facebook has been forced to defend a “creepy” new feature that allows it to activate your smartphone’s microphone and listen in. The feature turns on the phone’s mic and picks up on what is happening, such as music or a TV playing in the background. : technology
Users Slam FB Listening: Users slam ‘creepy’ new feature that allows Facebook to listen in | News.com.au
How Audio Fingerprints Work: http://goo.gl/ndMlyh
The Right to Privacy: Griswold v. Connecticut - Wikipedia, the free encyclopedia
 
I think I still have a FB account, but I didn't see what the big deal was, still don't. I haven't used it in so long it's probably deleted or something.
 
Is it even possible to delete your facebook account? I decided it wasn't, and didn't bother, but cleaned out all of the MALWARE facebook wrote to my registry and left strewn around in system folders.
 

Funny, this reminds me of a friend who was worried about spyware on her computer. She had a Facebook icon on her desktop. Well, there's your spyware problem.
 
FB makes adding anything to your account easy but deleting or discontinuing anything a real PITA. I still have an account. But I only visit it when social protocol dictates. I post nothing these days. I would really like to DC it altogether.
 
This post by Chris prompted me to permanently delete my Facebook account just now.

I'm starting a new job on Monday. With no Facebook account, I won't have to deal with the inevitable slew of new friend requests from new co-workers trying to "find out more about me". I can start fresh.

Before deleting, I did employ this feature to download all my Facebook stuff...

How can I download my information from Facebook? | Facebook Help Center | Facebook

My download resulted in a 60 megabyte .zip file.
 
I have never had a FB account, but Google pretty much has full access to my emails, docs, web history, etc. I do what I can, including zip-encrypting my uploads to Drive.

But what I find funny here is that, at least from my understanding, FB wanted to determine whether emotional contagion is a real thing, which if I heard correctly is pretty disingenuous. OF COURSE emotions are contagious. I see it nearly every week in my job. Whenever you get a group of people, and it doesn't have to be a large one, and one individual is experiencing strong emotions, good or bad, you WILL see other people soak it up like a sponge and project those same emotions even though they were previously calm. I'd be willing to bet that even if you didn't give a rat's ass about sports, if you watched yesterday's US-Belgium soccer... excuse me, futbol... match, you would have gotten sucked up by the surplus of emotions in the room. To undertake such an experiment in order to determine whether this phenomenon exists is a joke; you would have to conclude it was done in order to go about manipulating moods.

Recommended reading: Outbreak! The Encyclopedia of Extraordinary Social Behavior by Hilary Evans

Outbreak! The Encyclopedia of Extraordinary Social Behavior: Hilary Evans, Robert E. Bartholomew: 9781933665252: Amazon.com: Books

This, along with The People's Chronology by James Trager, is among my indispensable reference books.
 
What Facebook did here is actually totally illegal in most countries around the world, if I am understanding this thing correctly. If psychology is a branch of medicine (I believe it is), then the Nuremberg protocols on medical experimentation, adopted by most of the world after WWII (but never enforced by the US), make it completely illegal to experiment on people without their knowledge and consent.

I believe Facebook is liable to charges of war crimes and crimes against humanity for doing this stupid crap.
 
Of course it's illegal.

We live in a lawless society where business and government leaders are exempt from laws.

In a normal family, the leader/father sets the example of good conduct for those he leads. In a diseased, dysfunctional family, the father sets a bad example.

The long-term results of bad conduct are completely known and predictable.
 

... As far as the study was concerned, this meant that it had shown “that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.” It touts that this emotional contagion can be achieved without “direct interaction between people” (because the unwitting subjects were only seeing each others’ News Feeds). The researchers add that never during the experiment could they read individual users’ posts ...

The idea of “emotional contagion” in the context of online, text-based communication is interesting. I would suggest there is substantial room for what might be termed “false positives”: emotional contagions that emerge when the original text combines with the recipient's own emotional DNA and mutates into a new state that bears little resemblance to that of the original writer. Yet once such a contagion has emerged, it can then grow into a full-blown epidemic. And then, of course, let's not forget the Facebook security conspiracy theory ...

CIA Facebook Connection

 
Funny. Young people are abandoning Facebook in droves. Why? Facebook pays old people to spy on them...

Facebook hiring police to spy on students

"A report published in January of this year by iStrategy Labs determined that, in the three years since January 2011, the number of Facebook users between ages 13 and 17 had fallen by 25.3 percent. Users in high school over the same period fell by 58.9 percent and college students by an astounding 59.1 percent.

...If a youth in Menlo Park decides to skip school and broadcast the news on Facebook only to be visited by a police officer with a secret account, it’s hard to picture him being excited to share another sensitive piece of information about himself on the site again.

...Ferguson told the Journal she recently visited a boy’s home after using a secret account to view his Facebook,"
 