Overstating and Understating the Influence of Facebook

What does it take to get me to fire this blog up again? People going on Twitter to share links to blog posts discussing the ethical ramifications of Facebook’s “emotional manipulation” study. If you have read my prior posts, you may have noticed that one of my blogging interests is examining the process of persuasion.

Because persuasion and media influence on public opinion are rare topics in sociology, I hope I can use my interdisciplinary background to help explain how the concerns over Facebook’s experiment are simultaneously overblown and understated. Just to warn you now, this is a long post.

Facebook’s potential power and influence come from its “News Feed,” an algorithm designed to filter out at least some postings while highlighting others. The News Feed does not delete my posts, but it may hide at least some of them from my friends’ pages, without their knowledge. Facebook’s experiment is alarming not because of the results (which were minimal), but because it showed the corporation is willing to alter its News Feed algorithm to try to manipulate its users in some way. This raises an important question, which others have not discussed:

Why do we put up with the news feed algorithm in the first place?

If we think of Facebook as a “news” or “media” company, then the existence of a news feed is easier to understand. Every news organization uses some sort of filtering criteria to decide which events will be treated as news and which events are not “newsworthy.” These decisions are inevitable. Newspapers and television broadcasts have limited space for publication. The Internet removes most space limits because server space is cheaper, but there is still limited labor. With limited labor, we can accept that news organizations have to pick and choose which events to feature and which ones not to.

If we are interested in the news, we can have at least some idea of which news organizations will tend to feature or ignore which topics. For example, anyone here in Los Angeles knows local TV stations will drop anything for a car chase. The Los Angeles Times will split its attention between local, state and national issues but won’t emphasize car chases. Websites specialize, so their selection criteria are easy for readers to understand. We may not fully understand how news organizations make decisions – even those of us who study these decisions for a living – but we understand decisions must be made.

Facebook does not have to make the same filtering decisions as media companies that produce their own content. Facebook has the server space to store all of our posts somewhere (along with a scary amount of user information). We do the labor of producing content. Facebook could share everything on our news feeds without using an algorithm. So why not protest Facebook’s News Feed algorithm and demand that everyone be shown every post, in strict chronological order? Facebook used to offer this option, but when I taught social networks last year my students had no idea it ever existed.

Changing to a new, non-filtering news feed would probably disorient users more than any of Facebook’s other changes. As news consumers, we are all used to having someone else do some of the tough work of sorting through a wide range of potential stories to pick out some of the highlights. With no algorithm, we would have to do this filtering every time we use Facebook. Most of us probably do some of this filtering in our heads anyway, but we would have to do much more of it without a News Feed algorithm, sorting out the “newsworthy” posts from things we don’t care as much about.
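
To make the contrast concrete, here is a minimal sketch in Python of the difference between a strict chronological feed and a scored, filtered feed. Everything in it is hypothetical: the post fields, the weights and the cutoff are invented for illustration, since Facebook’s actual algorithm is proprietary.

```python
import time
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: float  # seconds since the epoch
    likes: int = 0
    comments: int = 0

def chronological_feed(posts):
    """Show every post, newest first -- nothing is ever hidden."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def scored_feed(posts, limit=10, now=None):
    """Rank posts by an engagement score and show only the top few.
    The weights are invented; the point is that anything below the
    cutoff is silently hidden from the reader."""
    now = time.time() if now is None else now

    def score(p):
        # Likes and comments are the only positive signals (there is
        # no "sad" button), and older posts slowly decay.
        age_hours = (now - p.timestamp) / 3600.0
        return 2.0 * p.likes + 3.0 * p.comments - 0.5 * age_hours

    return sorted(posts, key=score, reverse=True)[:limit]
```

The structural point is in the final line: once posts are ranked by a score instead of by time, everything below the cutoff simply never appears, and readers have no way of knowing what they missed.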

Why is “newsworthiness” different on Facebook?

On Scatterplot, Dan Hirschman highlighted the oddity of Facebook calling their algorithm a “News” Feed. What is “newsworthy” is a notoriously difficult question, even for journalists. In my own research, I find journalists often rely on how a news event is planned in advance as a sign of its importance, instead of judging events based solely on what gets said at the event or the logistical difficulty of covering the event. The question of what is “newsworthy” is even more ambiguous on social media, because different people can use the same site for different purposes. Some people may use Facebook in a similar way to journalists, sharing information about the wider social world. Other people may use Facebook to share pictures of their babies. Baby pictures may not be interesting to a large number of people, but for many of us sharing our personal stories can be the most important “news” of the day.

If Facebook is going to do something to filter which postings appear on our news feeds, it has to balance a wide range of competing interests among users. More than any of us, Facebook’s employees are probably keenly aware of which users employ which standards of newsworthiness. (I assume my profile is sitting on a server somewhere.) As Jenny Davis wisely points out, some of these standards are essentially hardwired into the Facebook interface. The presence of a like button and the relative difficulty of expressing certain emotions like sadness will bias Facebook content in favor of positive emotions. However, many aspects of Facebook’s News Feed algorithm are proprietary and constantly shifting. Roughly a year ago, Facebook made major changes to how it treated posts from media companies, as compared to individuals’ posts (particularly user-generated meme graphics). Sites like Upworthy surged as they took advantage of Facebook’s new algorithm.

By this point, the fundamental ambiguity of “newsworthiness” plus the proprietary nature of Facebook’s News Feed algorithm leads us to four basic points:

  1. People probably prefer having a News Feed algorithm that filters posts to seeing everything in strict chronological order.
  2. There is no universally “correct” way to program a News Feed algorithm, because people want different things.
  3. Any News Feed algorithm creates winners and losers – a power Facebook wants to keep for itself rather than hand over to users.
  4. We know less about how Facebook screens out posts than we do about nearly any other media organization.

Are alarm bells going off in your brain? They should!

Facebook could have considerable power by manipulating the News Feed in various ways. What’s worse, what we know about traditional journalism and how news spreads suggests that most people are more than happy to rely on others to sort through potential news stories and pick out the highlights. Most of the concerns about Facebook have been shared via blogs and Twitter. Now think of your friends who are active or semi-active on Facebook but don’t use Twitter. Are they concerned about Facebook? Mine aren’t. So what can Facebook do with this power, which many users are more than happy to tacitly consent to? Here’s Zeynep Tufekci:

To me, this resignation to online corporate power is a troubling attitude because these large corporations (and governments and political campaigns) now have new tools and stealth methods to quietly model our personality, our vulnerabilities, identify our networks, and effectively nudge and shape our ideas, desires and dreams. These tools are new, this power is new and evolving. It’s exactly the time to speak up!

That is one of the biggest shifts in power between people and big institutions, perhaps the biggest one yet of 21st century. This shift, in my view, is just as important as the fact that we, the people, can now speak to one another directly and horizontally.

… [Snip 1 paragraph]

I’ve been writing and thinking about this a lot. I identify this model of control as a Gramscian model of social control: one in which we are effectively micro-nudged into “desired behavior” as a means of societal control. Seduction, rather than fear and coercion are the currency, and as such, they are a lot more effective.

Go read the whole article at Medium. No one has a monopoly on relevant knowledge to bring to this discussion. My initial goal in writing this post was to address some of the assumptions of Tufekci’s Gramscian model, along with Hirschman’s questions at Scatterplot about how news and newsworthiness operate. After spending years working with people in political communication, it is clear how little sociological theory on persuasion and public opinion has progressed compared to work in the other social sciences. Tufekci’s Gramscian model proposes that individuals are extremely susceptible to large organizations with access to large amounts of user data and some access to digital media to utilize that data. As Tufekci argues:

Today, more and more, not only can corporations target you directly, they can model you directly and stealthily. They can figure out answers to questions they have never posed to you, and answers that you do not have any idea they have. Modeling means having answers without making it known you are asking, or having the target know that you know. This is a great information asymmetry, and combined with the behavioral applied science used increasingly by industry, political campaigns and corporations, and the ability to easily conduct random experiments (the A/B test of the said Facebook paper), it is clear that the powerful have increasingly more ways to engineer the public, and this is true for Facebook, this is true for presidential campaigns, this is true for other large actors: big corporations and governments.

How can people resist a large actor with this much power?

Unfortunately, this is where Tufekci’s argument (and many other arguments about the power of Facebook) overstates its claims. The implication of her argument, like Gramsci and a host of other critical theories in communication, is that people have relatively few resources to resist the subtle and corrupting influence of corporate power. However, Tufekci and a wide range of other critics are not simply accepting Facebook’s power. We are not blank slates. We all interpret new political information based on our previously held beliefs. Poll people after a debate, and partisans on both sides will tend to say their candidate won. Ask people to look at a news story, and people with strong opinions on any side will tend to feel the story is biased against them. Ask people what they think of Facebook’s study, and their response will probably be based on their opinions about corporations, the use of power, and data privacy.

The largely negative reactions to Facebook’s study show how difficult it is to change people’s minds, particularly when they have already formed their opinions on a topic. Consider one of the more common arguments downplaying the outrage over the Facebook study, from Tal Yarkoni: “So the meaningful question is not whether people are trying to manipulate your experience and behavior, but whether they’re trying to manipulate you in a way that aligns with or contradicts your own best interests.” Yarkoni goes on to argue that tech companies need regular experimentation to improve their user experiences, and these improvements could bring real benefits. I can see both sides of the argument. Having some version of a News Feed algorithm probably does align with most users’ best interests, insofar as users want to avoid the cognitive effort of sorting through feeds to find the postings they care about most. On the other hand, years of reporting experience have taught me that just about everyone who produces information for public consumption is trying to manipulate you in some way. That’s what makes us angry and unwilling to trust!

Even when the manipulation of information is subtle, people are very sensitive to new information that does not fit their previously held beliefs. We aren’t cultural dopes. The number of websites where someone can post reactions and critiques has grown exponentially. Facebook sharing plays an increasingly important role in spreading news stories and blog posts, but it does not have a monopoly on social media sharing or traditional media content (at the moment). As a result, there are ways for critiques of Facebook to spread to a large segment of the audience. Any direct or overt censorship of particular causes is very risky, because audiences have the ability to spot the censorship and tell others about it, as with this blog post and the others I have linked to.

“Facebook Manipulates Users’ Emotions” is a great headline that prompts people to think of a lot of nightmare scenarios. However, the emphasis on stealthy, subtle emotional manipulation makes it hard for people to understand the most powerful and plausible effect of Facebook’s News Feed algorithm: the ability to influence which topics we think are worthy of debate.

What is “agenda setting”?

Convincing people to think a particular way about a particular issue can be extremely difficult, but there are other things Facebook could do to influence public opinion. Several people have already discussed a different Facebook study, in which giving people information about where to vote, and showing them that friends had already logged in to say they voted, led to an additional 340,000 votes being cast in 2010. Selectively posting these messages in particular parts of the United States could influence elections, but doing so would probably cause irreparable harm to Facebook’s reputation with users.

The easiest way to influence people is to convince them that a particular issue is important. When people see a particular topic repeatedly in the news, they are more likely to think it is important. This is known as “agenda setting.” Setting the agenda is a limited form of influence: getting you to think about something isn’t as powerful as getting you to agree with me on a particular policy. However, campaigns know agenda setting matters, and they try to define the agenda by emphasizing particular issues. Because campaigns often rely on news organizations for publicity, they often fight with those organizations for control of the agenda. As people increasingly share political ideas online, it is unclear whether people respond to the agenda they see on social media in the same way as traditional media. However, Facebook’s News Feed could have tremendous power to set the agenda, as I describe with the following three hypothetical initiatives:

  1. Facebook brings the world closer together. Imagine Facebook wanted to try a World Cup themed branding exercise by saying it wanted to make it easier for users to follow what is going on across the world. This could be presented as a public service as well (albeit a public service many users wouldn’t want). If everyone’s news feed is altered to emphasize various kinds of international stories, then foreign policy could become a more salient issue. Americans currently prefer Republican positions on foreign policy. Therefore, an increase in international coverage in people’s news feeds could hurt Democrats in the fall. Alternatively, an increase in international coverage could highlight the humanitarian crisis in Central America and the sudden influx of undocumented children migrating north. Increased attention would make migration a more salient issue for voters. Given Facebook’s demographics, users are more likely to favor Democrats’ proposals for migration reform and be frustrated with Republicans’ lack of movement on the issue.
  2. Facebook connects people to non-partisan guides to candidates’ recent accomplishments. Assume Facebook wanted to make sure every user was better informed by connecting them to a highly reputable non-partisan guide listing what candidates have done in prior elected office. This could be great PR for Facebook, particularly if the company wants to alleviate concerns that it has the power to swing elections. But this service would also swing elections. Besides reminding a disproportionately Democratic voting base about the election, a guide about candidates’ recent accomplishments could suggest that the goal of government is to do things. The guide may remind people about the Republican party’s reputation as the party of no, strengthening some users’ dislike of the Republicans. (This is “priming” more than “agenda setting.”)
  3. Facebook cares how you feel about the issues. Since Facebook knows users dislike non-emotional posts, it could skew the news feed to emphasize people adding emotional reactions to stories they share, as opposed to direct, non-emotional storytelling. This would change the news feed to increase the weight of friends sharing stories, particularly if they add comments that include a specified set of emotional terms. A mechanical counting algorithm wouldn’t get in people’s heads as they write on Facebook, but it’s good enough (see the sketch after this list). This algorithm would increase the salience of issues that tend to provoke emotional responses, such as race, gender and religion. Other issues would be less likely to appear in our news feeds.
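
To see how crude such a “mechanical counting algorithm” could be and still shift the agenda, here is a minimal sketch in Python. The word list, the boost factor and the weighting scheme are all invented for illustration; nothing here reflects Facebook’s actual code.

```python
import re

# A hypothetical, hand-picked lexicon of emotional terms. A production
# system would use a much larger dictionary, but the principle is the same.
EMOTIONAL_TERMS = {
    "outraged", "furious", "heartbreaking", "disgusting",
    "amazing", "inspiring", "terrifying", "unbelievable",
}

def emotion_score(comment_text):
    """Count how many emotional terms appear in the sharer's comment."""
    words = re.findall(r"[a-z']+", comment_text.lower())
    return sum(1 for word in words if word in EMOTIONAL_TERMS)

def boosted_weight(base_weight, comment_text, boost=1.5):
    """Multiply a shared story's feed weight for each emotional term the
    sharer added. The counter never gets in people's heads -- it just
    rewards whoever happens to use words on the list."""
    return base_weight * (boost ** emotion_score(comment_text))

# A story shared with "This is heartbreaking and unbelievable" would get
# weight 1.0 * 1.5**2 = 2.25, while the same story shared with a dry,
# factual comment keeps its base weight of 1.0.
```

Even a counter this naive would systematically promote stories whose sharers react emotionally, which is all the agenda-setting mechanism described above requires.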

In many ways, the third hypothetical may be close to what Facebook has been doing for the past year. Facebook’s algorithms favor sites like Upworthy, which pull emotional levers in click-bait headlines. They also give a subset of users what they appear to want: a place to share emotional reactions to political events and receive validation from friends. Even without Facebook, some of my ongoing research suggests bloggers may prefer copying phrases from new media sites that surround those phrases with emotional reactions, rather than from traditional media elites who couch phrases in the rituals of objectivity.

In 2008, John McCain’s campaign manager Rick Davis said “this campaign is not about the issues.” Washington insiders were stunned by the “gaffe.” Thanks to Facebook’s News Feed and the kind of political news many users prefer, Davis may have been a prophet, anticipating a world where social media helps users set the agenda for elections based on their emotional responses to identity politics.


About Noah Grand

PhD in Sociology. I use statistics to predict news coverage. And home runs.

8 responses to “Overstating and Understating the Influence of Facebook”

  • grahamalam

    Fantastic. Really thoughtful.

    And that’s a real question: are we becoming more deliberative-brained, culture-creating, and so forth with the internet, or are we regressing to identity political tribalism and mechanical solidarity?

    At first blush, there appears to be a ton more mechanical solidarity building, outrage, emotional click bait, etc. on the internet than in real life.

    But this is potentially because people are climbing social walls and bumping up against social boundaries that they were not before, and that the *beginnings* of a deliberative, dispassionate, and reasoned conversation often start with a lot of screaming and tears.

    • Noah

      Great questions. I’d say yes to both: people are becoming more culture-creating with the Internet (or maybe we just have a lot more tools to share the culture we create). At the same time, there are incentives for people who want to use the Internet to engage in political tribalism and homophily.

It’s also important to remember that the idea of a high-minded, deliberative democracy has always been more of an ideal than an accurate description of American political history. As nasty as modern politics can be at times, we don’t have politicians shooting each other in duels anymore.

      • grahamalam

That’s exactly right about the Greek ideal of political engagement being an ideal. That got turned up full volume during the Enlightenment, and then sociologists realized, “whoops, people aren’t nearly that rational!”

        But I think it’s still a useful ideal to work towards, and something that does become more real as the social division of labor increases, technology gets cheaper, and human capital endowments increase.

        I always relate the internet to one of Elinor Ostrom’s collective action studies. The water table was disappearing in some California town. People got together to talk about it, and at first they swore at each other or whatever, and then finally calmed down and hammered out a collective action agreement.

        I think people look at Youtube comments and think “wow the internet is making us all stupid and despicable,” but really that’s just the beginning of a conversation that was silent before and is trending upward….

  • Brainwashing Is Not A Thing

    […] Grand, who knows much more about news media, culture, and sociology than me, has a fantastic post on these matters at his […]

  • Gilbert

    Very thought-provoking. It will be fascinating to see if any of those initiatives happen in some form. And the comparison of Facebook to a media company was interesting, especially since so many media outlets try to use Facebook to boost their own audiences.

  • OK Cupid and Some Limits on Outrage | Science of News

    […] In terms of comparing the two sites, I think there are major limitations. OK Cupid is well known for blogging about user experience and patterns of user behavior. In a market with a wide range of dating websites, these types of posts are a major part of how they differentiate themselves in the market and draw attention to the site. Some level of caveat emptor applies. A majority of users may not have been aware of how Facebook’s News Feed algorithm hides various posts, so it’s harder for the consumer to be aware of the risks. (See my prior post on Facebook here) […]

  • #Ferguson on Twitter and Facebook | Science of News

    […] posts on a highly emotional issue like Ferguson. Emotional posts attract more participation. Remember how Facebook experimented on our News Feeds to test this theory? Regardless of whether Facebook looked to specifically alter the frequency at which posts on […]
