The rise of Facebook and 'the operating system of our lives'

Siva Vaidhyanathan, UVA’s Robertson Professor of Media Studies, is the director of the University of Virginia’s Center for Media and Citizenship. (Photo by Dan Addison)

Recent changes announced by social media giant Facebook have roiled the media community and raised questions about privacy. The company's updates include giving higher priority in the news feed to posts from friends and family, and a test of new end-to-end encryption inside its Messenger service.

As Facebook now boasts more than a billion users worldwide, both of these updates are likely to impact the way the world communicates. Prior to the company's news-feed algorithm change, a 2016 study from the Pew Research Center found that approximately 44 percent of American adults regularly read news content through Facebook.

UVA Today sat down with Siva Vaidhyanathan, the director of the University of Virginia's Center for Media and Citizenship and Robertson Professor of Media Studies, to discuss the impact of these changes and the evolving role of Facebook in the world. Naturally, the conversation first aired on Facebook Live.

Q. What is the change to Facebook's News Feed?

A. Facebook has announced a different emphasis within its news feed. Now of course, your news feed is much more than news. It's all of those links and photos and videos that your friends are posting and all of the sites that you're following. So that could be an interesting combination of your cousin, your coworker, the New York Times and Fox News all streaming through.

A couple of years ago, the folks that run Facebook recognized that Facebook was quickly becoming the leading news source for many millions of Americans, and considering that they have 1.6 billion users around the world, a number that's growing fast, there was a real concern that Facebook should take that responsibility seriously. So one of the things that Facebook did was cut a deal with a number of publishers to load their content directly from Facebook's servers, rather than just linking to the publishers' own servers. That made loading more dependable and faster, especially for video and on mobile.

But in recent weeks, Facebook has sort of rolled back on that. They haven't removed the partnership program that serves up all that content in a quick form, but they've made it very clear that the algorithms that generate your news feed will be weighted much more heavily toward what your friends are linking to, liking and commenting on, and what you've told Facebook over the years you're interested in.

This has a couple of ramifications. One, it sort of downgrades the project of bringing legitimate news into the forefront by default, but it also makes sure that we are more likely to be rewarded with materials that we've already expressed an interest in. We're much more likely to see material from publications and friends that we reward with links and likes, and material linked by friends with whom we've had comment conversations.


This can generate something that we call a "filter bubble." A gentleman named Eli Pariser wrote a book called "The Filter Bubble." It came out in 2011, and the problem he identified has only gotten worse since. Facebook is a prime example of that because Facebook is in the business of giving you reasons to feel good about being on Facebook. Facebook's incentives are designed to keep you engaged.

Q. How will this change the experience for publishers?

A. The change, or at least the announcement of the change, came about because a number of former Facebook employees told stories about how Facebook had guided their decisions to privilege certain things in news feeds in ways that seemed to diminish the content and arguments of conservative media.

Well, Facebook didn't want that reputation, obviously. Facebook would rather not be mixed up or labeled as a champion of liberal causes over conservative causes in the U.S. That means that Facebook is still going to privilege certain producers of media – those producers that have signed contracts with Facebook. The Guardian is one, the New York Times is another. There are dozens of others. Those are still going to be privileged in Facebook's algorithm, and among the news sources you encounter, you're more likely to see those sources than ones that have not signed an explicit contract with Facebook. So Facebook is making editorial decisions based on its self-interest more than anything, and not necessarily on any sort of political ideology.

Q. You wrote "The Googlization of Everything" in 2011. Since then, have we progressed to the "Facebookization" of everything?

A. I wouldn't say that it's the Facebookization of everything – and that's pretty clumsy anyway. I would make an argument that if you look at five companies that don't even seem to do the same thing – Google, Facebook, Microsoft, Apple and Amazon – they're actually competing in a long game. It has nothing to do with your phone, nothing to do with your computer and nothing to do with the Internet as we know it.

They're all competing to earn our trust and manage the data flows that they think will soon run through every aspect of our lives – through our watches, through our eyeglasses, through our cars, through our refrigerators, our toasters and our thermostats. So all five of these companies, from Amazon to Google to Microsoft to Facebook to Apple, are putting out products and services meant to establish ubiquitous data connections, whether it's the Apple Watch or the Google self-driving car or that weird obelisk that Amazon's selling us [the Echo] that you can talk to or use to play music and things. These are all part of what I call the "operating system of our lives."

Facebook is interesting because it's part of that race. Facebook, like those other companies, is trying to be the company that ultimately manages our lives, in every possible way.

We often hear a phrase called the "Internet of things." I think that's a misnomer because what we're talking about, first of all, is not like the Internet at all. It's going to be a closed system, not an open system. Secondly, it's not about things. It's actually about our bodies. The reason that watches and glasses and cars are important is that they lie on and carry human bodies. What we're really seeing is the full embeddedness of human bodies and human motion in these systems, and the full connectivity of these data streams to the human body.

So the fact that Facebook is constantly tracking your location, is constantly encouraging you to be in conversation with your friends through it – at every bus stop and subway stop, at every traffic light, even though you're not supposed to – is a sign that they are doing their best to plug you in constantly. That phenomenon, and it's not just about Facebook alone, is something that's really interesting.

Q. What are the implications of that for society?

A. The implications of the emergence of an operating system of our lives are pretty severe. First of all, consider that we will consistently be outsourcing decision-making like "Turn left or turn right?," "What kind of orange juice to buy?" and "What kind of washing detergent to buy?" All of these decisions will be guided by, if not determined by, contracts that these data companies will be signing with consumer companies.

… We're accepting short-term convenience, a rather trivial reward, and deferring long-term harms. Those harms include a loss of autonomy, a loss of privacy and perhaps even a loss of dignity at some point. ... Right now, what I am concerned about is the notion that we're all plugging into these data streams and deciding to allow other companies to manage our decisions. We're letting Facebook manage what we get to see and which friends we get to interact with.

