The integration of social media and technology into our daily lives has created a dependence on these platforms and the personal devices that work alongside us throughout the day. Portable devices, personal appliances, and operating systems have become so involved in everything we do that there is rarely a moment when our attention is not directed towards, or captured by, one of these systems.
The attention economy has forced companies to compete in a race to capture our attention and hold it for as long as possible. Instead of attempting to instil a desire in a potential audience in order to sell an idea, product, or service, companies can now use Facebook and similar platforms to sell us things they know we already want. Using personal data collected from the social media platforms, operating systems, and devices we use, businesses no longer struggle to decipher what a customer wants. We hand this information over voluntarily because our desire for convenient access to the internet, and for connection in the digital sphere, is greater than our desire for privacy. In the end, what we put out is what we get back.
In “The Attention Machine,” a chapter of Antisocial Media, Siva Vaidhyanathan discusses filter bubbles as a major problem arising from the attention economy. Companies know that in order to sell an idea, they cannot promote an opposing belief. This causes a narrowed “funnel vision” among platform users, making it difficult for them to think critically and objectively, because all the information and news they receive supports the opinions they already hold. Facebook, and platforms like it, seek to manage our lives by presenting us with the news, advertisements, and social groups they know we want to see. Their goal is to give us what we want before we know we want it.
Digital media companies do not want to be contained in a screen; they want to infiltrate our homes (e.g., Amazon Echo, Google Home) and learn as much about us as they possibly can. The ultimate drive is to become the operating system of our lives, to capture our attention, to own our bodies and our consciousness. In this essay I will explore how the information overload brought on by the attention economy dissuades us from thinking critically, and how the integration of social media platforms is detrimental to the way we form our beliefs.
The attention economy has fueled the rise of entire industries devoted to capturing our attention. Since the birth of the World Wide Web, and the subsequent push to make all online content “free” in order to allow easy access to information and inspire creativity, companies have shifted from selling us goods and services to selling our attention to advertisers (Vaidhyanathan 81). Through the personal data collected from our social media accounts, companies can target us with the specific brands we have announced we associate with or support. Moreover, with the integration of operating systems and devices into households and onto bodies, these companies have access to more personal data than ever before.
As Vaidhyanathan states,
“[Through the introduction of] personal assistants, new interfaces, self-driving cars, internet-connected glasses and watches, and virtual reality goggles, these companies hope to earn the trust of consumers and regulators so that they can set the standards for the transactions that can make this operating system work seamlessly and efficiently” (100).
These devices are sold to the masses with the promise that each new addition will make daily tasks easier and more convenient; like a black box fitted to the operating system that is our bodies, they ultimately relieve us of the chore of thinking for ourselves. Vaidhyanathan argues that we may have reached “peak advertising,” where our attention is so completely harvested that we have no more to give, and where there is no longer room for dependable and affordable news (83). He notes that our political lives are now multiscreen affairs: it is difficult to engage and participate in political affairs when we are constantly being pulled back to advertisements and social media platforms begging for our attention (87).
When one advertisement fails, thousands more approach us. The attention economy has made it so that social media platforms run on advertisements. It is profitable for Facebook to design a platform where users are required to brand their profiles, select who they want in their social circle, and sell to the world the best version of themselves. This makes users, as consumers, easier targets to market to, as they have grown accustomed to advertising and to being advertised to. On social media, nearly everything we see is a form of advertising. We are presented with news that the algorithm hand-picks for us, tidbits of information that align with our political beliefs and interests, in order to ensure that we keep coming back to the platform and interacting on it.
It is extremely difficult to think critically in the attention economy when companies are selling back to us our own beliefs. Social media sites have essentially created a built-in, personalized confirmation bias filter to skew the content we receive to increase our chances of engagement on the platform, which provides advertisers with an active target audience to market to. And we can’t turn it off.
Social media platforms are detrimental to the way we form our beliefs because they work only to reinforce, through overexposure, what we already believe. This shapes our sense of self: algorithms inform companies of the content we like to see, and that content is then presented to us in a manner designed to evoke a purchase or an action.
Political affairs and events are easily amplified on social media platforms because businesses can target supporters with precision. In Safiya Noble’s Algorithms of Oppression, she states that “Algorithmic oppression is not just a glitch in the system but, rather, is fundamental to the operating system of the web” (10). Social media platforms provide the systems and algorithms that reinforce propaganda.
Vaidhyanathan argues that Facebook is the most pervasive digital company and, increasingly, the medium of choice for propaganda, because it works (101). In 2016, Russian agents used Facebook’s advertising services to target Trump supporters and created over 400 groups that motivated Americans to attend events and protests designed to cause chaos and disrupt the campaign (88). Because Facebook’s main form of governance is machine learning, the operation stayed under the radar until it was too late and the damage had already been done. Facebook tailors every news feed in response to a user’s engagement markers (likes, searches, comments, clicks, etc.): the relevance of a topic to you determines whether you see a post, and where it appears in your feed.
This phenomenon is called “funnel vision”: you see more posts from friends and pages who share your views, trapping you in a filter bubble where your beliefs are consistently reinforced (87). On Facebook we are rarely, if ever, presented with an idea that has not been pre-approved by the filters and algorithms that test a post’s relevance to us. If we only get our news through social media, it is very likely that the information we receive has been deemed “relevant” by a platform’s algorithm, leaving out issues that may still concern us simply because we have not expressed an interest in them before. This causes a narrowed vision of the world, where everything directly in our line of sight is exactly what we want to see. It is incredibly difficult to be involved in social and political movements when we are not even made aware of the issues.
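The funnel-vision mechanism described above can be illustrated with a deliberately simplified toy model. This sketch is not Facebook’s actual ranking system (which is proprietary and far more complex); it assumes only that posts are scored by overlap with a user’s past engagement and that low-scoring posts never surface, which is enough to show how a feed can silently exclude unfamiliar topics.

```python
# Toy model of engagement-based feed filtering (a simplification,
# NOT the real Facebook algorithm): each post is scored by how many
# topics it shares with the user's past likes, and only the
# highest-scoring posts surface in the feed.

def rank_feed(posts, liked_topics, feed_size=2):
    """Return the titles of the posts most 'relevant' to prior engagement."""
    def relevance(post):
        # Score = number of topics the post shares with past likes.
        return len(set(post["topics"]) & liked_topics)
    ranked = sorted(posts, key=relevance, reverse=True)
    return [p["title"] for p in ranked[:feed_size]]

liked = {"climate", "cycling"}          # the user's past engagement
posts = [
    {"title": "Bike lanes expand downtown", "topics": ["cycling", "city"]},
    {"title": "Climate rally this weekend", "topics": ["climate", "politics"]},
    {"title": "Opposing view on carbon tax", "topics": ["economics"]},
]

print(rank_feed(posts, liked))
# The post with no topic overlap ("Opposing view on carbon tax") never
# surfaces, however relevant it might be to the user's civic life.
```

The point of the sketch is structural: no individual step is malicious, yet the aggregate effect is that content outside the user’s recorded interests is invisible by construction, which is exactly the filter bubble the essay describes.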
Vaidhyanathan states that we all lean towards homophily, but that social media can choose either to reinforce it or to correct it (95). Social media platforms are currently set up to encourage and amplify our desire to belong to groups that share our beliefs. By constantly selecting what content we see and keeping tabs on how we interact with each other, Facebook and other platforms actively avoid introducing controversial topics that might spark discussion or change our viewpoints. They let us stay stagnant, leaving people caught up in their own values. The integration of social media and other personal operating systems is dangerous because, in a way, it discourages growth and development.
Critical thinking is not necessary when algorithms, apps, and systems perform tasks for you and selectively brief you on the news an operating system decides is relevant to you. The information overload brought on by the attention economy discourages critical thinking by providing us with everything an algorithm assumes we want to know and hiding what it deems irrelevant. The integration of social media platforms and operating systems into our lives is detrimental to the way we form our beliefs, as it narrows our vision of the world and plays to our desire to be among like-minded individuals. Social media is a powerful networking system that allows for discussion and connection, but it is also an advertising hub for companies and businesses.
The web should be a democratic space, free for users to discuss and interact on. If everything in the digital sphere has a price, what is the attention economy costing us?
Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New
York University Press, 2018.
Vaidhyanathan, Siva. Antisocial Media: How Facebook Disconnects Us and Undermines
Democracy. Oxford University Press, 2018.