And what it reduces us to.
COVID-19 as virus is, in essence, viral. The medical definition of the term describes how a virus spreads, infects, and mutates. COVID-19 as content has also gone viral: Its domination of the news cycle, of memes, of misinformation has enabled COVID to spread in a different way—as something authored that’s shared over and over again. This modern and metaphorical definition of virality, which before 2020 was probably how most of us used it, refers to how a piece of information (anything from an image to a social post) can be circulated rapidly from user to user until you can’t not see it.
We’ve co-opted the medical terms virus and viral, because the engineered viruses that infect computers and the way content spreads rapidly online actually mimic real illnesses. And I wonder if the internet has reshaped the way we think about the pandemic. As Susan Sontag writes in her collection of essays Illness as Metaphor, the problem with speaking metaphorically is that it “bifurcates” reality, giving us “the real thing and an alternate version of it.”
Sontag pretty much sums up life with the internet: there’s always the “real” thing and some alternate version of it. Of course it’s more nuanced than that, especially now that there’s no longer a clear distinction between life online and offline. Maybe how much time we spend engaging online is still a choice, even for those of us who work or attend school remotely, or whose livelihoods are wrapped up in how much business email marketing and social media can generate for us. Remember that Facebook, Instagram, and WhatsApp outage that happened recently and forced us all to resort to Twitter to express our feelings through memes? We all laughed about our platform dependence, knowing, of course, that it’s ridiculous that we don’t know what to do with ourselves when the tools we use for distraction are down. But some small businesses and creators who rely on Facebook and Instagram for more than just a way to tune out a meeting claim they lost anywhere from a few hundred dollars to $5,000 in the few hours those platforms were down. For anyone still wondering whether Zuckerberg owns us yet, the answer is this: He does.
The platforms we use are so much a part of us now that they’ve come to separate and divide us generationally. At a coffee shop a couple of weekends ago, I overheard a conversation between a man and a woman who were on their first in-person date after meeting online. The man was talking about some TikTok videos he really liked and explaining to the woman why he preferred the platform to Facebook and Instagram. “I just like that the more I use it, the more personalized it gets, so I only see what I want to see,” he said (this is actually one of the biggest problems with social media and its algorithms, but let’s not harp on that too much here). When the woman said that she didn’t use TikTok and didn’t really get it, the man politely asked how old she was. “Twenty-eight,” she replied. “Ah, see, you’re a little older than me. I’ve noticed there’s definitely an age divide,” he theorized. I’ve heard this theory, too, and as a 32-year-old I can attest that I have no interest in trying any new social platforms that pop up, though I would like to run away from the ones I currently use. I guess I’m set in my ways; the platforms I’ve chosen are the ones I’ll stick with. It is strange to feel out of touch with what the young people are into. My hunch is that Zuckerberg feels the same way.
The platforms I’ve chosen are the ones I’ll stick with.
While the choice about how and how much we participate online might still belong to us, we don’t have the choice to not participate and play the “innocent” bystander. Zuckerberg owns our souls now; there is no moral high ground. Even if you’ve opted out of Facebook, read newspapers in print, only shop in brick-and-mortar stores (which you probably couldn’t do much during the worst of the pandemic), do everything you possibly can to stay off the internet, you still have a digital trace. It’s naive to think that the newspaper you subscribe to and the stores you shop in don’t have data about you stored online (at least in the U.S., where there are very lax laws about data storage) for marketing purposes. Never mind that information about your very identity and background is accessible online. We all have a digital self and a corporeal self, and they are one.
This is all to say that there isn’t a “real” COVID-19 and some alternate version. There is one COVID, and it is viral, but which viral do we mean? We have a fragmented understanding of the virus, one that’s been shattered by the filters of mainstream media, social media, right-wing conspiracy theorists, and reputable sources, and projected to us as a seemingly “complete” picture. What has always been both good and bad about the internet is how it democratizes the way information is created, spread, and consumed. It could, if it really wanted to be (if the people who “control” it really wanted it to be), a space of connection where all our stories could exist, where privilege could be wiped away and we could all have access to the information and knowledge that the World Wide Web promises is a click away. This is the utopian vision for the internet, anyway. As we know, that isn’t the story of the internet. It’s more of a land grab. It’s a fight for domination. Everyone’s yelling at each other, no one is listening. We aren’t even saying anything.
There is one COVID, and it is viral, but which viral do we mean?
The internet and online information retrieval systems like Google were built on the assumption that we want information and will seek it out if it’s easy to do so. That’s not really how it works. A principle in the field of information science, introduced by Calvin Mooers in 1959, is that we won’t use an information retrieval system if we believe the new information we’d find would conflict with what we believe to be true. Known as Mooers’ Law, this principle of information retrieval is often misinterpreted to mean that the information system needs to be easy to use, when really what Mooers was talking about was information avoidance. We tend to avoid seeking new information when we’re already overwhelmed by the information we have, or when we think it might shatter our worldview. When we actually do seek information, we don’t try very hard, following the principle of least effort. Mooers made this observation decades before the dawn of Facebook and Twitter, two platforms that make it incredibly easy to access non-credible information and filter out anything we don’t want to know.
Even in this age where we actually have to set personal boundaries around how much news we can consume in a 24-hour period, credible articles and other content aren’t always accessible. News and media organizations need to make money, which means putting content behind a paywall or loading their sites with annoying ads. Academic research is typically available only to institutions that can afford the subscription fees, like universities. And the loss of local news publications across the country has sparked a rise in “news deserts,” gaping voids that we’ve allowed Facebook and blogs to fill.
Terrifyingly, some countries have built their own internet, which, in some sense, means they’ve built their own realities. When I was living in China eight years ago, it was evident how much “the Great Firewall” had shaped citizens’ understanding of what was happening in their own country and around the world. While I was there, I couldn’t even access Facebook without a VPN. I heard stories about news articles taken down as quickly as they were published. Controlling the internet is one of the ways China’s government keeps its people isolated. While a government-censored internet isn’t the answer, it’s clear that we need to do something. Despite what Zuckerberg says, the internet has never really been free and open, so regulation wouldn’t necessarily take that away. What regulation could and should curtail is his power over it.
While a government-censored internet isn’t the answer, it’s clear that we need to do something.
Facebook, Twitter, and to a certain extent Google have allowed us to make highly curated worlds for ourselves—worlds that are only relevant to us. Bubbles. These companies also make money by making us creators and giving us the space to say whatever we think with authority before an audience we wouldn’t find anywhere else. This also seems to have shifted our idea of who or what an expert is. Let’s look back at the origin of using the word viral to describe content. The Oxford English Dictionary links this usage back to viral marketing, defined as a “marketing technique whereby information about a company’s products is passed electronically from one internet user to another.” Abby Ohlheiser points out in the MIT Technology Review that as having an online presence became a requirement, “virality dropped the connotation of having been engineered by people who were experts at getting your attention and became something more accessible and democratic.”
Is it really more “democratic” for everyone to be able to live their lives online? Especially when the companies that dominate the internet have proven to be shit at distinguishing between information that’s important and some guy ranting on YouTube because he didn’t get enough attention in middle school. How many of us feel compelled to post something online in moments when we feel people should be paying attention to us? It’s a point brilliantly made in this satirical video about the delusion of being the main character in your life.
“Main character energy,” according to Kyle Chayka at The New Yorker, is what we all have now, “now” being this weird period of reemerging, of going back to normal while still living in a pandemic. If you haven’t heard of main character energy, good for you! It’s when a person makes himself or herself the center of a “narrative,” “as if cameras were trained on [them] and [them] alone.” Perhaps you’ve had one of these moments? Moments when you’re doing basic things like walking down the street or grocery shopping, and you just “feel ineffably in charge,” like “the world [was] there for your personal satisfaction.” In the article, Chayka interviews TikTok and Instagram influencers who refer to themselves as “CEO[s] of #maincharacterenergy” and create content focused on this one principle: “you must change your life.”
After more than a year of lockdown and other COVID restrictions, this is something many of us feel emboldened to do, including this reviewer. Social media was pretty much designed to make us all feel like we’re the protagonists of our own lives, rendering everyone else a minor character at best. This is the opposite of being a democratic storytelling platform, whatever that is. And it’s this mentality that has shaped the internet, shaped our response to COVID, at least in the U.S. We only care about how our stories unfold, and to hell with everyone else (though one of the influencers Chayka spoke with believes it’s possible for everyone’s “main character moments” to “coexist”).
Social media was pretty much designed to make us all feel like we’re the protagonists of our own lives, rendering everyone else a minor character at best.
The moments of community we do have online happen around a piece of content that’s deemed worthy of attention, be it a tweet or a TikTok video. What determines virality, raising a piece of content to meme status, is how often and how fast it is spread and repurposed. But a community based on the sharing of memes can’t be fulfilling; it’s passive at best. Passive engagement is what the internet is built on. It removes us from accountability and responsibility.
According to Ohlheiser, it’s time for us to rethink the way we use “going viral,” because it’s shaped the way we see our information ecosystem and role within it in negative ways (plus, it’s a global pandemic and people are suffering and dying, and this is no time for metaphors). We use the term to denote “authentic popularity,” Ohlheiser says, ignoring the role algorithms play in ensuring we see a particular piece of content. For example, the “documentary” Plandemic spread among anti-vaxxers on social media because conspiracy theorists “exploited the way social-media culture is intended to function.” This lack of understanding around manipulated popularity is not only why misinformation spreads (which Whitney Phillips, Assistant Professor of Communication and Rhetorical Studies at Syracuse University, says is an appropriate use of “viral”) but also why it’s taken as truth.
We’re not very good systems thinkers. To Ohlheiser’s point, I don’t think we see ourselves or the social media content we engage with as part of an “information ecosystem.” So we don’t fully comprehend how each new way of finding, accessing, and creating information adds another layer to an already crowded media landscape. The dawn of radio was another layer added on top of print media. The dawn of cable news was another layer added on top of print and radio. The internet added another layer on top of print, radio, and cable, and so on. If we thought more about the ecosystem in which information is created, broadcast, and shared, we might think more critically about how we engage with it.
While describing content as viral is relatively new, the concept of information spreading rapidly and getting out of hand isn’t. The Viral Texts project—started by Ryan Cordell and David Smith—aims to create “theoretical models” to help us understand how content in 19th-century American newspapers and magazines could go viral. “Fugitive” was the term most editors used, mainly to describe poems that were “widely reprinted” and “often anonymous.” And because these published texts weren’t “typically protected as intellectual property,” they could circulate “promiscuously” beyond “authors’ and publishers’ control.” These “fugitive verses” are one of the types of content Cordell and Smith used to study virality by looking at which ones were reprinted in the most periodicals.
“Fugitive” was the term most editors used, mainly to describe poems that were “widely reprinted” and “often anonymous.” And because these published texts weren’t “typically protected as intellectual property,” they could circulate “promiscuously” beyond “authors’ and publishers’ control.”
Even before social media algorithms, there was a sense of “manipulated popularity” among these published texts. For one, the more often they were published, the harder they were to ignore. And at that time, everyone in your social circle would be talking about an oft-published verse (perhaps speculating on its authorship), the kind of phenomenon Alexis de Tocqueville captured in the 1840 volume of Democracy in America when he wrote that “nothing but a newspaper can drop the same thought into a thousand minds at the same moment.”
Anonymity and a certain “moral quality” seemed to be key to a text’s virality. More popular authors usually kept their names attached to their work, but an anonymous poem is what got people talking. Readers would manufacture a narrative to explain a text’s bibliography—there’s one popular poem called “The Children” that was widely thought to have been authored by Dickens. One way to look at these narratives is as a “social text,” as people coming together to create a “network author.” But can they also be viewed as conspiracy theories? Granted, harmless ones compared to what we can find online today, but still an example of people engineering a narrative to explain something unknown. Most of these fugitive verses haven’t found much popularity today; perhaps that’s something we can take comfort in. Or maybe that’s just the benefit of paper.
What can we say of “David After Dentist,” the meme that popularized “going viral”? Thanks to internet archives like the site “Know Your Meme” and the recent trend of digital creators turning their work into non-fungible tokens (NFTs), the legacy of these cultural moments will live on, even if their popularity doesn’t. “Meme” is another borrowed term, this time from “memetics,” which grew out of the concept evolutionary biologist Richard Dawkins proposed in 1976. Like a gene that holds hereditary information, a meme holds “cultural information,” and it’s in a constant struggle to survive and reproduce.
I’ll admit that I had never heard of “David After Dentist” until I did some research for this review. I used to feel superior about watching the rapid spread of memes from the sidelines. But I know that this is where our society has gotten virality wrong—according to Ohlheiser, we’ve fooled ourselves into believing that “[it’s] something we can observe without being part of.”
I give going viral 1 star.