On Techno-Optimism and Techno-Cancer

CS Sayers

In a television ad from the 1940s – one of the earliest to air – two doctors have just resolved a late-night emergency. “This night work’s kind of rough, isn’t it?” one asks the other. You wouldn’t know it to look at them: both men are well-groomed and relaxed on a couch. They are enjoying the coffee that a nurse has placed in front of them (presumably between saving the lives of other critical patients) and look for all the world like leisurely financiers who have just finished their hour of work for the day.

“That’s right,” answers the colleague. “But a Camel’s always a pleasure.”

Therein lies the secret. The narrator seizes his moment. “The pleasing mildness of a Camel is just as enjoyable to a doctor as it is to you and me.”

Camel’s “Doctors Enjoy Camels” campaign was centered on the promise of vitality and good health. The advertisements were sometimes accompanied by a list of the different specialties that endorsed the product (surgeons, general practitioners, diagnosticians) and worked to position smoking, along with diet and exercise, as one of the keys to long life.

It was not a novel campaign. Camel was a latecomer to health-conscious smoking. Lucky Strike sent cartons of cigarettes to doctors in the 1920s, and in 1937 Philip Morris ran an advertisement “claiming doctors had conducted a study showing ‘when smokers changed to Philip Morris, every case of [lung and throat] irritation cleared completely and definitely improved.’” Cigarettes were marketed as health aids, as appetite suppressants to prevent excessive eating, and as a way to manage stress for hard-working professionals. Cool, well-dressed men and attractive, curvaceous women smoked.

Doctors and cigarette companies already knew that these campaigns were full of shit. The link between smoking and lung cancer was not difficult to make, because lung cancer was very rare before the advent of mass smoking and then it rapidly became more common. By the 1940s and 1950s, the link was no secret.

The tobacco companies dealt with this problem by lying about it. They covered up the evidence and told people cigarettes were safe. In 1954, a consortium of tobacco makers released “A Frank Statement to Cigarette Smokers” in response to recent studies linking smoking and lung cancer. Recognizing the public’s concerns about health and safety, the statement invoked unspecified “distinguished authorities” who disputed the connection between smoking and cancer, maintaining “that there is no proof that cigarette smoking is one of the causes.”

Attempts to hold cigarette companies responsible for lying to the public took half a century. In 2006 a federal court found that major tobacco manufacturers had conspired to lie about the health effects of cigarettes and to suppress relevant evidence. That ruling was the culmination of years of litigation and eventually led to remedial orders forcing companies to display the warning signs present today in stores where cigarettes are sold.

In the meantime, the tobacco companies profited. Based on the profit per cigarette sold and the average number of cigarettes associated with a cancer death, Robert Proctor estimates that each lung cancer death represents about $10,000 of profit for the companies. Separately, the CDC estimates there are close to half a million smoking-related deaths in the US annually.

Over a few decades that works out to a lot of bodies, and billions of dollars in profit: at roughly half a million deaths a year and $10,000 apiece, something on the order of $5 billion annually. If you aren’t squeamish about killing half a million people every year, it’s an excellent business.

Today there are product warnings and public awareness campaigns and the government has imposed basic rules on the sale of tobacco products, such as age requirements. Few people are left on the front lines of the smoking wars.

If you’re my age or older – I’m a younger millennial – then you remember a time of optimism about social media. There was a moment when social media was a novelty, and then a danger, but it didn’t take long for it to become a cure. By 2011 at the latest, social media was becoming part of our prescription for political health. That year, people revolted in mass protests against dictatorships in Tunisia. Then in Egypt. Then in Syria.

It looked as if North Africa and the Middle East were experiencing the sort of epochal transformation that the US and Europe underwent during the rise of the liberal democracies. Excited commentators called it the Arab Spring. Then other names emerged. The Facebook Revolution. The Twitter Revolution.

Evronia Azer (in the piece I just linked) tries to parcel out truth and deception in these names. Many of these revolutions were never completed, and some conflicts remain ongoing. Regimes had an easy time reacting to some of the techno-social elements by cutting off internet access and forcing the movement to become social in the old fashioned way, by mobilizing and organizing in offline spaces. The impetuses behind the revolts all predate social media platforms.

It’s true that some organizers did use social media platforms to spread their message and to recruit people for actions. But that’s a bit like the doctors who were sent cartons of Camels or Lucky Strikes: the tool was simply in front of them. It’s what they had, and good organizers use the tools they have.

Mark Zuckerberg’s early vision for Facebook was not to cause political revolutions. Fred Vogelstein described Zuckerberg’s ambition to create “the biggest, most valuable database in the world.” The real revolution underpinning the Arab Spring was a data revolution. Becoming the biggest database in the world is lofty and lucrative, but it’s not a goal that has anything to do with connecting people or causing social change.

How does the database become a political salve? The key is that revolution was a way to drive user growth. Sarah Wynn-Williams says as much in her memoir of her time at Facebook, Careless People, which begins shortly before the Arab Spring. Her impression is that Facebook executives are obsessed with growth above all else; issues of international exposure, the world outside the US, or any connection to politics are seen as quaint and uninteresting. A critical moment in her journey to selling Facebook on the importance of international politics is the realization that the Arab Spring is a marketing tool. After being dismissed by a higher-up when she first tries to pitch her international expertise as a valuable resource, Wynn-Williams is then contacted with a follow-up question: should Zuckerberg take credit for the Arab Spring? The executives were realizing that if people believed Facebook could help you change the world, then they would want to use it. Who doesn’t want to change the world?

The revolution will not be televised, but maybe it will involve some click-through traffic. At least, the people who sell ad impressions on click-through pages would not mind in the least if you thought as much.

Organizing happened on social media. Some organizing still does. But that’s not what the tool was created for, and more importantly, the tool never belonged to the organizers. Commentary on the Facebook Revolution was buying wholeheartedly into a sales pitch designed to drive growth. Take a Camel for your weight and take Facebook for political disaffection; call me in the morning.

The wave of social media excitement persisted. In 2014 danah boyd (she does not usually capitalize her name) published It’s Complicated: The Social Lives of Networked Teens. The book was based on research from boyd’s career, which has been marked by various high-profile appointments, including a research position at Microsoft. It’s Complicated argued that social media was a place where young people could create their own ‘networked publics,’ spaces for youth culture in an age when physical haunts such as malls were becoming less accessible. Social media was a vital and creative refuge from parents and oversight. Certainly, boyd observed, young people struggled with certain dynamics around peer groups and usage. Ultimately, though, social media was life-enriching.

The book was influential in certain circles. boyd and I received our PhDs at the same university, she many years before me. Our department taught courses on youth culture that made use of her material. This was the golden age of social media optimism. New political and organizing possibilities, new emotional spaces, new forms of public connectedness and social interaction. Social media was going to let us remake the future by rebuilding the politics and communities we had lost.

A curious dimension of techno-optimism, which is inherently a form of futurism, was its pre-modernist longing for the prelapsarian world.

Ever since Reagan launched an all-out assault on American unions and the Fordist careerist model of work collapsed slowly into a hellish morass of contingent gigs and pensionless jobs, Americans have been feeling an increased sense of disaffection and detachment from their lives and societies. The marketing pitch for social media, in a nutshell, was that it could cure neoliberalism. Is the mall closed and everyone at the bowling alley is in a lane of their own? Is your society collapsing? Don’t fret; we’ve built a new one. On the internet, the mall is always open, and the bowling apps have matchmaking built right in.

I was never comfortable with social media optimism, even while I taught the material. Maybe the age we live in has burned out something important in my soul, but optimism feels as if it entails a high risk of naivete. Social media isn’t a public or a space or even a tool of connection. It’s a marketing platform. It’s owned by billionaires, people like Zuckerberg, Musk, Page, and Brin, and they use it to market you to people who sell ads so that those people can use the platform to market products to you.

Young people don’t have the agency to shape new publics on social media. That’s not a slight against young people. It’s an honest recognition of the fact that the only people who have that agency are the platform owners. Zuckerberg (more specifically, the engineers who answer to him) controls what Facebook shows you. If Facebook shows you things and people that help you build that sense of community, that’s because Zuckerberg is allowing it. When Facebook’s rulers determine it’s more profitable to show you reams of AI-generated slop art, or the half-legible political ramblings of offshore troll farms, or infinite ads for gambling platforms, then that’s what you’re going to get.

It doesn’t matter if those things aren’t part of your vision for your public life. You are not in control of your social media. The people who make the algorithm behind it are. Convincing yourself otherwise because they gave you a block and a follow button is just wishful thinking.

This is the nature of the platform that optimism obscures. People talk about these platforms as if they have one that is their own, as if it made sense to speak of ‘my’ Twitter. It’s not your Twitter, or your X. It’s Elon Musk’s. You borrow an account from him subject to his desires and impositions, which is not the same thing as ownership.

Cigarettes were followed by hypertension and lung cancer. Social media is followed by suicide and political collapse. As social media platforms became popular, young people started killing themselves in elevated numbers. According to the Yale School of Medicine, from Facebook’s early days in 2007 through 2021, the youth suicide rate increased by 62%. That’s the aggregate number; for some groups it’s much worse. The suicide rate for black teens in that period increased by 144%. That increase is so large that astonishment threatens to overwhelm a sense of horror at the scale of it.

As with cigarettes, researchers looked at the obvious associations. When lung cancer became more common, it was concentrated in people who smoked. As youth suicides became more common, they were concentrated in people with higher social media use.

The companies knew, because they always know. No one on earth knows more about how social media platforms work and what they do than social media companies. In 2021, the Wall Street Journal reported on internal research from Instagram, which is owned by Meta. Instagram’s own researchers concluded the platform caused depression, body image issues, and even suicidality. Instagram is aware its product is toxic. Not only that, but it intentionally amplifies that toxicity to increase profits. Instagram can target advertising to people it knows are particularly vulnerable in their self-image. For example, they know that teen girls tend to delete selfies when they are not feeling good about their appearance – which provides the perfect moment to serve an advertisement for weight loss or beauty products.

Take a moment and bask in the moral vacuity of a group of people who realized that their product produced compulsive and possibly life-threatening disorders, and concluded that this presented an opportunity to increase ad-click KPIs.  

Instagram didn’t release that research voluntarily. The Journal obtained it as part of a large collection of documents leaked by an insider. The documents paint a straightforward picture. Wynn-Williams was telling the truth: the only law is growth. Every person of any age the platforms can reach is another product they can sell to advertisers. Suicides are collateral damage in the only battle that really matters, which is click impressions. One of the many ways that social platforms are like cancers is that they are indifferent to the damage they cause to their own host in their endless drive to grow.

Meta is not unique. Recently, internal documents from TikTok’s research on its own product became available as part of ongoing litigation. (The documents were supposed to be redacted, but this was not done properly: the Kentucky Attorney General placed black boxes over the relevant text in the PDFs, which left the underlying text intact and able to be copied and pasted into a new document.) You can find some excerpts here.

TikTok has an internal research arm called TikTank. TikTank determined that TikTok is “particularly popular with younger users who are particularly sensitive to reinforcement in the form of social reward and have minimal ability to self-regulate effectively” and “do not have executive function to control their screen time.” Other reports from TikTok’s internal research similarly emphasized that TikTok had a negative effect on sleep, productivity, focus, happiness, and overall life functioning. TikTok realized that this vulnerability is an opportunity, tailoring strategies such as push reminders to reel back in any users who leave and debuting features like TikTok Live that amplify these concerns. In the documents TikTok describes its own strategy as “cruel” and acknowledges that it leads to the proliferation of child sexual abuse material on Live. Other risks they are aware of include self-harm and suicide.

Social media is destructive to the user, and the companies know it. But it is also destructive to the social world. In pitching their product as a cure for a rotted real world, social media companies came to understand that rotting the real world further only increased the appeal of their product. Careless People makes it clear that Facebook understood this dynamic. They knew, for example, that the military government in Myanmar used the platform to organize the 2017 mass rape and genocide of Rohingya Muslims. They knew that the platform helped destabilize American democracy and empower Donald Trump. Facebook offered the Chinese government “special access to users’ data” and bespoke censorship tools as well, in order to help the regime maintain “safe and secure social order.” War crimes, fascism, censorship – any and all of the evils that Zuckerberg might have claimed to fight in taking credit for the Arab Spring – are growth tools in the right circumstances.

Conspiracism and post-truth politics, modern conditions which impair the ability of democracies to function, are similarly lucrative. Several media outlets reported on Facebook’s role in making people, many of them older than millennials, believe QAnon was real. QAnon is the belief that the world and government are dominated by a cabal of cannibalistic pedophiles who traffic millions (!) of children a year and are engaged in battle against Donald Trump, who is in turn being aided by a secret military spy who is also a prolific chan poster, and maybe a time traveler. At least, that’s how it started. Over time the Q-universe has expanded to contain a multitude of even stranger and internally contradictory ideas. It’s a very bad thing to believe because it operates like a cult and pulls people away from reality and family connections. There are many stories of how it has torn families apart. Matt Coleman killed his two children because QAnon convinced him they had “serpent DNA”.

Adults can sometimes pretend only youth are vulnerable to social media networks, but everyone is.

Coleman’s belief system did not spread because Mark Zuckerberg decided people really needed to know about serpent DNA or adrenochrome. I assume (it’s not like I know the guy) that he doesn’t care about this stuff at all. The spread happened because of the algorithm. Facebook’s secret math engine, weighted for advertiser satisfaction and maximum engagement, noticed that when certain demographics (retired white people) got served posts about satan-worshipping pedophiles in pizza restaurant basements, they proceeded to spend a lot of time clicking links on Facebook. That’s pretty much the whole thing. The health effects of the product are irrelevant. Did the nonstop fire hose of paranoid fantasies melt your grandfather’s frontal cortex? Sorry. Infinite growth comes with some casualties.

Elon Musk’s purchase of Twitter and conversion of it into a haven for white supremacists is not a malfunction. The system is working as intended, funneling power upward. Concentrating all data about everyone everywhere in the hands of a small elite who own the social media platforms isn’t a tool for political movements. It’s a tool to end them.

Cigarette companies didn’t want to give anyone lung cancer. They just didn’t care if they happened to do so. What they wanted was to sell packs of cigarettes. The cancer was a side-effect.

Social media was already verging on media without human intelligence years ago because of the way the algorithm picks up, rechops, and recontextualizes human output. Now that generative language models like ChatGPT and Gemini can produce content and posts without human intervention, we don’t even have to provide the raw materials for the algorithm’s decontextualization. Social media is an increasingly obvious misnomer. It’s inhuman media. Robots create content that gets fed through the algorithm, processed into distilled emotional manipulation, and then fed back to you to keep you clicking.

Last week my father – a smart man with a Ph.D., but an older man who was raised in a different world – asked me about a YouTube video. The entire video, script and all, was AI generated. He had no idea. He’s not the only one. The channel had about 75,000 subscribers.

How much worse will things get when we don’t even need humans to invent the conspiracy theories that get served to us? How much worse will things get when the speed of new conspiracies and new fake information exceeds the speed of light, approaching infinity as the computational power fed into LLMs increases steadily?

The social media and tech giants make a product that is killing us. A lung cancer death represents $10,000 of profit to a tobacco maker. How much is every teenager who hangs themselves worth to Twitter? What’s the average click return on every woman raped in the Rohingya genocide?

The most horrifying thing about these questions is that they do have answers.

We have begun to regulate cigarettes. In a sane world, we would regulate the tech and social media giants as well, but our collective political refusal to infringe on even the smallest share of the NASDAQ’s valuation has created a situation where there are no brakes and no limits. This circuit can only amplify in intensity.

All circuits will eventually overheat given endlessly increasing current. The heat runs away, the wires melt, sparks fly and fires start. Our collective meltdown is coming. We cannot handle infinite intensity.

The techno-optimists’ basic mistake was to assume that people controlled these platforms. You can follow accounts you want and block accounts you don’t, and that offers the illusion that social media is a little bit like an email account: it’s yours. People send messages in and you send messages out, and sure, there’s some spam from time to time, but you can block and delete it. Even in email, the volume of junk messages can begin to overwhelm an inbox, but an email account is still something you control.

Social media is different. It’s not yours. An email account is designed to deliver messages that other people send to you. A social media account is designed to keep you using social media. The algorithm learns from engagement, and it can be (and always is) changed at any time without warning or transparency. You can follow or block whomever you like, but the second the platform determines it will get more engagement from you if that changes, it will change. Your feed will fill with content you haven’t followed, or blocks will be removed, or whatever other subtle and clever tweak engineers can invent to take advantage of your human nature will be put into place without your knowledge. It’s akin to an email account that uses an algorithm to change the content of the emails you receive from other people with the goal of getting you to spend more time on email. That would not be ‘your’ email. Those messages aren’t for you, no matter how much you pretend they are addressed to you.

New platforms emerge regularly. Will one save us? Can we all forsake Twitter in favor of Bluesky? You might well find Bluesky more pleasant, for the moment (some people do). That doesn’t change that it has its own algorithm and that it is trying to grow. It needs you to use it and it has engineers dedicated to tricking you into doing that more than you want to, and it has not explained how it does that. One day, when the owners decide the platform is large enough that there’s more profit to be made from misery, everything will change. If they don’t want to inflict misery, they’ll sell it to someone who will. After all, one of Bluesky’s founders is Jack Dorsey, who created Twitter and then sold it to Elon Musk.

What about Threads? Threads is owned by Meta, and we’ve already seen how responsible that company is with social media in their stewardship of Instagram. Zuckerberg just appointed one of Donald Trump’s former National Security Advisors as President of Meta, so in future cases of conflict between authoritarian concentration of power and the interests of Threads users, you can look forward to leadership exercising the humaneness and judgment that characterize Donald Trump’s national security apparatus.

You do not own these platforms; neither do I. These are not your spaces. I have accounts on these platforms, but I try not to be attached, and I know that I’m inhaling something bad for me.

The scale of social media is massively larger than the scale of smoking, and accountability hasn’t even begun. For the moment, the companies can keep lying about their product. A few lawsuits are trying to discover what some of these companies know, but the full picture will not be revealed for many years. One day we will find out the full scale of it: how much they know about the political catastrophe, the emotional catastrophe, the cognitive catastrophe that these platforms have unleashed.

We are in the position of people in the 1940s who saw the connections between cigarettes and cancer. We know it’s happening, and they know it’s happening, but they know that denying it is good for the bottom line.

One day, maybe we will get control of the situation and people will make informed and cautious choices about their use of these products, with the requisite warnings and information. In the meantime, if you want my advice, don’t smoke those things. They’ll kill ya.