- But also keep on writing.

During one French lesson, our class watched the movie Populaire, about Rose Pamphyle becoming the fastest typist in the world. During training, she expresses her anxiety about taking part in a typing competition. Her coach goes to console her, and his last sentence before the movie cuts to the actual competition is (translated): “In this world it is enough to be good at a single thing”. This article deals with my realization that it increasingly seems to be the case that in this world you can be good at, at most, a single thing.

Your opinions are not safe

The expression “Write what you know” tends to be most associated with general writing advice, probably best expressed by the TV Tropes page of the same name. In the outrage-prone world of traditional social media, the term most often becomes a unifier of criticism against a particular work; in other words, the author had no idea what they were talking about. And… “we live for the day an ignorant prick gets dunked”. Not that I am above this kind of entertainment, particularly enjoying a fireman calling out unrealistic depictions of his occupation in US shows, but the point is that “dunking on” someone who veers out of their lane of expertise is common cultural practice in this online climate.

Not that outrage about misinformation is unjustified. For example, under the fireman videos you find compelling comments about how these shows are partially responsible for emergency-related deaths. The danger of those shows then isn’t just that they’re wrong; lots of things are wrong. The problem is that they’re wrong and popular. (This sentence was appropriated from Ian Danskin’s video on Phil Fish.) Even worse, as they are TV shows, they are seen as being in the vague cloud of what I’d like to call “official media”. Official media has writers, editors, and producers. They are big companies, and they come with your TV subscription. How dare they have this many layers of bureaucracy and funding and still be this lazy? (Barring the fact that “reality TV” is famously produced on a shoestring budget using overworked, under-credited writers.)

Now, it is relatively justifiable to be baffled that a TV show about firefighters did not actually talk to any firefighters about objective plot points. The issue is that in a consequentialist approach, i.e., focusing on the outcomes of that misinformation, it does not matter if the misinformation was produced by a gigantic company or by a single person. The main metric proportional to perceived “damage done” is reach, audience, influence, whatever one wants to call it. And one of the principal effects of the internet was the democratization of reach (within certain bounds).

I do not think this will be the last time I mention Twitter in an article. It is absolutely fascinating as a sociological phenomenon, and at the same time absolutely horrid. It gives me the feeling of an architecture student watching a controlled demolition of a cathedral. If I were to label that cathedral “democracy” I could even make a lazy metaphor about the state of social media politics. This is of course interesting considering Twitter in the early days was often thought of as a digital public parlor of sorts. The truest place of democracy; everyone can speak their mind freely. At which point the nazis moved in. Keeping it brief, Twitter is not a bastion of civilized conversation; it has a particular penchant for dogpiling, as Kieran Healy points out:

The predictable result is that users may find themselves in a storm of unwanted and perhaps malicious or abusive attention if something they say is widely retweeted.

Although Twitter’s design particularly supports this form of engagement, I do not want to delude myself that other social media are immune to it; this sort of backlash is somewhat baked into human social nature. [Citation Needed]

This is the perfect storm, then: A somewhat public person posts wrong information, whether on purpose or not. The post generates outrage and “call-outs”, which any algorithm aiming to keep people on the platform will want to boost. Nastily, even outrage tends to generate a lot of traction in terms of likes, retweets, reblogs, shares, etc., which will be publicly visible to any possible future would-be outragers. So the metrics of the outraging post keep growing, which bolsters not only the algorithm but also people’s idea that this is not only wrong, but wrong and popular. So what to do to avoid it? Don’t be a public person.

I presume writing about celebrities was much easier for commentators of previous generations. You were famous if you got on TV; otherwise you were just a regular schmuck. These barriers of old have been broken. Oh, how they’ve been broken. Twitter again and especially TikTok are the prime suspects. If you put your opinion out there in public, it is fair game. One of the ugliest outbursts of this attitude are reactionary YouTube channels which fabricate a controversy and a straw-man opponent out of a few tweets. Some of these are highlighted in Shaun’s “Fake Outrage” videos.

Leave it to the experts

To summarize, posting something wrong on a public social media platform makes you liable to gigantic backlash if you were in the wrong place at the wrong time. The most straightforward response to this problem, then, is to have no social media presence. This is of course a laughable proposition for many people, whose livelihood depends on public interaction and networking. But for regular users, I do genuinely recommend cutting the cord on Twitter and its friends.

The solution most often proposed by the outragees is to “write what you know”. If you’re writing a show about firefighters, consult an expert. Ideally, leave it to the experts. They know better than you. There is probably someone out there who has spent 25 years living and/or researching this exact topic. Do you really think that you can add anything of value to the public conversation?

The last few sentences could have been authored by my imposter syndrome itself. Although imposter syndrome is not quite the right word, because it is objectively true that someone with 25 years in topic X will know more than me about topic X. It is the same syndrome which causes me to quit an attempt if I do not succeed in the first few minutes, hours, or days. It may just be laziness. Or embarrassment about the work I create. The issue, of course, is that it has a point.

Probably the nastiest version of “you should probably stick to what you know” is when a dominant group portrays another group inaccurately. One of the bigger in-jokes in the writing community is the idea that there are many horrible attempts by men at writing women. A staple of the genre is mentioning secondary sex characteristics a few seconds in. The progressive argument therefore goes: “If you don’t know a minority culture well, do not write about them”. In the wonderful “Wonderbook”, Jeff VanderMeer puts it quite delicately:

Unless you are writing about a culture you come from or an approximation of that culture transposed into the fantastical, this is the most difficult vantage point to write from because there’s a different cultural default.

The soft fix for not knowing but still wanting to write is of course “doing your research”. This may mean reading primary sources, comparing with other texts, or consulting experts. This is at this point common cultural practice on the educational side of YouTube, and most of the best pieces of content work incredibly closely with primary sources. Notably, the content creator of course still has a lot of creative control; in particular, they can often choose which sources and expert opinions to highlight and which to omit. They are in some sense the editorial department of the newspaper, encompassing all of science.

The seminal work on all things science communication is Tom Scott’s talk “There is no Algorithm for the Truth”. Tom’s talk tends to circle around the devil’s bargain that any platform has to make with clickbait and its cousins when it comes to keeping users’ attention. (And if one looks at Tom’s thumbnail catalog now, it is apparent he is also grappling with this bargain.) Talking about the star power of certain science communicators, he remarks: “I’m sure they did a wonderful job, I’m sure it was well-researched, but [the show] with the extremely qualified dual biologist and astrophysicist no-one’s heard of would not have had as many people watching it”. The other devil’s bargain, then, is the attachment the audience has to certain persons, irrespective of their qualification. They want to hear it from the person they know and trust, even if said influencer is provably less trustworthy than, e.g., a researcher on the same subject.

As such, the morally ideal solution for people with a wide reach is to elevate qualified voices. (This is echoed in the progressive slogan “elevate minority voices”.) But again, the devil’s bargain here is that audiences still want to see said person with a wide reach. Tom Scott has remarked that viewers tend to click away from his videos once he has handed the microphone to the expert.

What about normal people? (I am normal people)

I’m a normal person in the sense of public reach. I do not have a big media presence or anything resembling a following. My responsibly posting links to expert material is barely perceptible in the public sphere. Neither do I possess any decades-long expertise to speak of. So I should probably keep my mouth shut on social media. Or at least only talk about my personal experience.

Although I am not above mentioning my own feelings and opinions, I prefer to stay grounded in academic topics. These are exactly the topics with the decades-long experts behind them, where years of research have probably covered every speck that my uneducated brain could lay its figurative eyes on. The solution to said problem is a simple one, of course: Become an expert. I am a student, after all. Modern academia takes the trend of specialization within society to its logical conclusion: niche experts with entire lives dedicated to their particular subject. Easy then: Shut up now and write only about Computer Science later, once I’m smart.

So why do I plan on writing and publishing about tons of different subjects anyway? Firstly, one may notice that I am sharing these thoughts in what in this day and age is a relatively private place. Although publicly available, this blog has no real public presence beyond itself. Hence, the risk of becoming Twitter’s “villain of the day” has been greatly diminished. Secondly, because I like writing, and I probably would not be writing this much if I could not share it with other people. Call it the need for intellectual debate or narcissism. It’s probably both.

If in this world it is enough to be good at a single thing, then for each single thing, there is probably somebody who is better than you. If one is interested in writing about several things, they will have to accept that they won’t be the principal arbiter of those things. Our world, meanwhile, does not consist of single things; it is a whole made of nothing but interconnected things. Therefore, intellectual interest in numerous fields can give unique insights into the world. With proper citations, of course.