
Hmm, there are parts of this idea I like, but it feels incomplete. I agree with the first half (attempts to police misinformation will make the problem worse) and I like the idea of a more “antifragile” population when it comes to information intake, but I’m not sure how this doesn’t ultimately devolve into chaos.

Nobody can keep track of *everything* which is why we do actually need experts in things like medicine, law, science, etc. The true failure of this age isn’t misinformation, but the abandonment of the pursuit of truth by those who should be the most rigorous in pursuing it, in favor of ideology. I feel like that’s the root issue, but I’m not sure how to fix it.


Humans, as social beings, are naturally wired to trust others. If we have to train people out of this natural inclination and teach them to trust no one and nothing by default, we risk significant disruption to society.

Also, trust is a significant productivity enhancer. It is much easier to carry out transactions in a high-trust ecosystem than in a low-trust one. So we would be imposing significant costs if we took this route of making people ever-skeptical.


Great post! While I do agree with you that perhaps the most effective way of "conditioning" people to bullshit is to make them play in it, it still makes me sad.

Lex Fridman spoke about a similar topic on his podcast a few weeks ago. If we were to discover life on another planet, which would be the greatest discovery in human history, only half of the country would believe it.

There's something about humanity becoming more cynical and skeptical that worries me.

Apr 25, 2023 · Liked by Gurwinder

Gurwinder, your analyses are always very interesting and relevant.

An illustration:

My sons (14 and 17) are continually exposed to internet bullshit; they are very wary and critical of everything they see, which is good.

The difference with us is that they are never shocked by these lies, which are simply part of their environment, and they see no point in fighting them. For them it's normal, and they pay it no attention. They don't seem to regard "truth" as a value to be defended as an absolute symbol.

Yet in real life they have the same values as we do: a concern for honesty in their relationships and in their everyday actions.

This gap seems contradictory at first, but ultimately very reassuring.

They don't burden themselves with the hypocrisy of certain people who get outraged on the internet over everything and nothing, while at the same time behaving like complete assholes in real life.

May 11, 2023 · Liked by Gurwinder

This feels like a richer exploration of "Firehosing" (https://gurwinder.substack.com/i/109084196/firehosing), but with a completely different outcome. In that brief explanation, bombardment of a population with disinformation results in apathy or withdrawal, whereas here you're suggesting it might yield a benefit: at least the potential for better engagement. In your assessment, does the vaccine metaphor continue to hold (i.e., there's a window of exposure that promotes health rather than degrades it)? Is that even possible in today's world? (I feel like it's already exceedingly difficult to find truth among an already vast sea of agenda-driven and manipulative narratives.)


That's an excellent point about regulation: you just cannot regulate or control the flow of information anymore unless you're in China. Plus, even if the US wanted to try to regulate it, it's almost a certainty that our elected officials couldn't agree on anything anyway, so we might as well go with your idea of mithridatism.


Let me know if you'd be interested in writing an op-ed guest post on AI, Gurwinder, for the A.I. Supremacy newsletter. We have an audience of nearly 30k.

I suspect you could write a fairly interesting piece on the prospects of Generative A.I. and what it might do to us.

https://aisupremacy.substack.com/p/guest-posts-on-ai-supreamcy


Have you heard of the concept of antifragility, coined by Nassim Taleb? You've applied it very well here to human information processing. I feel many elites miss this concept when trying to justify central control of the masses. They underestimate the human propensity to learn and grow through adversity.


Ah ok! Thank you - you're a great writer!


Thank you for writing such good articles and producing Eternalised (I assume this is also you). I am a huge fan of yours!


OK, I really object to calling Wikipedia left-liberal. You are calling truth itself left-liberal. It leads to the same mistake that mainstream "left-liberal" publications make: giving space to conservative voices just to balance things out, even if what they say is complete horseshit.


Here’s my question: is censorship inevitable? Can we stem the tide of the human desire to control? How can we, the little Substack writers and lay intellectuals, hope to stop the combined forces of the censoring elites of the world?

I myself live in Qatar, a place where everyone has resigned themselves to the fact that they have no control over their lives. I try not to fall victim to the same mentality, but it can be hard to be optimistic. So, what’s the solution?


Most people are fundamentally not very bright and just want to watch funny videos on their phones. I'm not convinced your average Joe is going to care about sifting the truth from the BS.


"All of this is just the beginning. The text generator ChatGPT is now the fastest-spreading app in history, and apps for generating image and video are not far behind."

Great post, except for the hyperbole. The above two sentences added nothing, and there were others in the same vein. So many writers are so enamored of "Isn't it amazing?" that I don't read most of them. I almost stopped reading when I saw those sentences.

Be cool, man. You can still get the message across.


I don’t think the logic follows. If we treat misinformation as a disease or poison, I agree that vaccination or use of a mithridate can render the body resistant to the adverse effects, but that’s not what you’re advocating for. A small dose teaches resistance, but you’re saying we should stand under a waterfall with our mouths open. If misinformation is the Black Death, you’re saying the best response is to open the city gates. You’re one step short of directly importing plague rats. You’re Father Mulcahy giving the camp dog bowls of vodka until it gets so sick it avoids alcohol entirely. But does that mean we’ll get so sick of misinformation from the internet that we’ll give it up entirely? In favor of what? The post? What will stop people from sending lies through the mail?

I think a good internal bullshit detector is an important tool in anyone’s kit, but you don’t need one to empty a bog of manure. You need a shovel. And boots. And maybe better cattle management.


But how do you make this a reality? We could treat digital spaces the same as physical spaces for the purposes of public accommodations law, forcing all platform operators to accept all comers. Of course, opinions are not immutable human characteristics, but they would have to be treated as such in this case. And there would of course be exceptions for illegal content, but not an inch past that line. But then we have the advertiser question to contend with: how do we force advertisers not to pull their ads in the face of political pressure?
