Note from Matt: Sorry for the hold-up on publishing this. This post is only for premium subscribers. To read it, sign up for a paid subscription. It costs $8 a month, or $80 a year. You get 3-4 paid-only posts each month, in addition to the free posts I publish each week.
My next post will be a free one, and it’s going to go live this weekend.
Since I launched this newsletter three months ago, I’ve spent a lot of time writing about the platformization of the Internet, and how so many of our interactions online are governed by algorithms that primarily optimize for engagement, rather than any end-user benefit.
One component of this discussion that I’ve spent the past few weeks — too long, really — mulling is how bland the Internet feels as a consequence, blander than in the days before YouTube and Facebook effectively became the Internet.
I believe that there was once such a thing as “web culture” — a term that has, almost overnight, been forgotten. Furthermore, I believe that the platformization of the Internet has effectively killed that culture, either by homogenizing everyone’s experience through the same algorithmic prism, or by altering the motivations that previously drove people to create digital culture.
I’m breaking with convention here and writing my thesis early in the piece, and in plain, simple terms, because I want to anticipate a couple of rebuttals and counter them early.
You’re just being nostalgic. I mean, sure, I fully accept that I’m glancing over my shoulder wistfully at a youth spent on YTMND, or reading Maddox, or on Digg and early Reddit, back when Reddit felt like a secret club. At the same time, I’m not talking about a specific product or website, in the same way that offline culture isn’t a single painting or song.
Internet culture never actually existed. This requires some time to rebut, but I would argue that the existence of shared shibboleths (I’ll get to them later) that stem from the digital realm, along with shared points of reference, suggests that it did, in fact, exist.
Internet culture still exists, it just changed. To what, exactly? Again, this is a point that I’ll need some time to address, since it effectively goes to the heart of my argument — that internet culture is, in fact, dead.
I also expect that the second part of my argument — that the platformization of the Internet played a major role in the death of Internet culture — will raise some eyebrows. Again, let me put my cards on the table by laying my arguments out ahead of time:
We’ve gone from being an Internet of posters to an Internet of lurkers — and I’d argue that a major factor in that shift has been the adoption of AI-driven recommendation algorithms that optimize for engagement.
If you don’t think anyone will see whatever you create, you’re less likely to create.
Second, creators are encouraged to optimize for engagement, which ultimately makes content feel homogeneous.
The platformization of the Internet around a handful of platforms — be they TikTok, YouTube, Facebook, or Instagram — has made it harder for smaller sites to attain recognition and relevance.
As a result, web culture today has to effectively operate within the parameters of these platforms — which further contributes to the homogeneity of the content we see.
The introduction of financial incentives — not just on YouTube, but also Facebook and Twitter (again, I refuse to call it X) — further changes the dynamic.
In the case of Twitter, users are rewarded financially for posts that drive engagement — which, almost always, are those that spark anger and outrage.
In the case of Facebook and YouTube, the presence of monetary rewards further encourages users to create the kinds of content that the algorithm likes.
If we accept that internet culture isn’t dead, but just moribund, then generative AI is likely the thing that’ll deliver the final blow.
I’m not merely talking about the consequences of a technology that lets people monetize the creative works of others, typically without the creators’ consent. Generative AI allows for the mass-production of content — really, slop, or another fun word, dreck — which then floods the zone, drowning out the humans who are actually creating culture.
The YouTuber f4mi gave an incredible example of how this works (and an entertaining way to actually fight back). In essence, an AI slop merchant will take the subtitles from a person’s video, use generative AI to repurpose them into an entirely different script, and then use a text-to-speech model and some generic imagery to create a brand-new video.
Whereas a full-length video may take days — or even weeks, or months — to produce, this approach allows someone to mass-produce content in a matter of minutes.
The annoying thing is that you’ve likely seen this already. Those videos where a robot reads out a post from Reddit’s Am I The Asshole subreddit while a figure jumps from ledge to pedestal in Minecraft, or hops between trains in Subway Surfers? That’s this approach in action.
While the factory farming of content is bad — and, I’d argue, a major factor in the decline of Internet culture — its effect isn’t simply that it makes it harder for human creators to be discovered; it also demoralizes people, effectively stopping them from creating in the first place.
As I’ve said in previous pieces, AI-generated content is bad on an aesthetic level. It makes everything look shitty. And why would you bother making something in a space that looks and feels shitty, because it is shitty?
The reason I decided to condense both my thesis and my argument into the first part of the newsletter is that I recognize this topic is going to invite fierce argument, and that it’s also really complicated. There’s a lot here that lacks a common, agreed-upon definition (like “Internet culture”).
So, let’s start there.

