Discussion about this post

Lee M. LLoyd:

I feel like you are fundamentally missing the point of generative AI from the point of view of the companies investing in it. This was never about the quality of the output; it's about the velocity of the output. The goal has never been to make content that is good. The goal has been to make content that is good enough, fast enough, to bury human-generated content under a mountain of AI-generated garbage, thus devaluing human labor and making the market more favorable to the tech giants controlling its direction.

Let me give a simple example. Let's say you are a talented programmer who happens to have specific knowledge of a niche market. You write a tight, robust, efficient application that addresses several common pain points of that niche market — a desktop tool that runs locally, which you sell at a reasonable price. Traditionally, this is a fantastic recipe for success for you, the developer.

But that desktop tool doesn't push subscriptions. It doesn't drive AWS or Azure contracts. No VCs get recurring revenue from a SaaS model. Meta, Google, Amazon, Microsoft and so on don't make a penny from it. From Silicon Valley's point of view, it is useless. But what if every single company in that niche market, instead of buying your tight, robust, efficient application, could instead be sold a subscription to have an AI make them a bloated, buggy, staggeringly inefficient solution tailored to their own specific problem?

Now we're talking! Data center contracts. Recurring revenue. GPU sales for all that compute! Sure, it is in every way a worse solution to the same problem, but when everyone in the entire market is making their own bespoke solution, how is anyone ever going to find your tight, efficient, robust application?

That's their perspective. Not that the output of AI is good. It's that the output of AI is good FOR THEM.

Lyrical Cleric:

Any teacher who’s had to read and grade an AI paper can tell you that it’s awful — generic, boilerplate language, hallucinated quotes and sources, and SHORT. Trying to get AI to stay on task with any train of thought is impossible. If you ask the AI to write a 250-word blurb, it can do it; a 500-word response is probably going to repeat itself a bit; and a 1000-word paper will make the exact same argument every paragraph, with no variation or increasing depth. There is just no depth to it. It’s wide as an ocean, deep as a puddle, and it’s piss. It’s a puddle of piss.
