11 Comments
Mike
Oct 8 (edited)

Great post, Matt!

The security breaches you mention are already happening, and from government agencies, no less!

See https://www.abc.net.au/news/2025-10-06/data-breach-northern-rivers-resilient-homes-program-chatgpt/105855284?utm_source=substack&utm_medium=email, which I found via Sarah Smith's "Totally a Thing" on Substack.

Sarah's write-up on it is here: https://totallyathing.substack.com/p/the-northern-rivers-data-breach-shows?utm_campaign=email-half-post&r=1urodg&utm_source=substack&utm_medium=email

Mike

Sarah's very readable and valuable Substack is here: https://totallyathing.substack.com/

Matthew Hughes

Just subscribed to her stuff! Thanks man!

Matt Cook

The Chinese models are incredibly efficient compared to the Western ones. I run one on my Mac that is as good as or better than 4o, and it’s open weights, so I paid nothing for it. The Chinese API costs are a fraction of those of OpenAI, Claude et al. And I believe the future lies in local AI models, not giant cloud-based ones.

We’ll all be running numerous AI instances on our phones, and those of us who use computers will be running them there. They work great, they are private, and they are ours.

Compare it to the history of computing. We are in the “time-share” phase, but we’ll quickly leapfrog out of it: businesses will use API access, which is almost a commodity at this point, and everyone else will run their own AIs, much as in the PC revolution.
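
For anyone wondering what "running your own" actually looks like, here is a minimal sketch in Python. It assumes you already have a local server exposing an OpenAI-compatible endpoint (Ollama's default port 11434 is used purely as an example) with some open-weights chat model pulled; the model name below is a placeholder, not a recommendation.

# Minimal sketch: query a locally hosted open-weights model through an
# OpenAI-compatible endpoint. Nothing here leaves your machine.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",   # local server, not a cloud API
    api_key="not-needed-locally",           # placeholder; local servers ignore it
)

response = client.chat.completions.create(
    model="your-local-model",  # placeholder for whichever open-weights model you run
    messages=[{"role": "user", "content": "Summarise local vs cloud AI in one sentence."}],
)
print(response.choices[0].message.content)

That, in miniature, is the "private and ours" part: the same client code, but the request never leaves the laptop.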

No

I’ve had similar thoughts; I’m just sceptical we can get there. I hope we do, though - not least because it torches any semblance of a business model that OpenAI may have.

Francis Turner

More gloom on the AI space from The Register here: https://www.theregister.com/2025/10/08/more_researchers_use_ai_few_confident/

"Go just a bit further into the data, however, and you'll find that every single question Wiley asked about obstacles to AI adoption has a higher number of people concerned about the technology.

Sixty-four percent of researchers are worried about inaccuracies or hallucinations, up from 51 percent in 2024, and 58 percent are concerned that AI models might not be properly secured or private, up from 47 percent the year prior. Concerns over the ethics of AI usage and AI tool transparency also rose, but only by a few percentage points each. Regardless, they're still near or above the 50 percent mark, suggesting concerns over AI's use in research are widespread and growing."

blake harper

Great post, totally agree, with one exception. I was 100% unsurprised when I read the Deloitte headline. The partners and solution owners on projects worth that much don't spend that much time reading or checking the citations, and the analysts and consultants who pull them together are encouraged to use AI.

Plus, Deloitte doesn't exactly have a reputation for quality in the management consulting world. This isn't their audit and accounting practice we're talking about (that would be a bigger deal).

The challenge is just that you have to check every answer an LLM gives you. In some cases that still saves time, but in most cases it's quicker to do a search or ask an expert.

Douglass Truth

How much would this change if DeepSeek and other Chinese AIs were available here?

eg

All of the pump-and-dump grifters know this, no doubt — they’re just betting on getting a bagful and getting out before the whole unproductive spectacle of musical chairs collapses in upon itself…

Francis Turner

This is another excellent way to try to figure out whether the money is there.

Allow me to do some similar back-of-the-envelope calculations.

I think the $20/month => $240/year rate is a pretty good guess for what people will pay for an AI subscription. My guess is that $20/person/month will work out as the average end price for LLM AI once you munge all the tiers together, from free through to "executive premium", and account for how API pricing ultimately flows back to the actual users of the machines generating the API calls.

To make $5B/month from $20/month subscribers, you need 250M of them. In other words, roughly three-quarters of the entire US population, babies to centenarians included, or, as your figures have it, about 5/6ths of Netflix's global subscriber base.
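
Spelled out in code, taking the post's $5B/month figure and the $20/month blended price as given, and using rough 2025 headcounts for the comparisons:

# Back-of-the-envelope: how many $20/month subscribers does $5B/month require,
# and how does that compare to a couple of familiar numbers?
target_revenue_per_month = 5_000_000_000    # $5B/month, figure taken from the post
price_per_sub_per_month = 20                # assumed blended average across all tiers

subscribers_needed = target_revenue_per_month / price_per_sub_per_month
print(f"Subscribers needed: {subscribers_needed:,.0f}")    # 250,000,000

us_population = 340_000_000          # rough 2025 figure
netflix_subscribers = 300_000_000    # rough global paid subscriber count

print(f"Share of US population: {subscribers_needed / us_population:.0%}")         # ~74%
print(f"Share of Netflix's base: {subscribers_needed / netflix_subscribers:.0%}")  # ~83%, i.e. about 5/6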

And we note that this $20/month is in addition to all the other things people and businesses pay for, like mobile phones, internet, Netflix, Amazon Prime, etc.

It doesn't pass the smell test, does it?
