Discussion about this post

Mike · Oct 8 (edited)

Great post, Matt!

The security breaches you mention are already happening, from government agencies no less!

See https://www.abc.net.au/news/2025-10-06/data-breach-northern-rivers-resilient-homes-program-chatgpt/105855284?utm_source=substack&utm_medium=email, which I found in Sarah Smith's "Totally a Thing" on Substack.

Sarah's write-up on it is here: https://totallyathing.substack.com/p/the-northern-rivers-data-breach-shows?utm_campaign=email-half-post&r=1urodg&utm_source=substack&utm_medium=email

Matt Cook

The Chinese models are incredibly efficient compared to the Western ones. I run one on my Mac that is as good as or better than 4o, and it's open weights, so I paid nothing for it. The Chinese API costs are a fraction of those of OpenAI, Claude et al. And I believe the future lies in local AI models, not giant cloud-based ones.

We’ll all be running numerous AI instances on our phones, and those of us who use computers will be running them there. They work great, they are private, and they are ours.

Compare this to the history of computing. We are in the "time-share" phase, but we will quickly leapfrog out of it: businesses will use API access, which is almost a commodity at this point, and everyone else will run their own AIs, much like the PC revolution.
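
To make that concrete, here is a rough sketch of what "API access as a commodity" looks like in practice: the same OpenAI-compatible client can talk to a local open-weights model or to a hosted provider just by swapping the base URL. The endpoint, model name, and Ollama setup below are placeholder assumptions for illustration, not specifics from this thread.

```python
# Minimal sketch: one OpenAI-compatible client, two interchangeable backends.
# Assumes Ollama is installed and serving an open-weights model locally on its
# default port; the model name "qwen2.5:7b" is just an example.
from openai import OpenAI

# Local open-weights model via Ollama's OpenAI-compatible endpoint.
local = OpenAI(base_url="http://localhost:11434/v1", api_key="unused")

# A hosted provider would be the same client with a different base_url and key:
# cloud = OpenAI(base_url="https://api.example-provider.com/v1", api_key="...")

reply = local.chat.completions.create(
    model="qwen2.5:7b",
    messages=[{"role": "user", "content": "One sentence on local vs. cloud inference."}],
)
print(reply.choices[0].message.content)
```

Because the interface is identical, switching between a local model and a cloud API is a configuration change, not a rewrite.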

