4 Comments

Interesting piece, and not just because you quoted me; it may need a reread. The idea of language as the human OS goes back a few decades. Neal Stephenson used it in Snow Crash, and it had already entered popular culture in the 70s with the pseudo-science of Neuro-Linguistic Programming. I am not sure where it started, so it may be older. I am leery of computer metaphors for the brain or mind, but there is something to the basic idea. If I were to try to refine it, though, I would see it as more of a high-level programming language or languages. I do believe that language is a technology, one of our very earliest, along with fire and sharpened objects. The observation you make about humans and computers now having a shared OS is important. It seems obvious when you put it that way, but I had not thought of it before. I do not know how much I agree or disagree with you yet, but you have given me food for thought, and that is always appreciated.

Thanks, Guy! I don't like the brain-computer metaphor much either, but I can't completely dismiss it.

I can't wait to hear your thoughts, or any response or rebuttal to the shared-OS concept, once the idea has marinated enough. When it surfaces, please share!

Great food for thought. I have a sense that power (aka agency) is still a function of knowledge, and anything less is simply the illusion of choice (aka one of the oldest tricks in the parenting handbook).

For example, I can ask my friendly neighbourhood AI for three different R code solutions to achieve an outcome, but because I never got past learning to define an object in R before deciding I could just crack on with AI's assistance, I have no idea which option is best (or whether it will even work until I try it).
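To make that concrete (a made-up illustration, not my actual prompt): here are three equally plausible answers an AI might give for averaging a column by group in R. All three work, but judging which one to prefer still takes the kind of R fluency I skipped.

```r
# Three valid ways to get the mean of mpg by cyl in the built-in mtcars data.

# Option 1: base R with a formula interface
aggregate(mpg ~ cyl, data = mtcars, FUN = mean)

# Option 2: tapply -- returns a named vector rather than a data frame
tapply(mtcars$mpg, mtcars$cyl, mean)

# Option 3: dplyr (install.packages("dplyr") first); the |> pipe needs R >= 4.1
library(dplyr)
mtcars |> group_by(cyl) |> summarise(mean_mpg = mean(mpg))
```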

But AI has no real agenda (yet, that we know of), so effectively I'm not asking a parent to give me some reasonable options, but handing over my agency to a random answer generator. And not learning nearly as much in the process.

I always get much better AI answers and assistance when I know more in the first place...

Thank you, Dean. I would add some spice to your thought:

In his 1982 Paideia Proposal, part of his push to reform K-12 education, Mortimer Adler argued that education should have three aims: the acquisition of knowledge, the formation of habits and skills, and the development of wisdom and understanding.

This powerful trifecta (knowledge, skill, wisdom) is probably the biggest determinant of agency.

Interestingly, what Mollick found with his Framework GPT is something I have experienced too. We all live with information overload, which means we carry a lot of unorganized information in our minds. Simply applying the right frameworks to structure that thinking can unlock new insights without adding any new information.

That can be a powerful tool for learning. Like defragmentation in Windows.
