I’ve found myself wrestling with two disturbing questions. First: does learning itself obey the laws of material reality, or does it resist them? Second: if it obeys them, does the emergence of artificial intelligence mean that teaching itself is fundamentally algorithmic, that what we’ve long considered an irreducible human art is actually a computable process? A process that machines could carry out not just faster, or cheaper, or at greater scale, but genuinely more effectively at producing learning?
The optimistic vision is compelling: every child receives expert, tireless, infinitely patient instruction calibrated precisely to their needs. The achievement gap narrows because the students who most need help finally get it, not in sporadic bursts but continuously, systematically. Teachers are freed from the grinding mechanics of delivery and assessment to focus on what machines genuinely cannot do: motivation, relationship, the human dimensions of learning.
But there’s a darker reading of AI, one I think we should take seriously. If teaching becomes demonstrably algorithmic, if learning is shown to be a process that machines can master, then Penrose’s ghost returns with a different question: what does it mean for human expertise when the thing we most value about ourselves, our ability to understand and to help others understand, turns out to be computable after all? Not insight, but instruction. And if instruction is algorithmic, then what exactly is left that makes human teachers irreplaceable?