The Denver Gazette

Don’t let our schools become software

Donald Sweeting, Ph.D., is an educator and former president and chancellor of Colorado Christian University.

The Chicago Tribune recently asked, in light of a new teacher-free AI school coming to Chicago and to big cities around the country: “What is lost from the traditional classroom?” The article suggested that what disappears is boredom, inefficiency, and lack of mastery. But that answer reveals the deepest problem with this new educational vision. It assumes that the highest good of education is efficiency.

It is not. Education is not a factory, and students are not machines to be optimized.

I am not a Luddite or an AI alarmist. Artificial intelligence can do some amazing things in education. It can help personalize practice. It can speed up research. It can provide immediate feedback. It can assist with tutoring, remediation, and administrative burdens. Used wisely, it can be a helpful tool. But a tool is one thing; a substitute is another. The moment we move from “AI in schools” to “school as software,” we cross a significant line.

What is lost first is the human element.

A real classroom is not merely a place where information is delivered. It is a place where human beings are formed in the presence and with the help of other human beings. A teacher does far more than present content. A teacher reads faces, senses confusion, notices discouragement, restrains arrogance, draws out the shy student, and models maturity. A teacher is not simply a content provider.

A teacher is an intellectual guide, a moral presence, and an embodied example of patient authority.

That cannot be replicated by software, no matter how sophisticated. We are told that these schools still have adults in the room, only they are called “guides.” But guides on the margins are not the same as teachers at the center. If the heart of instruction is turned over to screens and algorithms, something irreplaceable has been displaced. Even UNESCO has warned that technology in education should complement face-to-face teaching, not replace it.

What is lost next is deep learning. Much of the rhetoric around AI schooling celebrates speed: faster mastery, faster progress, faster feedback. But speed is not the same thing as education. Some of the very things techno-optimists dismiss as “inefficiency” are actually central to learning: wrestling with a hard text, sitting with confusion, revising clumsy sentences, listening to others, and learning how to think before speaking.

Deep reading, deep thinking, and deep writing usually require slowness. They require attention. They require struggle. They require a sustained encounter with reality that does not bend instantly to the student’s preferences. If every educational frustration is treated as a glitch to be engineered away, we will not form thoughtful students. We will form restless ones.

This is especially important when the model is screen-heavy. The mere presence of laptops and tablets does not automatically improve learning. In fact, the broader research on digital devices in schools is mixed, and major international reviews have warned that digital tools often distract as much as they help when poorly integrated. OECD reporting has found substantial classroom distraction associated with digital devices, and researchers quoted in recent Chicago-area coverage have cautioned that the AI-school model is essentially an open experiment rather than a proven educational advance.

What is also lost is transparency about power.

AI lessons do not descend from heaven. Someone designs them. Someone chooses the questions. Someone defines mastery. Someone decides what counts as progress, what kinds of answers are rewarded, what intellectual habits are reinforced, and what moral or cultural assumptions are embedded in the system. The old classroom was never neutral, but at least its authority was visible. Parents knew there was a teacher, a curriculum, a principal, and a school board. In an AI-driven model, authority becomes more opaque. Power shifts quietly from the local classroom to the distant designer, programmer, and platform.

And then there is the oddity of the price tag.

If this model is really about efficiency, why does it cost $55,000 a year? If AI dramatically reduces the central role of teachers, why is the result a premium product for affluent families? That question matters. It suggests that what is being marketed here is not simply better learning, but a curated educational experience wrapped in the language of disruption. Whatever this is, it is not an obvious answer to the real educational needs of most families.

There is a larger principle at stake. Education is not just about information transfer or measurable mastery. It is about formation. It is about learning how to pay attention, how to reason, how to listen, how to live with others, and how to become the kind of person who can use knowledge well. The classroom, at its best, is not inefficient. It is human.

AI may yet become a valuable servant in education. But it is a poor master. When school becomes software, we may gain speed, customization, and data. But we risk losing the very things that make education worthy of the name: human presence, intellectual depth, moral formation, and the slow shaping of the soul.

OP/ED

2026-04-19T07:00:00.0000000Z

https://daily.denvergazette.com/article/281998974039930

Colorado Springs Gazette