Artificial intelligence is no longer simply a tool; it is a theology. Its rapid deployment across education has taken on messianic urgency, promising salvation through optimization, personalization, and scalability. Its apostles – venture capitalists, platform architects, billionaires with philanthropic gloss – offer not mere solutions, but eschatologies. AI will remake learning. AI will fix inequity. AI will replace teachers. AI will displace the human.

These are not neutral propositions. The forces driving AI transformation in education serve markets, not communities. Their logic is extractive, their models disruptive in the Silicon Valley sense: disruption as demolition of precedent, not enhancement of wisdom. To remain in traditional education today is to place oneself in the devil’s path – standing against the full weight of transformation. To enter tech’s inner sanctum, to design for or profit from the system, is to join the devil’s right hand: powerful, paid, and compromised.

This essay defines that devil, excavates the terrain, and traces the ethical fracture at the heart of modern schooling: whether education will be a public covenant or a private commodity.

The Devil We Face: Techbro AI and Billionaire-Funded Reform

The devil is not AI itself. The devil is the regime governing its deployment: billionaire-funded, platform-led, and obsessed with market logic.

Consider the collapse of AltSchool, a heavily hyped personalized learning experiment backed by Mark Zuckerberg, Laurene Powell Jobs, and other tech elites. Despite raising $174 million, it failed to scale or sustain its vision, closing its schools and quietly pivoting to software. AltSchool wasn’t merely a failed startup – it was a case study in hubris: data-first pedagogy built by technologists who neither trusted nor understood teachers.

AltSchool is emblematic of the broader movement to privatize education through AI-enhanced platforms. A 2024 report from Senator Bernie Sanders reveals a coordinated billionaire effort to defund public schools and expand voucher programs. The goal: erode civic education and replace it with market-driven alternatives – charters, micro-schools, software subscriptions.

The underlying ideology is clear: students are products, teachers are liabilities, and learning is a business model.

This AI-industrial complex operates with the same extractive principles as Uber or Facebook. It does not serve learning. It serves surveillance, scale, and speculative return.

Traditional Education as Resistance

Remaining within traditional public or independent education now requires deliberate resistance. It is no longer neutral. It is oppositional.

Schools, when rightly oriented, are not delivery systems for curriculum. They are moral communities. They form citizens, not just workers. They convene pluralism, not personalization. Traditional schools still carry the burden of these commitments, even as they are battered by funding cuts and media delegitimization.

This does not mean AI must be excluded. It means AI must never come first. OECD guidelines insist that AI in education be human-centered, teacher-led, and equity-grounded. When integrated well – under the authority of educators – it can augment formative feedback, identify instructional gaps, and support multilingual access.

But integration is not leadership. AI cannot set the curriculum. AI cannot mediate values. AI cannot form conscience.

We already see what happens when it does. In Houston ISD, under a state takeover, AI-driven curriculum platforms produced plagiarized and inaccurate lessons, demoralizing teachers and sparking mass resignations. This is not modernization. It is institutional sabotage.

To remain in the path of the devil is to absorb his blows while defending education as a civic trust. The work is grueling. The cost is real. But so is the mandate.

The Right Hand: Serving the Devil to Build Safe Havens

Others choose to work from inside the regime – joining the platforms, taking the capital, building counter-models with the proceeds. This is the position of the devil’s right hand: powerful, well-funded, and often well-intentioned. But deeply entangled.

This is precisely what many billionaire reformers have done. Laurene Powell Jobs now funds XQ Super Schools, elite public-private hybrids with selective admission and tech-forward pedagogy. Elon Musk built Ad Astra for his own children, outside state oversight. Peter Thiel pays young people to drop out of school altogether.

These figures destabilize public systems while creating gated academies for their own. It is a pattern of elite exodus disguised as innovation.

Educators who join the AI-industrial complex – by writing prompts, building apps, or training models – often do so to gain financial security. Some use those resources to build safe-harbor schools of their own. But this strategy is not clean.

To build on techbro spoils is to risk importing their ideology. Without structural guardrails, even counter-models replicate the very inequities they claim to resist. Selective admissions. Algorithmic sorting. Over-surveillance. Privatized success.

The question is not whether you can build something better from inside. The question is whether you can do so without becoming the very thing you sought to replace.

AI as Tool, Not Master: A Necessary Clarification

AI has a place in education. But it must be a peripheral place. It must be a servant, not a sovereign.

AI literacy is critical: teachers and students must understand what AI is, how it works, where it fails, and what biases it encodes. Tools must be explainable, auditable, and transparent. Not just functional.

Research from Stanford and OECD confirms: AI used without pedagogical design undermines trust, displaces human insight, and reinforces inequality.

Generative AI may accelerate writing, summarizing, or grading – but it does not teach. Studies reveal that overreliance leads to surface-level thinking, intellectual passivity, and the substitution of sampling for synthesis. It creates speed, not depth.

Worse, it encodes existing injustice. Data-driven learning systems often reproduce systemic bias – tracking students by proxy variables like zip code, device usage, or lexicon. Without intervention, AI doesn’t fix inequity. It hardens it.

The only ethical use of AI in schools is one that is teacher-driven, community-accountable, and structurally transparent. Anything else is extraction disguised as personalization.

Implications: Stratification, Commodification, and Collapse

The implications of AI-first schooling are already visible.

Stratification is accelerating. Affluent families use AI to supplement tutors, consultants, and enrichment platforms. Poorer districts receive stripped-down platforms with minimal human contact. The result is a two-tier system: creative autonomy for the rich, automation for the rest. As Business Insider notes, AI reveals and widens existing cracks.

Commodification is rampant. Learning becomes engagement metrics. Curriculum becomes content. The student becomes the product. Scholar Kenneth Saltman calls this the “alienation of fact”: knowledge detached from purpose, reduced to a data point.

Collapse follows. AI-first platforms repeatedly fail to deliver pedagogical value. AltSchool collapsed. Summit Learning has faced backlash over its depersonalized, screen-heavy models. Even the most capitalized projects stumble when they treat teachers as obsolete.

This is not an accident. It is a design failure rooted in ideology. When pedagogy is treated as product design, and learning as UX flow, the human heart of education dies.

An Ethic of Use: What to Do

The question is not whether to use AI. It is how, and under what terms.

For those staying in the path:

  • Protect teacher autonomy.
  • Advocate for human-first pedagogy.
  • Refuse AI-centered curriculum design.
  • Build AI literacy among students as a civic skill, not a shortcut.

For those in the right hand:

  • Use capital to build transparent, democratic academies.
  • Design governance models with community voice, not just founders.
  • Center admissions on equity, not exclusivity.
  • Use AI minimally, with human review, and open documentation.

For both:

  • Establish school-level AI ethics boards.
  • Require transparent disclosure of AI-generated content.
  • Train all staff in the limitations and affordances of algorithmic tools.
  • Ensure that culturally responsive pedagogy is not displaced by AI neutrality.

This is not a rejection of AI. It is a rejection of its dominion.

The AI-industrial complex seeks to rewire education in its own image: fast, efficient, scalable, and empty. It promises personalization but delivers surveillance. It promises equity but delivers tracking. It promises wisdom but delivers speed.

If schools surrender to this theology, they will cease to be places of human formation. They will become nodes in a data economy.

Those who stay in the path do so with bloodied hands but clean hearts. Those who join the right hand must tread carefully, extracting without becoming extractive.

Education is not a market. It is a covenant. And the devil keeps no covenants.

References:

  1. AltSchool collapse and tech billionaire funding:
    https://www.insidehook.com/culture/tech-billionaires-wasted-millions-on-failed-education-startup-altschool
  2. Sanders 2024 report on billionaire privatization of education:
    https://www.sanders.senate.gov/press-releases/news-new-report-on-the-coordinated-effort-by-billionaires-to-dismantle-the-american-public-school-system
  3. XQ Super Schools and billionaire influence:
    https://truthout.org/articles/billionaires-who-aim-to-disrupt-education-may-get-a-chance-even-if-trump-loses
  4. The New Yorker on AltSchool and Silicon Valley school disruption:
    https://www.newyorker.com/magazine/2016/03/07/altschools-disrupted-education
  5. Houston ISD AI-plagiarized curriculum and teacher exodus:
    https://www.houstonchronicle.com/opinion/outlook/article/hisd-state-takeover-mike-miles-ai-prof-jim-20359937.php
  6. OECD guidelines on equity-focused AI in education:
    https://www.oecd.org/en/publications/the-potential-impact-of-artificial-intelligence-on-equity-and-inclusion-in-education_15df715b-en.html
  7. Stanford discussion of AI trust, literacy, and pedagogy:
    https://news.stanford.edu/stories/2024/09/educating-ai
  8. FT analysis of AI’s impact on critical thinking:
    https://www.ft.com/content/adb559da-1bdf-4645-aa3b-e179962171a1
  9. Business Insider on AI and educational system fragility:
    https://www.businessinsider.com/ai-reveals-how-broken-our-education-system-is-economist-says-2025-7
  10. Kenneth Saltman on commodification and “alienation of fact”:
    https://files.eric.ed.gov/fulltext/EJ1297432.pdf
  11. Critique of Summit Learning model:
    https://www.edsurge.com/news/2019-06-25-as-demand-for-personalized-learning-grows-summit-learning-expands
  12. AI literacy overview (Wikipedia):
    https://en.wikipedia.org/wiki/AI_literacy
  13. Research on data-driven education bias (arXiv):
    https://arxiv.org/abs/2301.01602