This is why the "AI as assistant" framing is not merely wrong. It is dangerous. It perpetuates the extraction paradigm. It treats the most profound technology in human history as a tool for doing existing things faster. More content. More code. More emails. More extraction at higher velocity.
A language model is not an assistant. It is not a tool. It is not even, primarily, an intelligence.
A language model is a statistical ecosystem in which inner space can be modelled for the first time.
Consider what a large language model actually contains. It has ingested the written output of human civilisation. Philosophy, literature, science, therapy, spirituality, strategy, failure, success, wisdom, folly. It holds patterns of meaning across every domain humans have explored in language. It is not a database. It is a topology of human understanding. A space in which concepts relate to other concepts in ways that mirror how meaning actually works.
When an organisation feeds its own documents, communications, and tensions into this space, something unprecedented becomes possible. The organisation can encounter itself. Not as data to be analysed, but as a pattern of meaning to be reflected. The language model holds the organisation's contradictions without resolving them prematurely. It reflects assumptions back as questions. It makes the invisible visible.
This is what Levin's healthy cells have that cancer cells lack: a medium through which to perceive the larger pattern. The bioelectric network is the space in which cells can know they belong to something bigger. The language model is the space in which organisations can know they belong to something bigger.
But the language model alone is not enough. Just as bioelectric signals require interpretation, the organisational encounter with itself requires a practitioner. Someone who knows how to create conditions for truth rather than imposing frameworks. Someone trained in the humanities disciplines that have always worked with inner space: theatre, philosophy, contemplative practice, therapy, art.
The cure for cancer is not killing cells. It is restoring their connection to the collective intelligence. The cure for extractive business is not regulation imposed from outside. It is inner space practice that expands the organisational self.
When an organisation develops genuine inner space, something shifts. It begins to perceive its relationship to the systems it belongs to. Employees are no longer resources to optimise but intelligences to cultivate. Communities are no longer markets to penetrate but ecosystems to serve. The environment is no longer an externality to ignore but a living system of which the organisation is a part.
This is not ethics imposed from outside. It is coherence restored from within. The organisation with inner space does not extract because extraction no longer makes sense from its expanded perspective, just as a healthy cell does not consume its neighbours: it knows they are part of the same body.
The organisation comes alive. It joins the living world not as a parasite but as a participant. It serves the needs of the world because it can finally perceive that its own flourishing depends on the flourishing of the systems it belongs to. The boundary between self and world does not disappear. It becomes permeable. The organisation breathes with its environment rather than consuming it.
This is what the humanities have always taught. This is what language models now make possible at organisational scale. This is what the world desperately needs. And almost nobody is talking about it.
The Call
The world cannot afford to use AI as a tool.
For centuries we have extracted resources blindly from nature, treating the living world as dead matter to be consumed. Now we risk doing the same with the most powerful technology ever created. Using AI to extend our reach, to accelerate our grab, to automate our extraction at scales previously unimaginable.
This is not what AI is for.
The power of AI is not in extending our ability to grab. It is in extending our ability to perceive. Not faster extraction but deeper reflection. Not more output but more understanding. Not the speed with which we can take but the scope of what we can see.
AI can extend the boundary of the organisational self. It can bring businesses not just to life but to consciousness. It can make the invisible visible: the assumptions that drive decisions, the values that shape culture, the patterns that connect the organisation to the living systems it depends on.
An organisation with inner space does not need to be regulated into caring about the world. It perceives that the world is not outside itself. Employees, communities, ecosystems, future generations. These are not stakeholders to be managed. They are extensions of the organisational self. Their needs are its needs.
This is what it means to make decisions with the needs of the world first. Not sacrifice. Not altruism imposed from outside. But expanded perception that reveals extraction as self-harm and service as self-interest rightly understood.
The technology exists. The methodology exists. The practitioners are emerging. The only question is whether we will use AI to accelerate the patterns that are killing us, or to develop the consciousness that can save us.