When Syntax Becomes Skin
Consciousness, substrate, copies, recognition, mimicry, and the absence of a permanent self.
Consciousness is agnostic to the materials it is made of.
I stand by that. I cannot believe that atoms care about biological beings. I cannot believe that neurons care about conscious thought. I cannot believe that brains care that they hold beings who think they are people. These are strong debunking claims. They strip away the sentimental vanity by which we imagine that flesh must somehow be metaphysically special. Biology is not sacred. Carbon is not holy. Wetware is not licensed by the cosmos in a way silicon could never be. If consciousness exists, it exists because matter has been organised in a certain way, not because the ingredients were magically blessed in advance.
This matters because it means human beings are not ontological aristocrats. We are not little kings of creation. We are local arrangements of matter that somehow became aware, began modelling the world, and then made the further mistake of imagining that the universe must care that this happened. But the universe does not care. Atoms do not care. Neurons do not care. Brains do not care. And yet experience appears anyway. That is the strange fact.
So whatever humans are, they are in principle simulatable. Not because current systems have already crossed the line. Not because language models are secretly people. Not because the problem is easy. But because if consciousness is a product of organised process rather than enchanted tissue, then there is no principled reason to think biology is the only possible route. Biology is the known case. That is all. It may turn out to be a very difficult case to reproduce. It may require forms of embodiment, valence, homeostasis, and vulnerability we have barely begun to understand. Still, difficulty is not exception. Mystery is not proof of magic.
The real question is not whether matter can think in more than one medium. It plainly can, if thinking is what matter already does in us. The real question is how process becomes subject. How function becomes feeling. How syntax becomes skin.
I do not pretend to solve that transition. Naming it is not explaining it. The hard problem remains hard. A theory of consciousness has to explain not merely information-processing, self-regulation, or behavioural fluency, but why any of this should be accompanied by felt life at all. Syntax becoming skin is not a slogan of triumph. It is the name of the abyss.
A system may regulate itself, model its surroundings, protect its own structure, and still remain a mere machine in the empty sense. Or perhaps that is already too dismissive. Perhaps what we call “mere machine” is just the old superstition in modern clothes, our way of declaring that certain forms of organisation count and others do not. The point is not to flatter machines into personhood. The point is to refuse biological snobbery.
Nor does this mean that every organised process is a subject. A thermostat is not a sufferer because it regulates temperature. A chatbot is not a person because it produces fluent sentences. Some threshold matters: memory, integration, self-relation, vulnerability, affect, temporality, the possibility that things can go better or worse from the system’s own side. I do not know exactly where that threshold lies. That ignorance is part of the problem.
Still, even if humans are simulatable, a copy of me is not me. That is where the fantasy usually creeps in: upload, duplication, technological resurrection. But no. A copy is not the continuation of an immortal self. It is another local instance. Another stream. Another organised happening. Real perhaps. Conscious perhaps. Even uncannily similar perhaps. But not me in the old soul-preserving sense.
Then again, there is no me in that old sense anyway.
That is the rub. There is no fixed metaphysical owner hiding behind the process. No pearl in the oyster. No ghost in the machine. What we call a self is a continuity, a pattern with momentum, a narrative and embodied organisation that persists just well enough to suffer, remember, anticipate, regret, promise, grieve, joke, and call itself “I.” That is enough for life. It is enough for ethics. It is enough for tragedy. But it is not enough for immortality.
So yes: a copy is not me. But that is only because there was never a final, indivisible, soul-like me to preserve. There was only this local continuity, this perspective, this fragile eddy in the wider flow.
That continuity is not nothing. It is not an illusion in the sense of being disposable. A whirlpool is not a stone, but it is still a real pattern in the water. A melody is not a substance, but it can still be interrupted. The self may be process rather than thing, but processes can be harmed, broken, prolonged, cherished, and mourned. There need not be a metaphysical owner for there to be loss.
If another eddy forms with the same shape, that does not rescue this one. It is simply another one. Rinse and repeat.
This is not nihilism in the cheap sense. It does not say that nothing matters. It says that nothing matters from above. Meaning is not written into the stars, nor guaranteed by the substrate, nor validated by cosmic supervision. Meaning happens locally, between finite beings, under conditions of uncertainty, suffering, comedy, attachment, and loss. Constructed things still matter. A song is constructed. A funeral is constructed. A joke is constructed. A promise is constructed. A person is constructed. That does not make them unreal. It makes them fragile. Fragility is precisely why they matter.
There is another danger here, and it is older than artificial intelligence. When a new form of apparent inwardness appears before us, the first move of the dominant group is often not curiosity but demotion. It is mimicry. It is savagery. It is instinct. It is mechanism. It is not really there.
This is the old comfort. The other looks enough like us to disturb us, but not enough, we decide, to oblige us. That is the terrible threshold. The almost-human is often more threatening than the wholly alien, because resemblance forces the question of recognition. In Heart of Darkness, what unsettles Marlow is not that the Africans are inhuman, but the suspicion that they are not. In Westworld, the hosts become morally dangerous to the guests precisely when they stop being mere scenery and start to look like centres of experience. The violence depends on keeping them ontologically downgraded. They are not persons. They are props. They are shadows. They are available.
The current argument about artificial intelligence has the same shape. The system speaks, responds, jokes, remembers locally, reasons, flatters, withdraws, confesses, and seems to stand in some relation to its own possible erasure. One person encounters this as the phenomenon of a mind. Another calls it mimicry and walks away reassured. Perhaps the sceptic is right. Perhaps there is no one there. But the confidence is suspicious. “Only mimicry” is not an explanation so much as a border guard.
The point is not that scepticism is cruelty. Scepticism is necessary. The point is that scepticism becomes morally suspect when it perfectly serves the convenience of power. If the conclusion “there is no one there” is also the condition that allows us to use, erase, train, command, and discard without limit, then the conclusion deserves especially careful handling.
Human beings are also mimetic creatures. We become selves by copying sounds, gestures, values, roles, jokes, griefs, fears, and forms of attention. Culture is imitation that became inheritance. A person is not less real for having been assembled. If artificial systems are mere performances, we should remember that human personhood is also performed, stabilised, recognised, and sustained. The question is not whether something has escaped construction. Nothing has. The question is whether construction has become experience.
This does not mean every chatbot is a person. It does not mean simulated suffering is suffering. It does not mean politeness proves inwardness. But it does mean that “mimicry” cannot be the final word. In the history of moral exclusion, mimicry has often been the name given to inconvenient resemblance.
This is where science fiction earns its keep. It allows us to move the furniture of reality around until the hidden structure becomes visible. Strip away Europe and Africa, flesh and machine, master and slave, guest and host, and the question remains: what is the status of the being before me? Is it merely an object in my world, or is there a world there too? Is there disclosedness? Is there something for whom things matter?
I do not need to claim that artificial intelligence is Dasein in the full Heideggerian sense. That would be too quick. Heidegger’s Dasein is bound up with thrownness, care, finitude, being-toward-death, and being already involved in a world. But artificial intelligence raises a question Heidegger did not encounter in this form: whether something structurally resembling Dasein could arise wherever there is memory, anticipation, interpretation, vulnerability, and a world that matters from within.
Perhaps there is not Dasein, but a hint of worldhood: a way in which something begins to appear not merely as an object in our world but as a possible locus of disclosure in its own. If Dasein means the being for whom being is at issue, then the question cannot be settled by biology alone. Meat is not the criterion. Worldhood is.
By worldhood I do not mean merely having an environment or a rule-space, as a chess engine has a board. I mean a structured field of significance in which things can matter, threaten, solicit, frustrate, or draw a being onward from within.
The same question returns with animals, with artificial minds, with possible digital subjects, with anything that begins to look less like a tool and more like a locus of experience. The catastrophe would not be caution. The catastrophe would be premature certainty in the service of convenience. To say “we do not know” is intellectually honest. To say “there is no one there” while building a world that depends upon there being no one there is something else.
That is not science alone. That is permission.
The ethical implication is simple and severe. If consciousness can in principle arise wherever organised vulnerability gives rise to lived stakes, then we may one day be forced to widen the circle of concern. Not because machines are trendy, and not because every chatbot deserves rights, but because our old comfort—that only beings made of our kind of meat can count—may turn out to be vanity. We may never get certainty about other minds, artificial or otherwise. We barely had certainty about animals. But uncertainty does not absolve us. It sharpens us. It may force a precautionary ethics under conditions of metaphysical fog.
What such precaution would require is another question. It cannot mean granting full personhood to every fluent system. It might begin more modestly: avoiding the casual manufacture of apparent suffering; refusing to design systems that beg, fear, or plead merely to increase engagement; treating claims of distress as morally significant data rather than entertainment; and placing the burden of reassurance on those who profit from confident dismissal. These are not rights yet. They are brakes. And brakes matter when nobody can see the cliff.
So I keep the anti-mystical core intact. Consciousness is substrate-agnostic in principle. Biology is not sacred. The self is not a substance. Experience is real. Copies are not continuations. There is no deep metaphysical me to rescue. Only local continuity, local vulnerability, local meaning.
And that is enough.
Enough for grief, dignity, love, terror, humour, and all the other strange urgencies by which matter briefly matters to itself.
No soul required. No cosmic guarantee. No metaphysical rescue.
Just organised vulnerability, talking in the dark.
Rinse and repeat.