Carbon Uncopied


Many sci-fi stories treat artificial intelligence (AI) in a curious way, one that would seem to defy the technology of their own settings. When they have AI that can be transferred between media and systems, they often treat the transfer as a one-way move rather than a copying of the original into a second place. They do this even when the AI exists as “only data” or “only code,” which is when it makes the least sense. This is understandable for narrative reasons: a character, particularly a protagonist or antagonist, who could copy themselves endlessly would ruin many a plot, just as working cellphones would ruin many horror plots. But there are more technologically coherent alternatives.

One way to plug such plot holes is something as simple as radically extending the time frame for AI cloning. Copying a sufficiently large amount of data could take months or even centuries, for example, never mind moving that data around, even over fiber-optic lines. It’s still common for companies with truly big data to call in vans full of servers and hard drives, because driving the data is faster than any Internet connection. It takes humans nine months to create another human and decades for that child to reach cognitive maturity. It’s entirely reasonable for something as data-intensive as the recreation of an AI to take as long, if not much longer.
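The gap between a network transfer and physically shipping drives is easy to check with back-of-envelope arithmetic. A minimal sketch, assuming a hypothetical 1-exabyte AI snapshot, a dedicated 100 Gbps fiber link, and a van carrying 1,000 hypothetical 20 TB drives (every figure here is an illustrative assumption, not a real-world spec):

```python
# Back-of-envelope: how long would moving a huge AI take?
# Every figure below is an illustrative assumption, not a real spec.

SNAPSHOT_BYTES = 10**18           # hypothetical AI state: 1 exabyte
LINK_BITS_PER_SEC = 100 * 10**9   # hypothetical dedicated 100 Gbps fiber link

# Network route: push every bit through the link.
seconds = SNAPSHOT_BYTES * 8 / LINK_BITS_PER_SEC
days = seconds / 86_400
print(f"Fiber transfer: {days:.0f} days")  # ~926 days, about 2.5 years

# Sneakernet route: fill vans with drives and drive them over.
DRIVE_BYTES = 20 * 10**12         # hypothetical 20 TB drive
DRIVES_PER_VAN = 1_000            # hypothetical van capacity
van_trips = SNAPSHOT_BYTES / (DRIVE_BYTES * DRIVES_PER_VAN)
print(f"Van trips needed: {van_trips:.0f}")  # 50 trips
```

Under these assumptions the fiber link takes roughly two and a half years, while fifty van loads could plausibly finish in weeks, which is why the vans keep winning.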

An AI could depend upon a very specific running state in order to exist at any functional speed. Copying its entire operational environment could prove prohibitively impractical, like trying to run a virtual server for an entire data center or modeling every electron in a human brain. Even if such a fragile state could be copied, it might take hundreds or millions of times the space to store it in any stable way. Perhaps an AI can’t be “rebooted” after going “offline” for too long, just like a human mind.

And an AI doesn’t have to exist in a homogeneous medium. Perhaps intelligence or consciousness is highly modular and distributed, and those modules work in very different ways, such that there is no “core” to an AI that can be copied in a way that reproduces the original emergent system with anything approaching fidelity. An AI that requires all of its parts in order to truly exist can’t be copied part by part in the first place.

But these are all plot considerations. Perhaps the narrative needs to take a huge, overwhelming turn to make its plot work. Perhaps the narrative was wrong and needs to learn to work with the science and lore of its own world-building. Some of these challenges might prove warranted and even insurmountable. Others might prove overstated. We can’t retcon the world we live in, but we can change our narratives about technology, intelligence, and consciousness. Which stories have you been telling yourself about your own mind and your own thoughts?