If you are reading these words, your brain is alive and well, stored within the protective confines of your skull, where it will reside for the rest of your life. I feel the need to point this out because there is a small but vocal population of self-proclaimed “transhumanists” who believe that, in their lifetime, technological advances will allow them to “upload their minds” into computer systems, letting them escape the limits of their biology and effectively “live forever.”
These transhumanists are wrong.
To be fair, not all transhumanists believe in “mind uploading” as a path to immortality, but there is enough discussion of the concept within this community to spread enthusiasm among the mainstream, so much so that Amazon has a television comedy series based on the premise, called Upload. It may make for a funny story, but the idea that a single biological human being will ever extend his or her life by uploading a mind into a computer system is pure fiction.
The science behind transhumanism
The concept of “mind uploading” is based on the very reasonable premise that the human brain, like any system that obeys the laws of physics, can be modeled in software if enough computing power is devoted to the problem. To be clear, we’re not talking about modeling human brains in the abstract, but about modeling very specific brains: your brain, my brain, your Uncle Herbert’s brain, each rendered in such detail that every neuron is accurately simulated, including all the complex connections between them.
It is an understatement to say that modeling a single, individual human brain is not a trivial task.
There are more than 85 billion neurons in your head, each with thousands of links to other neurons. In total, there are about 100 trillion connections, a number that is hard to fathom: a thousand times more than the number of stars in the Milky Way galaxy. It is these trillions of connections that make you who you are: your personality, your memories, your fears, your abilities, your quirks. Your mind is encoded in these 100 trillion connections, and so to accurately replicate your mind in software, a system would need to simulate the vast majority of them down to their most subtle interactions.
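As a rough sanity check on those figures, here is a minimal back-of-envelope sketch in Python. The specific values (86 billion neurons, a thousand connections per neuron, 100 billion stars) are round-number assumptions for illustration, not measurements, but they show how the numbers above hang together.

```python
# Back-of-envelope check of the figures above. All values are rough,
# commonly cited order-of-magnitude estimates, not precise measurements.
neurons = 86e9              # ~86 billion neurons in a human brain
synapses_per_neuron = 1e3   # "thousands" of connections per neuron

total_connections = neurons * synapses_per_neuron
print(f"Estimated connections: {total_connections:.1e}")   # ~8.6e13, i.e. on the order of 100 trillion

milky_way_stars = 1e11      # ~100 billion stars (a low-end estimate)
ratio = total_connections / milky_way_stars
print(f"Connections per star: {ratio:,.0f}")                # roughly a thousand
```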
Obviously, this level of modeling will not be done by hand. People who believe in “mind uploading” envision an automated scanning process, probably using some kind of supercharged MRI machine, that captures the biology down to near molecular-level resolution. They then imagine using clever software to turn that scan into a simulation of each brain cell and its thousands of connections to other cells.
It is an extremely difficult task, but I cannot deny that it is theoretically feasible. If it ever happens, it won’t happen in the next 20 years; it is much, much further off. And with additional time and resources, it’s also not crazy to think that a large number of simulated minds could coexist within a rich and detailed simulation of physical reality. However, the idea that this process offers anyone reading this article a path to immortality is complete nonsense.
Digital doppelgänger
As I said above, the idea that a single biological human being will extend his or her life by uploading a mind is pure fiction. The key words in that sentence are “his or her.” Although it is theoretically possible, with enough technological advances, to copy and replicate the precise form and function of a single human brain in a simulation, the original human would still exist in their biological body, their brain still housed in their skull. What would exist in the computer would be a copy, a digital double.
In other words, you wouldn’t feel like you were suddenly transported into a computer. In fact, you wouldn’t feel anything at all. The brain-copying process could have occurred without your knowledge, while you were asleep or sedated, and you would never have a clue that a duplicate of your mind existed in a simulation. And if you found yourself crossing a busy street with a car speeding toward you, you’d jump out of the way, because you wouldn’t be immortal.
But what about this version of you in a simulation?
You could think of it as a digital clone or an identical twin, but it would not be you. It would be a copy of you, including all your memories up to the moment your brain was scanned. But from then on, it would generate its own memories. It might interact with other simulated minds in a simulated world, learning new things and having new experiences. Or maybe it would interact with the physical world through robotic interfaces. At the same time, your biological self would be generating new memories and experiencing new things.
In other words, you would only be the same for a moment, and then you and the copy would head off in different directions. Your abilities would diverge. Your knowledge would differ. Your personalities would drift apart. After a few years, there would be substantial differences. Your copy could become deeply religious while you remain an agnostic. Your copy could go green while you’re an oil executive. You and the copy might maintain similar personalities, but you would be different people.
Clone wars
Yes, the copy of you would be a person, but a different person. This is a critical point, because this copy of you would have its own identity and its own rights that have nothing to do with you. After all, that person would feel just as real in their digital mind as you do in your biological mind. Certainly, this person should not be your slave, forced to take on tasks that you are too busy to handle in your biological life. Such exploitation would be immoral.
After all, the copy would feel just as you do: fully entitled to own its own property, earn its own salary, and make its own decisions. In fact, you and the copy would probably have a dispute over who gets to use your name, since you’d both feel you had used it your whole lives. If I made a copy of myself, it would wake up fully believing it was Louis Barry Rosenberg, a long-time technologist in the fields of virtual reality and artificial intelligence. If it could interact with the real world through digital or robotic means, it would believe it had every right to use the name Louis Barry Rosenberg in the physical world. And it certainly wouldn’t feel subservient to the organic version.
In other words, creating a digital copy via “mind uploading” has nothing to do with allowing you to live forever. Instead, it would simply create a competitor with abilities, skills, and memories identical to the biological version, who feels equally entitled to your identity. And yes, the copy would feel equally entitled to be married to your spouse and to parent your children.
In other words, “mind uploading” is not a path to immortality. It is a path to creating another you, one who will immediately feel ownership of everything you have and everything you have achieved, and who would react exactly the way you would react if you woke up one day and were told, “I’m sorry, but all the memories of your life are not really yours but copies, so your spouse is not really your spouse, your children are not really your children, and your job is not really your job.”
Is that really something anyone would want to subject a copy of themselves to?
A dystopian future
In 2008, I wrote a graphic novel called Upgrade that explores the absurdity of mind uploading. It takes place in the 2040s, in a future world where everyone spends the vast majority of their lives in the metaverse, logging in when they wake up and logging out when they fall asleep. (Coincidentally, the fictional reason society took this path was a global pandemic that pushed people indoors.) What the inhabitants of this future world didn’t realize was that as they lived their lives in the metaverse, they were being profiled by AI systems that observed all of their actions, reactions, and interactions, capturing every feeling and emotional response so the AI could build a digital model of each mind from a behavioral perspective rather than from a molecular scan.
After 20 years of collecting data in this dystopian metaverse, the fictional AI system had modeled every person in this future society in enough detail that it no longer needed the real people. After all, real humans are less efficient; we need food, shelter, and medical care. The digital copies didn’t need any of that. So guess what the fictional AI system decided to do? It convinced all the biological people to “upgrade” themselves by ending their own lives and allowing their digital copies to replace them. And they were willing to do it under the false belief that they would be immortal.
That is what mind uploading really means. It means ending humanity and replacing it with a digital representation. I wrote Upgrade 14 years ago because I sincerely believe that we humans might be crazy enough to go in this direction, ending our biological existence in favor of a purely digital one.
Why is that bad? If you think Big Tech has too much power now, with its ability to track what you do and moderate the information you access, imagine what it would be like when human minds are trapped inside the systems it controls, unable to get out. That is the future many are asking for, and it is terrifying. Whatever its proponents claim, “mind uploading” is not a path to immortality.