TL;DR: we're chemical reactions wearing clothing.
Let's suppose you're right. If so, do you see how removing the chemicals might be a problem?
I gave the analogy a couple pages ago: drawing a picture of the relationships involved in an explosion does not result in an explosion. Draw a picture of the molecules that compose gunpowder and draw a picture of a blowtorch...that picture will fail to create an explosion.
Even if you assume that consciousness is a result of physical processes, it's still quite a leap of faith to assume that "uploading" in the sense of creating a software copy of your brain's neural network would also result in consciousness.
Your metaphor would be apt if we were claiming that an uploaded consciousness was a fleshy brain doing what a fleshy brain does, but we're arguing that a simulation would give rise to the same emergent phenomena as our wetware does, which is an important distinction. I doubt anyone would argue that a program on a computer is literally a brain made of water and protein.
I think a better metaphor would be writing an algorithm on paper, line for line, and then transferring it to code. To bind it more closely to the current debate, let's say that transcribing it to code requires us to erase the algorithm on the paper.
We can execute the algorithm with our wetware (well, we could before we transcribed it): just walk through it, again step by step, and keep a tally of each variable at each step. We can also run the code on the computer. I would argue that the two algorithms are functionally identical. That is, given the same inputs, they'd produce the same results, including whatever phenomena arise out of the code we've written, such as a bug sending a variable out of bounds or a bug causing the whole thing to enter an infinite loop.
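To make that functional-identity point concrete, here's a minimal sketch in Python (the doubling routine and the name `double_until` are my own hypothetical stand-in, not anything specified in the debate): whether you trace it by hand on paper, keeping a tally of each variable, or run it on a machine, the same inputs yield the same tally.

```python
# A toy stand-in for the "algorithm on paper": repeatedly double a value
# and record every intermediate step, exactly the tally you would keep
# when walking through it by hand.

def double_until(limit: int, start: int = 1) -> list[int]:
    """Double `start` until it exceeds `limit`, returning the tally of every step."""
    tally = [start]
    value = start
    while value <= limit:
        value *= 2
        tally.append(value)
    return tally

# Traced on paper or executed by the machine, the same input gives the
# same sequence of states, which is the sense of "functionally identical".
print(double_until(100))  # [1, 2, 4, 8, 16, 32, 64, 128]
```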
Yes, the code on the computer isn't the same as it was on paper; it's just hard ones and zeroes, after all, compared to the elegance of letters and numbers. But it acts the same in practice.