Been listening to Slowbeef's "Visual Novel Book Club", which is what it sounds like: 5 people chatting about a visual novel they play between sessions. They went on a tangent about the Star Trek Teleporter dilemma.
It's a problem I used to encounter a lot in sci-fi short stories, but I hadn't considered it much recently. Then I watched a humorously animated short story about it:
https://www.youtube.com/watch?v=pdxucpPq6Lc
It was pretty standard up till 3:10, where it added an interesting question: what if the clone were painlessly destroyed instead? It's pointless, but is it actually worse than not using the machine at all? Why would destroying a thing be worse than never creating it?
Of course, if you believe there's some part of a human that isn't being copied, the argument doesn't apply. I'm not sure about that. Am I a spiritualist, or merely a coward?
My two favorite short stories that used this trope:
A man wakes up in a computer simulation. He knows this because he just scanned himself. Again. After illegally disabling the opt-out self-destruct routine. He was determined to figure out why all his copies kept using it. He had a girlfriend (but now he's a copy).
A man is created at a distant space colony. FTL travel is impossible; only information can make the trip. His "original" keeps licensing his engram for these outworld projects. I don't remember this one clearly, but it addresses the video's 3:10 point: if *you* are guaranteed to be fine, is it immoral to spawn a clone into a worse situation? Not even a hellish one, in this case: morale is a recognized concern, but it's just hard work in a new world, cut off from everyone you ever knew. (I think the story involves him discussing all this with his original for some reason.)
Edit: I do like that there's an appropriate SMAC quote in the comments: "And what of the immortal soul in such transactions? Can this machine transmit and reattach it as well? Or is it lost forever, leaving a soulless body to wander the world in despair?"