Well, I suppose the benefit of consciousness isn't to the person/thing actually experiencing it, but rather to its surroundings. So the idea of advanced AI consciousness is perhaps more beneficial in the sense that it allows any intelligence at all to resist the increasing entropy of the universe. Though, in the short term, the actual benefit is not just simpler space flight but also shorter space flight, in that a voyage could conceivably head to planets that would be uninhabitable for humans but fine for machines, allowing for a wider selection of 'habitable' planets and therefore a better likelihood of finding one that is nearby and doesn't require insane amounts of shielding, lightspeed technology, crazy fuel tech, or whatever.
Though I feel I'm talking way out of my depth here.
The whole universe is basically made of information, so I'm ok with seeing my identity as the same. My mind is software running on a piece of hardware that is my body. But there's still no getting around a certain absolute notion of identity that can be lost. Your current continuity of experience is a unique instance of you. There's you as the set of information that comprises the software that is your mind, and there's you as the subjective experience sustained by the operation of that software. If that continuity is broken in the process of backing up your mind like a save file to be copied, then that instance of you is gone, and the next is a copy. The longer I dwell on this, the more comfortable I am with saying that it's still you. Just not in an absolute sense. I'd prefer to retain that absolute original identity as long as possible, but it will have to be given up one day. At that point, I don't think there's anything wrong with loading up that save file on new hardware so that the program that is me can keep running.
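To make the save-file analogy a bit more concrete (purely a toy sketch in Python, nothing to do with how an actual upload would work): a deep copy of an object ends up with identical contents but is a distinct instance, which is basically the distinction I'm trying to draw between you-as-information and you-as-this-particular-running-instance.

```python
import copy

# Hypothetical toy stand-in for "the set of information that is a mind".
mind = {
    "memories": ["first day of school", "that one beach trip"],
    "preferences": {"coffee": "black"},
}

# The "backup": a deep copy carries over every bit of the information...
backup = copy.deepcopy(mind)

print(backup == mind)  # True  -- identical contents; the pattern survives
print(backup is mind)  # False -- but a different instance; this continuity doesn't
```

Whether that difference actually matters is the whole philosophical question, of course, but equality-versus-identity is the cleanest programming analogue I can think of.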
You know, this is just my opinion, but using that kind of technology to preserve the consciousness of an individual seems like a waste of such highly advanced tech, and honestly perhaps even harmful to its practical uses. If a person's AI is being used so they can continuously perform a task (like piloting a spacecraft) that a normal human couldn't be expected to do, that's great, but if it's simply being used to assuage someone's existential fear of dying, that seems like an abuse of the tech toward a pointless end. Kinda reminds me of the end of the game Soma, if you've played that.
At the end of the game, you've been someone's AI autonomously piloting a robot suit for the whole game, but you're at the bottom of the ocean, Earth is fucked, and the only 'escape' is to upload your consciousness into a space pod that's set to get launched from Earth. You do so, and right when the upload finishes... nothing happens! Your character is justifiably like "WTF, why am I still here?" and your sidekick is like "No, stupid, you just made a copy of yourself! You're still going to die here, but your copy gets to live on!"
It was the best blue balls ending to a game I've ever seen.
I guess you could use it so that your living family can always stay in touch with you: decades and centuries after you die, your descendants could talk to their predecessors, who all upload their consciousness into the big family hard drive right before they die. But again, that's for the benefit of the people still living; wanting to extend your "life" via a vicarious digital doppelganger seems like an over-appraisal of one's own life and the experience of living. Even for the life of your digi-doppelganger, what value is his life if it's just a bunch of circuits firing off endlessly in a simulated dreamscape?
I guess you could do cool stuff though, like upload your consciousness into a computer at each stage of your life, so you can always load up the save file and 'talk' to your 5-, 10-, 15-, 20-, 30-year-old (and so on) 'yous' and do an objective comparison of yourself as you get older, like a nostalgia machine.
Even if you can't get around the continuity paradox, a full mental "upload" would still have immense value to society, particularly if the design were less susceptible to memory editing than the human brain is. Let us ignore the obvious "imagine what science would have become if Newton, Einstein, and Hawking had been immortal" or "imagine hearing the Ninth Symphony conducted by Beethoven himself" ideas and look at the small scale. There is a huge debate about race relations in the US - imagine if we could sit down and chat with ex-slaves and former slaveowners. Imagine being able to sit down across from an English peasant from the time of Elizabeth I, or a Russian serf who saw the transition from the Czar to the hammer and sickle.
In other words, imagine how much richer and more complex the world would be if the lives of those who came before us never faded into nothing more than words in a dusty history text or memoir. Imagine if the lives of common people were never lost entirely to the sands of time. Even if it wasn't the same person from a personal point of view, don't you think that would be worthwhile?