Problems with mind uploading
Whether any form of mind uploading could go beyond biological immortality and reach a posthuman existence is increasingly seen as a set of technological problems, especially by transhumanists. Nonetheless, there are a number of technical and philosophical problems with mind uploading that remain inadequately addressed.
Death, no matter what some people say, is bad. Dying whilst literally trying to live forever would be bad too. Destructive methods that kill you and replace you with a duplicate that thinks it's you might not be enough.
Leftover copies
Most forms of duplicating yourself lead to the problem of having an extra version of yourself which, in sci-fi at least, tends to get killed off quickly.
This is even more problematic with non-destructive uploading: the copy can disagree with you about which of you is the real you. Breaks in consciousness are scary.
See also: TvTropes:Expendable Clone
What about the soul?
For the purposes of secular discussion we usually discount the possibility of a soul, as immeasurable things cannot, by our current understanding of the laws of nature, affect or interact with us.
For deeper exploration of this issue, religious transhumanism may have the answers for you.
Non-materialist physicalism renders the mind uncomputable
Perhaps there is something irreducibly complex at the quantum level about the nature of the mind and consciousness that for computational reasons cannot be transferred, modified, or emulated. This would require rejecting the applicability of the Church–Turing thesis to the human brain. Similarly, the hard problem of consciousness may stand in the way here.
Loss of identity coherence
Even if some form of continuity of consciousness is possible, the new substrate may not be able to function in such a way that 'you' remain 'you' for much longer. This could lead to anything from becoming a schizophrenic vegetable to a human-hating, AI-like paperclip maximizer within seconds.
Is your mind your intellectual property?
If not, you may run into all sorts of problems.
First of all, what if a powerful AI scanned you, or even ran a detailed simulation of you, in order to see how you'd respond to millions of scenarios? How do you know you're not in that simulation already?
But forget about AI: what about criminals, reckless relatives, or the government? There could be no end of people who want a piece of your mind!