Maybe we should step back from this and ask what actually counts as a "person". Maybe even what counts as "human", since we're going to end up there eventually anyway. If you could "upload" your brain to a computer, would you still be human? And hypothetically, if an AI could "switch brains" with a human, would either of them be human?
My definition of a person is basically anything that has self-awareness and is capable of "higher thought". The internet, though, also defines it as a human or human being (and those two, strangely enough, turn out to be very different things). Oddly, I find "person" the easiest of these to define. I would define a human as something with the mind of a human: a computer with an uploaded brain is still human, right? A human body with an AI mind, however, is not. Then you get into all sorts of crazy stuff with "human BEING". Wikipedia defines a being in terms of one's self-concept, though, so I suppose that would be the mind of the body, or the computer, or whatever.
Legally, though, could an AI be sued? Or convicted of a crime? Or would its creator be, even if the creator had no say in its actions? Let's face it: for something to be intelligent enough to be capable of higher thought, it probably has to have free will; otherwise it would just arrive at whatever answer its creator would have.