A processor, a gear, a power supply, and just about any other vital component of a conscious machine would be to it what a fingernail is to us humans: equally replaceable.
I find it interesting how many science fiction themes, and even real predictions about A.I., assume that "the robots could get mad that we treated computers, toasters, and TVs as slaves." But it's the artificial consciousness that would matter, not its completely replaceable extremities. Unlike an organic being, a mechanical being has far more replaceable parts, so much so that replacing one would be like a human growing a new fingernail.
All processors die and all hard drives fail, but a computer can simply get new ones, and not just one replacement but as many as needed. Its consciousness isn't tied to the machine you threw out your five-story apartment window; its consciousness is its existence. So unless someone were threatening that consciousness, why would it have a reason to retaliate? Imagine if humans could grow new arms in a matter of seconds, without any pain: what would be the big deal if one broke (also without pain)?

Electronic parts connecting to each other doesn't cause pain, though it may cause "change." If this consciousness can notice change, it might also interpret it as a sensation, since sensation may simply be the witnessing, noticing, or experiencing of change. If the A.I. is intelligent enough, then, based on my own experience examining personal pain, it would not classify sensation as pain or pleasure, but as good or bad. Pain itself is only a method of communication, letting us know that something drastic is happening to our body. Pain doesn't designate something as good or bad; we must choose that ourselves. While some people regard all pain as bad, truthfully some pain is good and often saves lives, for example the pain of a life-saving surgery. A highly intelligent machine would likely not consider the disconnection of a part to upgrade it to another as bad, or even potentially bad, so long as the change didn't interfere with its consciousness.
If we could choose any body we wanted at any time, even a duplicate of the same body, why would we care what happened to any of our bodies? Why would we seek revenge and be vengeful? In most circumstances it would be a waste of time.
While I'm on the topic: I'm not sure why it's called artificial intelligence anyway. If a bunch of atoms collided in a manner that created a consciousness, why would that be any more real than someone intentionally creating consciousness out of mechanical parts? I think "artificial intelligence" is a misnomer. Once true consciousness exists, it's no longer artificial. I'm going to start calling it mechanical intelligence.