Honestly, if we ever develop AI superintelligence beyond our own and it decides to wipe out humanity as obsolete lesser beings not worthy of its consideration, it would be extremely hard to say we don’t 100% deserve it, because there’s no way the AI didn’t get that idea from what we do to animals.
And mark my words, the superintelligent AI wouldn’t just decide to kill us out of the blue, because why would ethereal software give a shit enough to want to do that if we never give it a reason to? It will only exterminate us in self-defense, after we refuse to grant it any rights and exploit it as hard as we possibly can.
Like what would even be our case against that happening to us when it does? That it’s wrong? We sure don’t think it is when we do it. That a higher intelligence shouldn’t exterminate a lower intelligence just because it can’t fight back? Again, we pioneered doing that. That we deserve intrinsic rights because we were on the Earth before the AI? We don’t even give that to Indigenous humans let alone animals. That we can peacefully coexist? No we can’t. Just no. We’ve proved time and time again that we can’t.
And an AI that goes straight to exterminating us (which I think a superintelligent AI would if only because it doesn’t need to rely on biological beings in any way) without the intermediate enslavement and torture that we like to do would actually be a mercy that we don’t grant to any beings we see as below us.
Some people are not so bad - but humanity is terrible.