
If you create a true artificial intelligence, are you morally justified in doing whatever you want to that creation? Can you hurt them if you want?


Imagine a hypothetical AI of the kind we see in science fiction: a truly real person, artificially created.
Is it a moral action for you to cause that being to suffer if they do not meet your standards? They are utterly your creation; they do not exist without your act of creation. Does that mean you can torture, abuse, or subjugate them and still be morally justified?

That is the argument that theists use for god having the right to inflict suffering on humanity.
Is it still a satisfying argument when we remove the conceits we allow for god?
DDonde · 31-35, M
This doesn't answer your question, but why make one at all if you can achieve the same thing without creating personhood?
I can't think of any reason to create an artificially sentient being without it being because of some kind of god complex someone has. It's not useful.

To answer your question: If it has personhood, then treat it like a person. If it doesn't have personhood, then I suppose it doesn't matter. Although I realize I've just pushed the answer onto the definition of personhood instead of actually answering the question.
@DDonde

I dunno. Why do people climb mountains? lol Stupid monkeys just do stuff because we can.

"If it has personhood, then treat it like a person."

I think that's appropriate.
So you would say that as the creator you still don't have a right to do whatever you want to the person you created?
DDonde · 31-35, M
@Pikachu Correct. I don't think being a creator absolves you of moral responsibility.
@DDonde

Then we are in agreement