forum

Is it ok to torture machines?

Topic Starter
abraker
OT misses my philosophical questions I know it <3

So suppose you have an android that behaves as humanly as it possibly can. It laughs, it hugs, it plays, it cuddles with you, helps you with the little things in life like your homework, basically what a friend or an SO would do. It looks as human as it can be. It blinks, can cry, is warm, it sleeps with you, eats, drinks, goes to the bathroom, etc. You can love it and it can love you back. You can depend on it and it won't let you down. Your relationship can be deeper than mutual.

It also has its own robot friends. It would go out and hang out with other androids from time to time, doing android things. It also works, has a job, and functions like a human would in everyday society. It works in an elementary school as a teacher and the kids learn much better from it than they would from a human teacher. It earns money, spends money, buys you gifts, you buy it gifts.

However, it has all human traits but one. It does not have a conscience. It behaves like it has one, but it's proven not to have one. As a result it's just a machine with skin and organs like a human, and programming that tricks you into believing it has feelings. You know it is practically immortal, other than the possibility of encountering a software error.

There are also no laws regarding android rights. No laws that make it illegal to "kill" an android either. Androids are purposefully made such that their programming doesn't allow them to rebel. More desperate people are known to dismantle androids that aimlessly wander into shady areas, selling the parts in the scrapyard, maybe even scraping some gold from the electronics.

So one day a friend comes up to you and offers you a bargain, a once-in-a-lifetime opportunity. This friend says that he will give you 100 million dollars, which is 50 times more than what the android cost you, if you give him a video of the android tied up, beaten and tortured to the point where a normal human would plead to just be killed. Why can't he get an android and do it himself if he is so rich, then? You are wondering the same thing, but with this money you can buy even more of these androids, so why not, right?

The answer of whether you would do it may come down to how emotionally attached you are to this particular machine, but is it moral to do so given all this?
levesterz
kai99
its ok until the robots lose their shit and take revenge on people
levesterz
Isn't that the three rules for robots?
One does not disobey its master as long as it's proper etiquette
One shall not harm any human
If both rules are disobeyed or any malfunction happens, it shuts itself down

Idk
Comfy Slippers
So basically, you're talking about a normal human that can't reproduce. But then again you could make a robot that makes a robot, which is reproduction in a nutshell. There would be no human need for physically active jobs either, which is slowly becoming a bigger issue even w/o this. Imo, discussing the topic(s) of "how we cope with this" or "how we alter our educational system" would be more interesting.

So in the end, we are just making a "perfect human" and you don't really have a need to draw a line between a human and an AI. We are "its gods", and through history we ourselves have defined god(s) as merciful deities. So yea, it'd be highly immoral.
Yukitemi
Why is it bad to torture a human?...
Did I say that out loud?
schism
W-W-W-WHY DO YOU ASK THIS QUESTION????


D-don't tell me..... this is another form of robot discrimination.....
click
This question has been recorded with my automatic recording function to sue you later......
Ephemeral

abraker wrote:

OT misses my philosophical questions I know it <3

The answer of whether you would do it may come down to how emotionally attached you are to this particular machine, but is it moral to do so given all this?
ok so your premise is:

robot that exemplifies a normal human in the following ways:
anthropomorphized expressions of 'emotion'
noted relationships with other robot and human entities
mimicry of human functioning in society
does not have a 'proven' conscience

the latter point and the lack of laws in this fictional universe to safeguard android/robot rights is the chief premise of your justification for the moral ambiguity here as far as i can tell

the biggest question to ask here is: how do you prove something has a conscience? and then after that is, does the lack of a conscience thereof permit moral disregard for the safety and health of that entity?
Meah
The true magic results from courage of the heart.
Boys and girls be ambitious.
One step can change the world.
Topic Starter
abraker

Ephemeral wrote:

the biggest question to ask here is: how do you prove something has a conscience?
Because such a premise is hard to satisfy otherwise, I have stated that in this fictional reality it has been proven that the AI has no conscience.

Comfy Slippers wrote:

So in the end, we are just making a "perfect human" and you don't really have a need to draw a line between a human and an AI. We are "its gods", and through history we ourselves have defined god(s) as merciful deities. So yea, it'd be highly immoral.
I think that is tied too closely to an individual's belief and not the entire society's view on it.

My discussion about this on discord concludes that it is indeed immoral, but not for any of the reasons stated here. Simply because it is counterproductive to disrupt something that is of benefit to us. Since the AI has integrated, or is starting to integrate, into society with positive results, it is not in our favor to destroy them. Indeed a rogue AI is another story, or an AI that does nothing to benefit anyone, but given the premise this seems to be the logical conclusion.
kai99
to be frank about this i'd say it is okay, because AI is only a replication of conscience and really just a program dedicated to making a machine look as if it has a human's traits. it doesn't actually feel anything, so torturing it isn't inflicting harm on a living being. as you've said it just boils down to the problem of the mutual emotional bond between each other
in terms of morals it would be more of a problem with your behavior i mean if you could do that to what looks like a human being there's no assurance that you won't beat up another human either, just for money

that being said, i've also seen a similar topic covered in a webcomic. in that fictional world, people who abandon animals have to pay a large fine, so people have created robots that look and act just like pets. abandoning these robotic pets is legal, so many people are giving up the robots for disposal. would this be morally acceptable... well i'd say it is
Comfy Slippers
abraker wrote:

I think that is tied too closely to an individual's belief and not the entire society's view on it.
I'm not looking at it from a religious standpoint, but rather how we shaped society and ethics as a result of religion. You can argue about religion as a whole and the existence of this supreme being(s), but you can't neglect the fact that it helped shape societies and bring people toward a certain goal.

I probably shoulda phrased it differently, hope this made it a bit more clear. :^)
B1rd

abraker wrote:

My discussion about this on discord concludes that it is indeed immoral, but not for any of the reasons stated here. Simply because it is counterproductive to disrupt something that is of benefit to us. Since the AI has integrated, or is starting to integrate, into society with positive results, it is not in our favor to destroy them. Indeed a rogue AI is another story, or an AI that does nothing to benefit anyone, but given the premise this seems to be the logical conclusion.
How is it immoral? A machine is not a living being, so it is not immoral to "torture" it just because it has a resemblance. However you can't say that destroying something is "not in our favour", because you can't say that the value of an object's destruction wouldn't bring about more value (through entertainment) than its continued existence. If you were to say that it were immoral, to be consistent, you would also have to say that videos on Youtube of people destroying stuff are also immoral. If someone is willing to pay $100 million for something's destruction, then it's pretty clear that that person values the destruction more than the rest of society values its existence. And in the first place, it's private property, for one to do with as one pleases.
Railey2

Ephemeral wrote:

abraker wrote:

OT misses my philosophical questions I know it <3

The answer of whether you would do it may come down to how emotionally attached you are to this particular machine, but is it moral to do so given all this?
ok so your premise is:

robot that exemplifies a normal human in the following ways:
anthropomorphized expressions of 'emotion'
noted relationships with other robot and human entities
mimicry of human functioning in society
does not have a 'proven' conscience

the latter point and the lack of laws in this fictional universe to safeguard android/robot rights is the chief premise of your justification for the moral ambiguity here as far as i can tell

the biggest question to ask here is: how do you prove something has a conscience? and then after that is, does the lack of a conscience thereof permit moral disregard for the safety and health of that entity?
it doesn't matter how you prove if something has a conscience, because the premise already defined the robot to not have one. That makes your first question completely irrelevant to the scenario at hand.

The better questions would be:
a) What is a conscience?
b) (as you correctly pointed out) what is its bearing on the moral question of whether or not it is ok to administer torture?

These two questions probably can't be answered in any meaningful way, though.


abraker wrote:

My discussion about this on discord concludes that it is indeed immoral, but not for any of the reasons stated here. Simply because it is counterproductive to disrupt something that is of benefit to us. Since the AI has integrated, or is starting to integrate, into society with positive results, it is not in our favor to destroy them.
that's a clever way to get around the problem of conscience.

i can agree to this.
indeed it wouldn't be in our best interest to make all these elementary school students cry because their beloved teacher got tortured.
Topic Starter
abraker

B1rd wrote:

If you were to say that it were immoral, to be consistent, you would also have to say that videos on Youtube of people destroying stuff are also immoral. If someone is willing to pay $100 million for something's destruction, then it's pretty clear that that person values the destruction more than the rest of society values its existence. And in the first place, it's private property, for one to do with as one pleases.
Putting the worth of destruction higher than what society values its existence as is an interesting concept. Logically sound, perhaps morally not. This is a can of worms I think I'll open up a bit later, so I'll hold off on arguing against this for now.

B1rd wrote:

However you can't say that destroying something is "not in our favour", because you can't say that the value of an object's destruction wouldn't bring about more value (through entertainment) than its continued existence.
It will satisfy a smaller group of people because the destruction is a one-time event. Its continued existence will give it the opportunity to satisfy a larger number of people through time.

In words you might understand, its destruction will result in a one-time large payment rather than a continual stream of smaller payments over a long enough time period that the total return is greater than that one-time large payment.
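That payout comparison can be put into a toy calculation: a one-time lump sum versus a small recurring benefit whose running total eventually overtakes it. This is only an illustrative sketch; the $2M/year "societal benefit" figure is invented for the example.

```python
# Toy model of the argument above: a one-time payout vs. a small
# recurring benefit whose cumulative total eventually overtakes it.
# The yearly benefit figure is purely hypothetical.

def breakeven_year(lump_sum: float, yearly_benefit: float) -> int:
    """First year in which the cumulative recurring benefit
    strictly exceeds the one-time payment."""
    total, year = 0.0, 0
    while total <= lump_sum:
        year += 1
        total += yearly_benefit
    return year

# $100M one-time payment vs. $2M of benefit per year
print(breakeven_year(100_000_000, 2_000_000))  # prints 51
```

Over an unbounded lifespan the stream always wins eventually, which is the shape of the argument here; whether satisfaction can actually be summed up like money is exactly the part under dispute.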
B1rd

abraker wrote:

Putting the worth of destruction higher than what society values its existence as is an interesting concept. Logically sound, perhaps morally not. This is a can of worms I think I'll open up a bit later, so I'll hold off on arguing against this for now.
It will satisfy a smaller group of people because the destruction is a one-time event. Its continued existence will give it the opportunity to satisfy a larger number of people through time.

In words you might understand, its destruction will result in a one-time large payment rather than a continual stream of smaller payments over a long enough time period that the total return is greater than that one-time large payment.
How can you possibly say for certain that the net satisfaction derived from its destruction will be lower than from its continued existence? How can you make any sweeping statements about morality based off of arbitrary and unprovable reasoning about the net satisfaction derived? As I have stated earlier, there is no fundamental difference between this and people destroying things on Youtube with a hydraulic press. There is no fundamental difference between this and the video game industry. You could argue that an enormous amount of resources, labour and human capital is being wasted on the development of games, and theorise that the satisfaction derived from the games produced is less than the resources used to make them. But how do we know that this likely isn't the case? Basic principles of economics. The consumer, acting in their own self interest, would not purchase games if they did not believe that the satisfaction they would derive from them was worth the money spent. Because value is subjective, we cannot determine whether any action or transaction is wasteful unless we have extensive information about the parties' minds and motives. But we can assume that generally, because people act in their self-interest, people do not waste resources.

Even if we can pinpoint a specific instance where someone is being wasteful or not benefiting society, can we say that it is "immoral"? No one has any obligation to use their private property to benefit the rest of society. Morality is a very ambiguous and subjective term, so we can't really make any objective conclusions about this being "immoral". However, what I can say is that if someone wasn't harming anyone else, it definitely would be immoral to initiate force upon that person because it was perceived that they weren't benefiting society with their own property as much as they could have.
Railey2
phones don't teach in elementary schools and don't have human friends, so there is a fundamental difference here.
Read the premise carefully.

Do you think that the concept of private property overrides all other moral considerations?
kai99

B1rd wrote:

However, what I can say is that if someone wasn't harming anyone else, it definitely would be immoral to initiate force upon that person because it was perceived that they weren't benefiting society with their own property as much as they could have.
^

well i guess it comes down to what criteria you would evaluate the person's actions with.
johnmedina999

B1rd wrote:

SPOILER
How can you possibly say for certain that the net satisfaction derived from its destruction will be lower than from its continued existence? How can you make any sweeping statements about morality based off of arbitrary and unprovable reasoning about the net satisfaction derived? As I have stated earlier, there is no fundamental difference between this and people destroying things on Youtube with a hydraulic press. There is no fundamental difference between this and the video game industry. You could argue that an enormous amount of resources, labour and human capital is being wasted on the development of games, and theorise that the satisfaction derived from the games produced is less than the resources used to make them. But how do we know that this likely isn't the case? Basic principles of economics. The consumer, acting in their own self interest, would not purchase games if they did not believe that the satisfaction they would derive from them was worth the money spent. Because value is subjective, we cannot determine whether any action or transaction is wasteful unless we have extensive information about the parties' minds and motives. But we can assume that generally, because people act in their self-interest, people do not waste resources.

Even if we can pinpoint a specific instance where someone is being wasteful or not benefiting society, can we say that it is "immoral"? No one has any obligation to use their private property to benefit the rest of society. Morality is a very ambiguous and subjective term, so we can't really make any objective conclusions about this being "immoral". However, what I can say is that if someone wasn't harming anyone else, it definitely would be immoral to initiate force upon that person because it was perceived that they weren't benefiting society with their own property as much as they could have.
What's the difference between destroying an iPhone on a hydraulic press on YouTube, and destroying a person on a hydraulic press on YouTube?
As society started to rise and develop, we needed to create some rules to keep it running smoothly, and some consequences for breaking those rules. Thus, we have been taught since the beginning of civilization that killing people is bad, and we shouldn't do it. If a robot fitting abraker's description were widespread, societal norms would evolve to add mistreating them to the big list of things we shouldn't do, and eventually, over the course of dozens of generations, humans would have the same outlook towards torturing robots as we have towards torturing humans.
Topic Starter
abraker

B1rd wrote:

How can you possibly say for certain that the net satisfaction derived from its destruction will be lower than it's continued existence?
Going completely logical here, an AI would outlast any human being and would last indefinitely if maintained. Unless the net satisfaction derived from its destruction is infinite, the AI would bring infinite net satisfaction by providing X satisfaction across infinite time.

B1rd wrote:

How you can you make any sweeping statements about morality based off of arbitrary and unprovable reasoning about the net satisfaction derived? As I have stated earlier, there is no fundamental difference between this and people destroying things on Youtube with a hydraulic press.
What Railey2 said. The AI being destroyed would dissatisfy its friends, the children it teaches, and the department it works for, and would satisfy only the person watching the video of it being destroyed. The things people destroy on Youtube would dissatisfy almost nobody and satisfy the many people who are subscribed to the channel.

B1rd wrote:

There is no fundamental difference between this and the video game industry. You could argue that an enormous amount of resources, labour and human capital is being wasted on the development of games, and theorise that the satisfaction derived from the games produced is less than the resources used to make them. But how do we know that this likely isn't the case?
This is what I call a case of "shooting yourself in the foot". Unlike the destruction of the AI, which goes against the benefit of society, the waste of resources on a game would go against the benefit of the company wasting them. A decision that harms nobody but the company itself. Let the company waste their own money if they want, since their money is their property and they can do what they want with their property. At least some people got satisfaction from it.

B1rd wrote:

Even if we can pinpoint a specific instance were someone is being wasteful or not benefiting society, can we say that is "immoral"? No one has any obligation to use their private property to benefit the rest of society. Morality is a very ambiguous and subjective term, so we can't really make any objective conclusions sbout this being "immoral".
Can we say that one has the obligation not to use their private property to damage the rest of society?

B1rd wrote:

However, what I can say is that if someone wasn't harming anyone else, it definitely would be immoral to initiate force upon that person because it was perceived that they weren't benefiting society with their own property as much as they could have.
This falls under the same category as the other argument I have decided to not go into in my previous reply. I'll leave this one open for later.