
Author Topic: Robots & Androids - the REAL problem with them....  (Read 5363 times)

James Gillen

  • Caress Me Down
  • Hero Member
  • Posts: 3981
Robots & Androids - the REAL problem with them....
« Reply #60 on: March 21, 2017, 09:41:51 PM »
The real problem with robots:
Robosexuals.

jg
-My own opinion is enough for me, and I claim the right to have it defended against any consensus, any majority, anywhere, any place, any time. And anyone who disagrees with this can pick a number, get in line and kiss my ass.
 -Christopher Hitchens
-Be very very careful with any argument that calls for hurting specific people right now in order to theoretically help abstract people later.
-Daztur

jhkim

  • Hero Member
  • Posts: 11749
Robots & Androids - the REAL problem with them....
« Reply #61 on: March 21, 2017, 10:10:28 PM »
Quote from: Lynn;953179
Can droids be considered sentient merely because they present some form of self awareness?

Yes, those clone armies do count as sentient slave beings. That just doesn't jibe with the sort of morality the Republic seems to portray. It could also be that the Republic, because of its willingness to use slave labor, really doesn't have that great a moral high ground over the Empire.
Droids can be considered sentient because they present virtually every form of human thought, including the capacity for moral/ethical decision-making and even emotion and friendship. By all appearances, any test which would judge all human beings as sentient would also judge droids as sentient.

Quote from: Voros;953038
Brian Aldiss made what I thought was a very interesting argument regarding the humanized or 'feeling' robot in his history of sf, Billion Year Spree. He argues that the robot is a metaphor for dehumanization, usually due to the negative effects of technology or other causes on society.

So in his view to make robots 'like us' with feelings or soul is mere sentimentality (we've all seen/read that when it comes to this) that has none of the metaphorical charge of the original. Androids and AI are perhaps different as metaphors though.
Robots, androids, and AIs all have different metaphorical uses in different stories. I certainly think there are plenty of good stories that have humanized/feeling robots.

Tristram Evans

  • Hero Member
  • Posts: 1095
Robots & Androids - the REAL problem with them....
« Reply #62 on: March 22, 2017, 02:51:41 AM »
Quote from: jhkim;952974
If we can do that, then technology may also be advanced enough that we can genetically or surgically engineer a baby to grow up with an intense desire to help and care for humankind. Would it be ethical to create and sell such people as property? I would say no.

We are perhaps closer to that than one might imagine. I recently submitted a term paper exploring the moral ramifications of CRISPR technologies, which could in the near future make "designer babies" possible. The potential for ethical abuse of this technology is staggering, to the point that I proposed a universal ban on that application of the research, despite the potential medical benefits.

But I don't think conflating a living being, genetically engineered or otherwise, with an artificial construct is an analogous ethical concern. I'm answering this from my own moral standpoint, so all appropriate caveats implied, but my assessment is that the reason it is immoral to enslave or abuse living things, be they animals or humans, is that biological lifeforms have an emotional experience of the world extending beyond programmed or rational behaviour. Living things suffer; they experience pain, anguish, and despair. This is not a learned response, but a universal and innate quality of biological life. Programming an approximation of sentience into an artificial construct does not bestow these attributes. Therefore, I would say that the relevant question is not whether it's ethical to employ a sentient robot as a servant, but whether it's ethical to bestow sentience on a robot at all.

There is an assumption in a lot of science fiction that self-awareness somehow bestows or innately includes the independent development of emotions. I've always found this a slightly ridiculous proposition. I believe it's possible to program a thing to behave in a manner that's largely consistent with typical emotional responses, but this would not be the same as actually experiencing an emotion. The very idea that one could, through some advanced technology, actually bestow emotions on an artificial being is something I find morally repugnant, whereas the idea that emotions could spontaneously generate through some analogy to abiogenesis I find too absurd to seriously contemplate as an ethical concern.

So while it could provide some philosophical entertainment in a science fiction story, the degree to which I regard it as worthy of moral evaluation is about the same as any anxiety caused by the idea that Cthulhu might some day wake up.

As a tangent along those lines, however, in a recent thread there was some discussion regarding Harlan Ellison, a writer who never fails to evoke polarized opinions. The one story of his that I personally enjoyed, not due to the prose or plot so much as the overall concept, was I Have No Mouth, and I Must Scream. This to me is one of the few "realistic" explorations of the consequences of bestowing emotions on an artificial construct. It is a horrifying concept, and the story treats it as such.
« Last Edit: March 22, 2017, 05:07:19 AM by Tristram Evans »

AsenRG

  • Bloody Weselian Hippy
  • Hero Member
  • Posts: 5036
    • http://storiescharactersandsystemsinrpgs.blogspot.com/
Robots & Androids - the REAL problem with them....
« Reply #63 on: March 22, 2017, 04:22:08 AM »
Quote from: Tristram Evans;953200
We are perhaps closer to that than one might imagine. I recently submitted a term paper exploring the moral ramifications of CRISPR technologies, which could in the near future make "designer babies" possible. The potential for ethical abuse of this technology is staggering, to the point that I proposed a universal ban on that application of the research, despite the potential medical benefits.

But I don't think conflating a living being, genetically engineered or otherwise, with an artificial construct is an analogous ethical concern. I'm answering this from my own moral standpoint, so all appropriate caveats implied, but my assessment is that the reason it is immoral to enslave or abuse living things, be they animals or humans, is that biological lifeforms have an emotional experience of the world extending beyond programmed or rational behaviour. Living things suffer; they experience pain, anguish, and despair. This is not a learned response, but a universal and innate quality of biological life. Programming an approximation of sentience into an artificial construct does not bestow these attributes. Therefore, I would say that the relevant question is not whether it's ethical to employ a sentient robot as a servant, but whether it's ethical to bestow sentience on a robot at all.

There is an assumption in a lot of science fiction that self-awareness somehow bestows or innately includes the independent development of emotions. I've always found this a slightly ridiculous proposition. I believe it's possible to program a thing to behave in a manner that's largely consistent with typical emotional responses, but this would not be the same as actually experiencing an emotion. The very idea that one could, through some advanced technology, actually bestow emotions on an artificial being is something I find morally repugnant, whereas the idea that emotions could spontaneously generate through some analogy to abiogenesis I find too absurd to seriously contemplate as an ethical concern.

So while it could provide some philosophical entertainment in a science fiction story, the degree to which I regard it as worthy of moral evaluation is about the same as any anxiety caused by the idea that Cthulhu might some day wake up.

As a tangent along those lines, however, in a recent thread there was some discussion regarding Harlan Ellison, a writer who never fails to evoke polarized opinions. The one story of his that I personally enjoyed, not due to the prose or plot so much as the overall concept, was I Have No Mouth, and I Must Scream. This to me is one of the few "realistic" explorations of the consequences of bestowing emotions on an artificial construct. It is a horrifying concept, and the story treats it as such.

+1 to every single part of it. Well, except I've submitted no papers:).
Also, if a mechanical being that has feelings is ever enslaved, I'd consider the guilty party to be the one who made the decision to program a machine with emotions;).
What Do You Do In Tekumel? See examples!
"Life is not fair. If the campaign setting is somewhat like life then the setting also is sometimes not fair." - Bren

jhkim

  • Hero Member
  • Posts: 11749
Robots & Androids - the REAL problem with them....
« Reply #64 on: March 22, 2017, 11:06:44 AM »
Quote from: Tristram Evans;953200
But I don't think conflating a living being, genetically engineered or otherwise, with an artificial construct is an analogous ethical concern. I'm answering this from my own moral standpoint, so all appropriate caveats implied, but my assessment is that the reason it is immoral to enslave or abuse living things, be they animals or humans, is that biological lifeforms have an emotional experience of the world extending beyond programmed or rational behaviour. Living things suffer; they experience pain, anguish, and despair. This is not a learned response, but a universal and innate quality of biological life. Programming an approximation of sentience into an artificial construct does not bestow these attributes.
I'm curious - you are contrasting biological with artificial, but there is plenty of potential for artificial beings constructed using biological components. That is the oldest concept in science fiction, actually.

You evidently think that the results of genetic engineering are still life with feeling. However, what about bioengineering that assembles cells directly into a desired configuration? Would that still be life with feeling? Going further, what if the cells had different biochemistry/biomechanics than Earth life?


What I find interesting is that some people consider even the products of genetic engineering, like clones, to be OK to enslave - even though a clone is just like an identical twin. I think the real objection is that they don't like the idea of unnatural beings, and thus consider them unreal and lacking a true soul.

Personally, I think that feelings are inherently a product of the mechanics of the assembled brain - and it doesn't matter whether that brain is constructed by natural reproduction or artificial means. Hence, I do think that an artificial being can have feelings.


Quote from: Tristram Evans;953200
There is an assumption in a lot of science fiction that self-awareness somehow bestows or innately includes the independent development of emotions. I've always found this a slightly ridiculous proposition. I believe it's possible to program a thing to behave in a manner that's largely consistent with typical emotional responses, but this would not be the same as actually experiencing an emotion. The very idea that one could, through some advanced technology, actually bestow emotions on an artificial being is something I find morally repugnant, whereas the idea that emotions could spontaneously generate through some analogy to abiogenesis I find too absurd to seriously contemplate as an ethical concern.
Certainly science fiction does tend to anthropomorphize other beings, making them more human. Still, I don't think that emotions are the basis of ethics. For example, if we ran into aliens that were emotionless, but still intelligent, then I still think it would be wrong to kill them or enslave them. Conversely, I have no major issues with enslaving and killing chickens and other animals which do have feelings/emotions.

Intelligence and sentience are a key part of ethics for me.

Omega

  • Hero Member
  • Posts: 17093
Robots & Androids - the REAL problem with them....
« Reply #65 on: March 22, 2017, 11:25:06 AM »
I think a learning AI has the potential to develop sentience and emotions given time and better technology.

But it's not going to be the same as an organic in some ways.

Though take note that in more than a few SF stories and movies the AI makers cheat. They just copy human neural networks for instant sentience, with ALL the problems that entails. One of which is that they then likely still don't understand exactly how it works. It's not an AI; it's effectively a human brain in a box.

Ratman_tf

  • Alt-Reich Shitlord
  • Hero Member
  • Posts: 8331
Robots & Androids - the REAL problem with them....
« Reply #66 on: March 22, 2017, 11:36:33 AM »
Quote from: Omega;952968
He's referring to the prequel movies. Anakin and his mom are slaves. The Jedi are perfectly fine with this and do nothing.


Qui Gon freed Anakin and attempted to free his mother at the same time. I wouldn't call that Perfectly Fine, and Do Nothing.
The notion of an exclusionary and hostile RPG community is a fever dream of zealots who view all social dynamics through a narrow keyhole of structural oppression.
-Haffrung

Ratman_tf

  • Alt-Reich Shitlord
  • Hero Member
  • Posts: 8331
Robots & Androids - the REAL problem with them....
« Reply #67 on: March 22, 2017, 11:42:13 AM »
Quote from: CRKrueger;953130

Sure, you can argue that overthrowing evil governments will eventually lead to the Jedi being conquerors and crusaders, but I think it's more than that.  The Jedi aren't the Good to the Sith's Evil, and the Republic tolerates much we don't.


I think we all know the West tolerates a lot of oppression in order to get cheap sneakers and iPhones.
The notion of an exclusionary and hostile RPG community is a fever dream of zealots who view all social dynamics through a narrow keyhole of structural oppression.
-Haffrung

Skarg

  • Venerable Gamer
  • Hero Member
  • Posts: 2380
Robots & Androids - the REAL problem with them....
« Reply #68 on: March 22, 2017, 11:47:43 AM »
Quote from: jhkim;953234
I'm curious - you are contrasting biological with artificial, but there is plenty of potential for artificial beings constructed using biological components. That is the oldest concept in science fiction, actually.

You evidently think that the results of genetic engineering are still life with feeling. However, what about bioengineering that assembles cells directly into a desired configuration? Would that still be life with feeling? Going further, what if the cells had different biochemistry/biomechanics than Earth life?
Interjecting an opinion from the peanut gallery, as someone who agrees with what Tristram Evans wrote before: I would say that living things will have feelings, but if you cut & paste parts with bioengineering, it may have side effects on how they feel. Non-engineered living things have nervous systems that run throughout their bodies and grew there on their own, without bioengineering, and the whole nervous system (not just the brain) stores emotions and memories. People who have organ transplants often report various sorts of somatic and personality changes. So I'd expect bioengineering to affect how they think and feel.


Quote
What I find interesting is that some people consider even the products of genetic engineering, like clones, to be OK to enslave - even though a clone is just like an identical twin. I think the real objection is that they don't like the idea of unnatural beings, and thus consider them unreal and lacking a true soul.
Hmm. I suppose if someone believed that, they might think that way, or offer it as one argument to try to discourage cloning. Otherwise, I'd tend to expect that people who favor clone slavery wouldn't be terribly opposed to some non-clone slavery either. Some people don't seem to have much regard for the souls of the non-clones they wouldn't mind enslaving.


Quote
Certainly science fiction does tend to anthropomorphize other beings, making them more human. Still, I don't think that emotions are the basis of ethics. For example, if we ran into aliens that were emotionless, but still intelligent, then I still think it would be wrong to kill them or enslave them. Conversely, I have no major issues with enslaving and killing chickens and other animals which do have feelings/emotions.

Intelligence and sentience are a key part of ethics for me.
For me too... only it's also clear to me that animals do have feelings, emotions, experiences, suffering, and thoughts.

Lynn

  • Hero Member
  • Posts: 1982
Robots & Androids - the REAL problem with them....
« Reply #69 on: March 22, 2017, 12:06:33 PM »
Quote from: jhkim;953185
Droids can be considered sentient because they present virtually every form of human thought, including the capacity for moral/ethical decision-making and even emotion and friendship. By all appearances, any test which would judge all human beings as sentient would also judge droids as sentient.


Droids are not necessarily biological though. They don't seem to necessarily qualify as 'life'. Decision making and emulation of emotions can all be done in software without life.

Yes, I know there is an argument also for cyborgs - how much machine makes a cyborg a machine, and how much biological material makes something a 'being'? And what if the biological material is, in fact, just cloned cells?
Lynn Fredricks
Entrepreneurial Hat Collector

Omega

  • Hero Member
  • Posts: 17093
Robots & Androids - the REAL problem with them....
« Reply #70 on: March 22, 2017, 05:20:27 PM »
Quote from: Ratman_tf;953241
Qui Gon freed Anakin and attempted to free his mother at the same time. I wouldn't call that Perfectly Fine, and Do Nothing.

Qui Gon seems to be one of the rare exceptions.

jhkim

  • Hero Member
  • Posts: 11749
Robots & Androids - the REAL problem with them....
« Reply #71 on: March 22, 2017, 05:20:51 PM »
Quote from: Skarg;953243
Interjecting an opinion from the peanut gallery, as someone who agrees with what Tristram Evans wrote before: I would say that living things will have feelings, but if you cut & paste parts with bioengineering, it may have side effects on how they feel. Non-engineered living things have nervous systems that run throughout their bodies and grew there on their own, without bioengineering, and the whole nervous system (not just the brain) stores emotions and memories. People who have organ transplants often report various sorts of somatic and personality changes. So I'd expect bioengineering to affect how they think and feel.
I agree that the engineering would affect *how* they think and feel. However, I am suggesting that they still *could* think and feel. Ultimately, the question is, what are feelings?

1) Are they produced by neurons at all, or is there some immaterial spirit that creates them?

2) Are they unique to only naturally-grown neurons? Or could artificially-generated neurons in the same configuration still produce feelings?

3) If artificial neurons could work, then do they have to have exactly the same water/lipid/protein mechanical structures that we do - or could there be differences that still produce feelings?


Within nature, we can see a clear spectrum from unfeeling life (such as bacteria and lichens) to life with only rudimentary nerves like jellyfish or flatworms, working up to complex nervous systems like chickens and humans.

I am inclined to think that our sentience is a result of the higher-order structure of our nervous system - i.e. how our senses, memory, and thoughts interact. I don't think it is something uniquely tied to the water/lipid/protein structure of neurons.

----------

To connect this back to games, there are a bunch of potential characters in science fiction games, including:

1) Human clones and/or humans with genetically-engineered DNA.
2) Biological constructs ranging from Frankenstein's monster to the biological replicants of Blade Runner.
3) Mixed biological and mechanical constructs, like part-flesh Terminators or D.A.R.Y.L.
4) Purely mechanical constructs like Star Wars droids.
5) Nanotech constructs, like the T-1000 Terminator.

Any or all of these might be considered slaves. It could be interesting to ask how attitudes differ among these.

jhkim

  • Hero Member
  • Posts: 11749
Robots & Androids - the REAL problem with them....
« Reply #72 on: March 22, 2017, 05:43:39 PM »
Slightly off-topic regarding Star Wars...

Quote from: Ratman_tf;953241
Qui Gon freed Anakin and attempted to free his mother at the same time. I wouldn't call that Perfectly Fine, and Do Nothing.
Qui Gon legally bought Anakin in order to recruit him, and properly followed the slave laws of Tatooine - even though he flouted many other laws and authorities, including the Jedi Council. There is no indication that he supported the abolition of slavery in general. He did make a passing attempt to buy Anakin's mother to ease Anakin's recruitment, but he apparently gave up after that. His successor Obi Wan ostensibly carried out his wishes, but never attempted to free her in the years that followed. (If he had, Anakin's storyline might have been quite different.)

Quote from: Lynn;953244
Droids are not necessarily biological though. They don't seem to necessarily qualify as 'life'. Decision making and emulation of emotions can all be done in software without life.

Yes, I know there is an argument also for cyborgs - how much machine makes a cyborg a machine, and how much biological material makes something a 'being'? And what if the biological material is, in fact, just cloned cells?
Within science fiction in general, there is a wide range of beings that aren't strictly Earth-like biology. It's unclear what the brains of droids are made of, though it is implied to be metal electronics like the rest of them. There was a character in Cloud City (known as "Lobot") who evidently had a biological body but a machine-enhanced mind, as well as General Grievous and Darth Vader, who had mostly robotic bodies with biological minds.

crkrueger

  • Hulk in the Vineyard
  • Hero Member
  • Posts: 12559
Robots & Androids - the REAL problem with them....
« Reply #73 on: March 22, 2017, 05:58:29 PM »
Chemical, electrical, optical, etc. are essentially hardware, just transmission media. They would have nothing to do with emotions, which would be "software" or programming.

However, at the moment we are still practically clueless about how the brain encodes and stores information, let alone finding - never mind comprehending - the "source code" for human intelligence hidden in our DNA.

That's why AI is so dangerous.  We really have no idea what we're doing, and many of the AI programs make correlations and decisions without the programmers themselves understanding why.
« Last Edit: March 22, 2017, 06:02:26 PM by CRKrueger »
Even the "cutting edge" storygamers for all their talk of narrative, plot, and drama are fucking obsessed with the god damned rules they use. - Estar

Yes, Sean Connery's thumb does indeed do megadamage. - Spinachcat

Isildur is a badass because he stopped Sauron with a broken sword, but Iluvatar is the badass because he stopped Sauron with a hobbit. -Malleus Arianorum

"Tangency Edition" D&D would have no classes or races, but 17 genders to choose from. -TristramEvans

Spike

  • Stroppy Pika of DOOM!!!!!
  • Hero Member
  • Posts: 8105
  • Tricoteuse
Robots & Androids - the REAL problem with them....
« Reply #74 on: March 22, 2017, 06:16:27 PM »
Quote from: AsenRG;952627

Also, chat bots can keep up an ersatz conversation. That doesn't mean I'd hesitate to delete them, because once again, they aren't human;).


Very sad that I shall be the first to bring up the Turing Test... and point out that, as originally conceived, it is essentially useless for determining sentience. Because, yes, you could have an ersatz conversation with your toaster, and we all pretty much agree that it is not actually Intelligent, artificially or otherwise.
For you the day you found a minor error in a Post by Spike and forced him to admit it, it was the greatest day of your internet life.  For me it was... Tuesday.

For the curious: Apparently, in person, I sound exactly like the YouTube character The Nostalgia Critic. I have no words.
