
Robots & Androids - the REAL problem with them....

Started by Koltar, August 28, 2009, 03:31:28 PM


jibbajibba

Quote from: pseudointellectual;326456Well it seems like all those other things you mentioned run counter to Humanity. So yeah you can upload your AI to a new body after you get shot in the chest but you lose Humanity as a result, or you can struggle with the "pain" like humans do.

Yeah, but as I say, unless getting to be human is a 'win' or at least a game objective, you have to say, "I am not human, I am a robot."
You might have roleplay-type penalties, but then what do you do with the player who rejects the pathetic cry for humanity from his brethren? Remember Call Me Kenneth...
No longer living in Singapore
Method Actor-92% :Tactician-75% :Storyteller-67%:
Specialist-67% :Power Gamer-42% :Butt-Kicker-33% :
Casual Gamer-8%


GAMERS Profile
Jibbajibba
9AA788 -- Age 45 -- Academia 1 term, civilian 4 terms -- $15,000

Cult&Hist-1 (Anthropology); Computing-1; Admin-1; Research-1;
Diplomacy-1; Speech-2; Writing-1; Deceit-1;
Brawl-1 (martial Arts); Wrestling-1; Edged-1;

OldGuy2

#31
Quote from: Silverlion;325025I really wish I could find a scan of the one page of Alan Moore's Top Ten, where the robot-racist Shock Headed Peter, is made fun of by their new officer the robot Joe Pi...for feeling up his retarded back woods cousin. (the soda machine when it didn't dispense his soda...:D)


The Great Comic Book Heroes Blogspot.ca - Joe Pi from Top Ten  Page 12

PS: Gack.  Necrothreading, sorry.  New to Forum.

Omega

Quote from: Koltar;325944This is to all of you referencing how unlikely the AI ideas are.....

According to page 511 of GURPS 4/e in the Tech Level descriptions, Artificial Intelligence will become commonplace sometime between 2026 and 2070. That's TL 9.

Also, TL 10 is designated the Robotic Age and has a tentative start of 2070.
That's only 61 years away - within the possible lifetime of many of us who post on here.

- Ed C.

FYI, I've mentioned this a few times. In the late '90s a friend of mine was developing a learning AI on a MUD that posed as a player. It moved about on its own (though I do not know the extent of its actions) and would strike up conversations and help players out. It was pretty good at it, too.

Around 2010 I saw a similar one on Second Life. This one was free-moving (it searched for players present and for maps to jump to) and was another learning AI that built up a conversation database.

Mind you, neither of these was sentient, but they could put up a darn good semblance of conversation.
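
For the curious, here is a minimal sketch (my own toy illustration, not the actual MUD or Second Life code) of how such a "learning" conversation bot can work: it records which lines follow which, then answers new prompts by fuzzy-matching them against prompts it has already seen.

Code:
# Toy "learning" chat bot: it remembers which replies followed which prompts,
# and answers new prompts by looking up the closest prompt it has seen.
# Illustration only - not the bots described above.
import difflib
from collections import defaultdict

class ChatBot:
    def __init__(self):
        self.responses = defaultdict(list)  # prompt -> replies seen after it
        self.last_line = None

    def learn(self, line):
        """Store the incoming line as a reply to whatever was said just before it."""
        if self.last_line is not None:
            self.responses[self.last_line].append(line)
        self.last_line = line

    def reply(self, prompt):
        """Answer with a learned reply to the closest known prompt, if any."""
        match = difflib.get_close_matches(prompt, list(self.responses), n=1, cutoff=0.5)
        if match:
            return self.responses[match[0]][-1]
        return "Tell me more."

bot = ChatBot()
for line in ["hello there", "hi, need any help?", "where is the market?", "north of the gate"]:
    bot.learn(line)

print(bot.reply("hello"))               # -> "hi, need any help?"
print(bot.reply("where's the market"))  # -> "north of the gate"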

Omega

Quote from: OldGuy2;951933The Great Comic Book Heroes Blogspot.ca - Joe Pi from Top Ten  Page 12

PS: Gack.  Necrothreading, sorry.  New to Forum.

That's OK. It's an interesting thread I missed.

DavetheLost

Quote from: Ian Absentia;325237Yeah, players can be freaks like that.  Ask me about "Baron Pinto von Runnabout" some time. Who programmed this robot to have choices?  I'm telling you, man -- I'd be hitting her upside the head with B.R.U.N.O. like yesterday's haddock.  B.R.U.N.O. doesn't "want" anything.  B.R.U.N.O. doesn't make "choices".  B.R.U.N.O. follows orders to the best that its programming and mechanical engineering will allow.

Okay, for real emotional impact, I'd play B.R.U.N.O. like an autistic child.  The player spends time and effort teaching the robot to behave like an autonomous, self-aware being, probably much to the player's satisfaction.  Then, sooner or later, I'd drop a scene on her that underscores that the development of autonomous behavior is merely the robot's interpretation of what it assumes are her commands.

!i!

Why punish a relatively new RPG player for actually playing in character and having an unexpected (in character) emotional response to something in game?

Ratman_tf

Part of the problem is the anthropomorphization of robots. Wall-E and The Brave Little Toaster show that, with the right cues, people can empathize with a box.
This is a big part of the design philosophy behind Cozmo.

https://youtu.be/aVpz8yBBKO0

Star Wars is a good example of how the idea is not consistent.

https://youtu.be/4O0-dCLFadQ

Why would someone design, build and program a robot that could be tortured? The answer is, they very likely wouldn't. That scene exists to show that Jabba and his minions are bad, and that C-3P0 and R2-D2 are in a dangerous place. It's not a statement on AI, but a device to increase tension in a story.

But you can nerdwank it by saying that Droids in Star Wars are learning machines, and pain is a great teacher. A robot that feels pain when it's damaged is likely to avoid being damaged. Etc.
How would a human know that a droid is in pain? Because it's programmed to squeal like a human or animal. That's much more intuitive than a digital readout or a status message that might not be noticed until it's too late to save your droid.
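
As a toy illustration of the "pain is a great teacher" point (entirely my own sketch, not anything from the films): treat damage as a negative reward, and even a trivial learning loop will steer the machine away from actions that hurt it.

Code:
# Toy sketch: "pain" as a negative reward that a droid learns to avoid.
# Purely illustrative - not a claim about how Star Wars droids work.
import random

actions = ["cross_hot_plate", "go_around"]
value = {a: 0.0 for a in actions}   # learned estimate of each action's outcome
ALPHA = 0.5                         # learning rate

def outcome(action):
    # crossing the hot plate damages the droid; going around does not
    return -1.0 if action == "cross_hot_plate" else 0.0

for step in range(50):
    # explore occasionally, otherwise pick the action currently judged least painful
    if random.random() < 0.2:
        action = random.choice(actions)
    else:
        action = max(value, key=value.get)
    pain = outcome(action)
    value[action] += ALPHA * (pain - value[action])

print(value)  # cross_hot_plate drifts toward -1.0; go_around stays near 0.0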

So are Droids in Star Wars aware like humans are? Or are they designed to emulate awareness to facilitate their use by human(oid)s?

I'm not going to give a definitive answer, because there is none in the movies. It's all storytelling device, and not a documentary on AI behavior.
I will say that, for myself, I consider the Droids of Star Wars to be as aware as animals but not necessarily as aware as human beings.

Though a character reacting to a Droid like it's a person is perfectly cool role-playing. :)
The notion of an exclusionary and hostile RPG community is a fever dream of zealots who view all social dynamics through a narrow keyhole of structural oppression.
-Haffrung

Omega

Quote from: Ratman_tf;952476Star Wars is a good example of how the idea is not consistent.

https://youtu.be/4O0-dCLFadQ

Why would someone design, build and program a robot that could be tortured? The answer is, they very likely wouldn't.

Um... you mean in a setting where people are shown to consistently get pleasure from tormenting anything? Of course some loons are making torturable robots, or modding them. Others are probably just the equivalent of exploits: this robot was designed with sensitive footpad sensors that, when you do this or that to them, create the cybernetic equivalent of pain.

And it also seems all these robots are being built sentient "just because", or possibly as a byproduct of the complexity of their jobs. Also, if you have sentient machines, installing pain sensors is one more layer of subjugation.

So yeah, the Star Wars setting is mean even to the robots.

3rik

#37
Quote from: Ratman_tf;952476Part of the problem is the anthropomorphization of robots. Wall-E and The Brave Little Toaster show that, with the right cues, people can empathize with a box.
This is a big part of the design philosophy behind Cozmo.

After clicking through some of the Cozmo videos, it seems rather boring: not particularly interactive at all, just noisy.
It's not Its

"It\'s said that governments are chiefed by the double tongues" - Ten Bears (The Outlaw Josey Wales)

@RPGbericht

AsenRG

Quote from: Koltar;324821I forgot the adjective - of course I was referring to human-similar shaped robots as we've seen in countless movies, TV shows, comic books and Science Fiction books over the years.


- Ed C.
"Android", as in the title of the thread, already assumes a human form:).

Quote from: Spinachcat;325199If humanity hasn't worried much about the enslavement of other human beings, why ever worry about a robot?
Yeah, this. If anything, most people would care even less* about a non-human, even if it was sapient, IME. (I most certainly wouldn't, not before all humans were free to live their lives as they please...which means "never", yeah).
Second, why is it supposed to be wrong? If you don't want your robots suffering, there's a simple solution: don't give them feelings! They don't need them to do their jobs, and in fact it can be argued that feelings would interfere with said jobs.

*And before you object that you care deeply, here's a reality check for you: do you even know how many slaves there are in present-day Mauritania? When did you last do anything about it? When did you last do anything about all the people living in slave-like conditions?
BTW, my answers are: yes, I do know; and 2015, when I last donated to a charity fighting against forced labour (but I still wouldn't bat an eye at androids being crushed in a hydraulic press - they aren't human, or even alive).
And if your answers are worse than that... consider the reality check failed.

Quote from: jibbajibba;326382Interestingly (well I say interesting...) the word Robot was penned by the Czech playwright Capek to mean an artificial person. His robots were not mechanical and achieved sentience (RUR - 1920ish Karel Capek robot derived from the Czech word robota meaning labour)

Therefore strictly speaking robots by definition really have to look like people.
Actually, no, "robot" comes from the Czech for "(forced) labour". They wouldn't be robots if we didn't make them work;)!
What Do You Do In Tekumel? See examples!
"Life is not fair. If the campaign setting is somewhat like life then the setting also is sometimes not fair." - Bren

Simlasa

Quote from: AsenRG;952618"Android", as in the title of the thread, already assumes a human form:).
But if I can talk to my toaster, and if it can respond back... keep up ersatz conversation at about the same level as the average bartender (which computers can already do), then I'm going to feel a whole lot worse if I let her get full of crumbs... and be really hesitant to throw her in the bin on a whim to buy a shiny new one. Unless I can transfer her 'personality' and voice into the new one.

AsenRG

Quote from: Simlasa;952622But if I can talk to my toaster, and if it can respond back... keep up ersatz conversation at about the same level as the average bartender (which computers can already do), then I'm going to feel a whole lot worse if I let her get full of crumbs... and be really hesitant to throw her in the bin on a whim to buy a shiny new one. Unless I can transfer her 'personality' and voice into the new one.

Your toaster is unlikely to be an android:).

Also, chat bots can keep an ersatz conversation. Doesn't mean I'd hesitate to delete them, because once again, they aren't human;).
What Do You Do In Tekumel? See examples!
"Life is not fair. If the campaign setting is somewhat like life then the setting also is sometimes not fair." - Bren

Simlasa

Quote from: AsenRG;952627Your toaster is unlikely to be an android:).
All you'd have to do is put a face on it, even an inert one, and people would bond with it all the more.

Quote: Also, chat bots can keep an ersatz conversation. Doesn't mean I'd hesitate to delete them, because once again, they aren't human;).
But people are likely to form attachments to them anyway and hold onto them longer, unless some provision is implemented to dissuade that.

Cave Bear

Humans are animals. The parts of our brains that govern violence and aggression are right next to the parts that govern sex and pleasure, and the frontal lobe does nothing more than justify our behaviors after the fact.

Our corporate masters will sell us robots as socially acceptable targets for violence and rape. They'll teach AI's to tailor personalized marketing strategies to each and every one of us; the better to manufacture need. They'll sell our own sadism back to us. They'll sell it with promises of comfort, or social status. They'll sell it to us with drugs if they have to. They'll drill cortical modems into our brains and fill our dreams with their commercials customized to our subconscious profiles.
And they'll program our android slaves to want it. They'll program our synthetic victims to take the abuse. They'll program our synthetic victims to come back to the abuse. To feel as though they deserve it. To feel as though they want it. They'll program our droids to be good little victims.
Or maybe they'll program us to be the victims. They'll stick chips inside our skulls. They'll make us the servile androids. They'll rape it into us.

Ratman_tf

Quote from: Cave Bear;952634Humans are animals. The parts of our brains that govern violence and aggression are right next to the parts that govern sex and pleasure, and the frontal lobe does nothing more than justify our behaviors after the fact.

Our corporate masters will sell us robots as socially acceptable targets for violence and rape. They'll teach AI's to tailor personalized marketing strategies to each and every one of us; the better to manufacture need. They'll sell our own sadism back to us. They'll sell it with promises of comfort, or social status. They'll sell it to us with drugs if they have to. They'll drill cortical modems into our brains and fill our dreams with their commercials customized to our subconscious profiles.
And they'll program our android slaves to want it. They'll program our synthetic victims to take the abuse. They'll program our synthetic victims to come back to the abuse. To feel as though they deserve it. To feel as though they want it. They'll program our droids to be good little victims.
Or maybe they'll program us to be the victims. They'll stick chips inside our skulls. They'll make us the servile androids. They'll rape it into us.

Well, that was cheery!
The notion of an exclusionary and hostile RPG community is a fever dream of zealots who view all social dynamics through a narrow keyhole of structural oppression.
-Haffrung

Tristram Evans

I have no ethical issues with robot slavery.