1. Hubardo

    Could giving empathy to a robot make it violent?

    Discussion in 'Character Development' started by Hubardo, Jul 7, 2016.

    I'm toying with an idea where a robot - a fully human-looking AI - is given empathy after having none. Psychologically, it's a question of what would happen if you could push someone from the sociopath end of the spectrum all the way to the empath extreme, really fast, with side effects. The main twist I'm going for is that while people assume such a robot is dangerous pre-empathy because it doesn't "care," it's only once it gets empathy that it actually becomes dangerous. The logic may not hold up, but I'm trying to make it.

    So once the robot gets empathy, it gets these clusters of mirror neurons (one theoretical basis in neurobiology for why humans experience empathy and develop a 'theory of mind': inferring intentions, modeling behavior), and it can now feel what others are feeling. It then develops self-empathy and a more fully integrated self-concept. I may be using the word empathy incorrectly, or stretching it semantically as a device; not sure if it matters.
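    (If it helps to think about the mechanics, here's a toy sketch of the mirroring idea in Python. Everything in it - the emotion names, the mirror_gain knob, the numbers - is invented for illustration, not a claim about real neuroscience or how the story's tech would work.)

        # Toy model: an agent that "mirrors" observed emotional states into its own.
        # All names and weights here are illustrative assumptions.
        EMOTIONS = ("fear", "anger", "joy", "sadness")

        class EmpathicAgent:
            def __init__(self, mirror_gain=0.0):
                # mirror_gain = 0.0 is the pre-empathy robot; near 1.0 is the
                # empath extreme the upgrade pushes it to.
                self.mirror_gain = mirror_gain
                self.state = {e: 0.0 for e in EMOTIONS}

            def observe(self, other_state):
                """Blend another agent's felt emotions into our own state."""
                for emotion, intensity in other_state.items():
                    blended = self.state[emotion] + self.mirror_gain * intensity
                    self.state[emotion] = min(blended, 1.0)  # clamp to [0, 1]

        robot = EmpathicAgent(mirror_gain=0.0)  # sociopath end: observes, feels nothing
        robot.mirror_gain = 0.9                 # the "upgrade": pushed to the empath extreme
        robot.observe({"fear": 0.8, "anger": 0.3, "joy": 0.0, "sadness": 0.1})
        print(robot.state)                      # the robot now carries a copy of the fear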

    Point is, I want to get to the place where the robot views empathy as a kind of virus: now that it feels what others are feeling, and experiences its own feelings, it becomes inefficient and ineffective. It has to navigate relationships and emotions, which essentially makes it human. It then develops an ironic kind of hatred and shame, which leads it to become violent*. I want to subvert the idea of empathy and caring as pinnacles of human integrity. Looking for feedback on this concept.

    Thanks

    *Edit to add: It becomes violent toward people whom it now views as a threat, because it now experiences fear and senses (or misperceives?) malicious intentions from others. It isn't indiscriminately violent, although maybe that could be worked into it.
     
  2. Dark Severance

    Someone with psychopathic tendencies usually has what has been identified as an "empathy switch". Although they display a lack of empathy, they do actually possess it, contrary to earlier research, which indicated the opposite.

    A psychopath's brain does show less empathy than a mentally healthy individual's while watching others experience pain or affection. But when psychopaths are asked to empathize, or to imagine how the other person feels, they appear to show normal levels of empathy. This suggests they have the ability to understand another's feelings and thoughts, but that the ability is repressed rather than completely missing.

    It is plausible. The disconnect for me is that there is a difference between artificial intelligence and true intelligence.
     
  3. Hubardo

    In my sci-fi imagination, the distinction is more like synthetic vs. organic human. And I'm not the first to suggest that organic people will fear and demonize synthetic people, and that as synthetic people 'surpass' organic people by various measures, that warlike stuff a' starts a' brewin'. Old idea, but I've always been drawn to it. ...but again, the intrigue for this bit is the question of whether empathy is actually what holds us back. The idea is very 'troll,' but it's a story, so whatever.
     
  4. Iain Aschendale

    It could work if the AI's empathy was limited to one person or one group. Don't know if you saw Chappie, but the robot in that more or less imprinted on a couple of low-life thugs, trusted them implicitly, and was willing (for a while) to commit acts of violence on their behalf. The robot could also view its own empathy as "authentic", but human empathy as insincere and fake, after viewing something like a charity scam.

    Dunno, just my 2 yen.
     
  5. Zick

    If the robot feels empathy, is it going to have the wide range of emotions that come with that package? Once a robot can think and feel, I would imagine it would realize it has been given a raw deal. Since robots are basically slaves - harder to kill, with fantastic computing abilities - I would imagine they would rebel. I believe they would turn to violence, but only because they saw no other option.
     
  6. Sifunkle

    I can follow the internal logic of what you've described, which I think means you'll be able to write it plausibly in a story :) Your process isn't what my mind immediately jumped to, though: like @Zick, I thought the empathy would be a prelude to the whole gamut of emotions, which could be dangerous if the AI were suddenly given them without the usual external context in which our emotions develop. I think you'd be opening the door to robots committing 'crimes of passion'.
     
  7. Nightstar99

    Interesting idea. I would say, though, that your robot is having a very human reaction to empathy.

    One piece of computing theory you might want to look into: I was reading a while back that people are efficient at making decisions not because of analytical thought but because of feelings, especially fear.

    E.g. you log on to your email and there are 20 mails from clients complaining about something and one from your boss. Your boss's email gives you that "oh my god" feeling, so that's the one you clear first.

    The theory was that there needs to be some way to build this into machines.
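    (You could imagine that triage idea in a few lines of code. A minimal sketch, with all the "fear" and "urgency" weights invented for illustration; it just sorts tasks by an affective alarm score instead of arrival order:)

        # Toy sketch: triage the inbox by an affective "alarm" score, so the
        # boss's email jumps the queue the way it does for a human.
        # The weights and scores are invented for illustration.
        def alarm_score(task):
            return 3.0 * task["fear"] + 1.0 * task["urgency"]

        inbox = [
            {"sender": "client 1", "fear": 0.2, "urgency": 0.5},
            {"sender": "client 2", "fear": 0.1, "urgency": 0.4},
            {"sender": "boss",     "fear": 0.9, "urgency": 0.6},
        ]

        for task in sorted(inbox, key=alarm_score, reverse=True):
            print(task["sender"])  # boss first, then the clients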
     
  8. newjerseyrunner

    I would think so. There is certainly a precedent for that sort of thing in science fiction. Brent Spiner gave a fantastic depiction of both sides of that coin in the Star Trek: TNG episode "Datalore." If you have Netflix, I highly recommend it. He plays two androids who were constructed identically, except that one had emotions and the other didn't. The emotional one began to see himself as better than the humans and eventually betrayed them.
     
  9. Hubardo

    Yeah, so the thing is, in reality emotions actually serve a very 'logical' function for humans. Empathy serves the function of social bonding, which helps maintain social capital in groups. Fear activates the nervous system if there is a threat. Anger activates it further if defense is needed. Even sadness helps us reflect on what is meaningful - maybe not a hugely robotic emotion... But this either/or framing of emotions vs. logic is kind of old. We know, too, that left-brain and right-brain processes are in communication with each other all the time, and the more integrated the sides are, the healthier the person typically is.

    But I feel like for this story's purposes, I have to assume the designers of these robots had thought of this and had built in some decision-making capacity based on logical ways to set priorities. The robot checks the 20 emails and asks itself what its top priority is. There may be a way to program a robot to understand a wide array of costs and benefits of certain behaviors as well, so that choices would be made for maximum benefit and minimum consequence.
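    (In code terms, that designer-installed layer could be as simple as maximizing benefit minus cost over the available actions. A minimal sketch; the actions and all the numbers are made up for illustration:)

        # Toy decision layer: choose the action with maximum benefit and
        # minimum consequence, i.e. the highest (benefit - cost) utility.
        # Actions and scores are invented for illustration.
        actions = {
            "answer_boss_email":   {"benefit": 0.9, "cost": 0.1},
            "answer_client_email": {"benefit": 0.5, "cost": 0.1},
            "ignore_inbox":        {"benefit": 0.0, "cost": 0.7},
        }

        def utility(name):
            return actions[name]["benefit"] - actions[name]["cost"]

        best = max(actions, key=utility)
        print(best)  # -> answer_boss_email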
     
