1. Dnaiel

    Dnaiel Senior Member

    Joined:
    Oct 14, 2016
    Messages:
    504
    Likes Received:
    325

    How would you tell an Artificial Intelligence why people matter?

    Discussion in 'The Lounge' started by Dnaiel, Feb 15, 2017.

    I've been wrangling with this for a few years and I can't figure out how an AI will ever give a hoot about us. Even a fully formed strong AI. Just how can we make a machine care about anything? I think that to care about something requires a value system that overlaps with emotions. If it doesn't have any of that, I don't see how it will care about anything, including itself.

    So, an AI that doesn't care about anything shouldn't react to humans as a threat, I suppose. Yet, if it's a strong AI, then how is it supposed to act on its own? Where will its impetus for anything come from? I can only imagine this twilight area between automation and human direction.

    ETA
    I'm not writing a book for this. I'm starting a legit discussion, and so I put this in the Lounge.
     
    Last edited: Feb 15, 2017
  2. ChickenFreak

    ChickenFreak Contributor Contributor

    Joined:
    Mar 9, 2010
    Messages:
    15,262
    Likes Received:
    13,084
    You're assuming that it doesn't have emotions. Why not give it emotions?
     
  3. big soft moose

    big soft moose An Admoostrator Admin Staff Supporter Contributor Community Volunteer

    Joined:
    Aug 1, 2016
    Messages:
    22,616
    Likes Received:
    25,918
    Location:
    East devon/somerset border
    In "The Moon Is a Harsh Mistress" Heinlein had the self-aware computer "Mike" motivated to want friends out of loneliness... being intelligent and self-aware but not recognised by those around him meant that when the MMC Manny does recognise his self-awareness, Mike views him as a friend
     
    Iain Aschendale likes this.
  4. Homer Potvin

    Homer Potvin A tombstone hand and a graveyard mind Staff Supporter Contributor

    Joined:
    Jan 8, 2017
    Messages:
    12,245
    Likes Received:
    19,874
    Location:
    Rhode Island
    Caring about anything implies a capacity for emotion. And if a machine has emotions they're in the same boat as humans. Why do humans care about preserving their own life and the lives of others? I don't know the answer to that but the same logic would apply to machines. If you're talking about programming and not really "caring" you can theoretically program a machine to believe anything.
     
    rktho likes this.
  5. Simpson17866

    Simpson17866 Contributor Contributor

    Joined:
    Aug 23, 2013
    Messages:
    3,406
    Likes Received:
    2,931
    The scientific term for what you're looking for is "Friendly AI" (very scientific term ;) ), and the main approach being discussed so far is that the AI would be programmed to study human nature first, then decide what humans need the most, rather than humans trying to program everything into the AI ahead of time.

    Fun fact: artificial intelligence has been a real thing for years ;) How do high-speed stock trading programs "care" about finding the best stocks to buy and sell, or the best time to do that?
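    To make that concrete: a trading program's entire "value system" is an objective function it maximizes. A hypothetical sketch (the symbols and numbers here are all invented):

```python
# Hypothetical sketch: the program "cares" about the best stock only in the
# sense that it ranks candidates by a number -- no emotion anywhere.

prices = {"AAA": 10.0, "BBB": 25.0, "CCC": 7.5}
predicted = {"AAA": 12.0, "BBB": 24.0, "CCC": 8.0}

def expected_return(symbol):
    # fractional gain the model predicts for buying this symbol now
    return (predicted[symbol] - prices[symbol]) / prices[symbol]

# the whole "decision": pick whatever scores highest
best = max(prices, key=expected_return)
print(best)  # -> AAA
```

    Swap the scoring function and the same machinery "cares" about something completely different, which is rather the point.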
     
  6. Dnaiel

    Dnaiel Senior Member

    Joined:
    Oct 14, 2016
    Messages:
    504
    Likes Received:
    325
    How? As far as I can tell, emotions are a biological process.
     
    Simpson17866 likes this.
  7. Dnaiel

    Dnaiel Senior Member

    Joined:
    Oct 14, 2016
    Messages:
    504
    Likes Received:
    325
    No, I'm asking how to explain something to an AI, be it friendly or malignant.
     
  8. Simpson17866

    Simpson17866 Contributor Contributor

    Joined:
    Aug 23, 2013
    Messages:
    3,406
    Likes Received:
    2,931
    Ahhh. That part I'm not sure about.
     
  9. big soft moose

    big soft moose An Admoostrator Admin Staff Supporter Contributor Community Volunteer

    Joined:
    Aug 1, 2016
    Messages:
    22,616
    Likes Received:
    25,918
    Location:
    East devon/somerset border
    Emotions and empathy could be a function of high intelligence and self-awareness - it's possible to overthink these things. If you want your AI to be empathetic, just make it so, and give a plausible-sounding explanation, e.g. "scientists in the late 2020s discovered that emotions are a function of having 10 to the x neural pathways."
     
  10. Dnaiel

    Dnaiel Senior Member

    Joined:
    Oct 14, 2016
    Messages:
    504
    Likes Received:
    325
    Hmm. I think an amoeba lacks emotions completely, but exhibits survival behavior. So I don't think emotions are necessary to try to preserve one's own life.
     
    Rosacrvx likes this.
  11. Dnaiel

    Dnaiel Senior Member

    Joined:
    Oct 14, 2016
    Messages:
    504
    Likes Received:
    325
    No no. I'm not writing a book. I'm in the Lounge. Sorry guys.
     
  12. Homer Potvin

    Homer Potvin A tombstone hand and a graveyard mind Staff Supporter Contributor

    Joined:
    Jan 8, 2017
    Messages:
    12,245
    Likes Received:
    19,874
    Location:
    Rhode Island
    I'd yell at it and bang it around until it understood or broke... probably not the best plan but it has worked for me before.
     
    Dnaiel likes this.
  13. big soft moose

    big soft moose An Admoostrator Admin Staff Supporter Contributor Community Volunteer

    Joined:
    Aug 1, 2016
    Messages:
    22,616
    Likes Received:
    25,918
    Location:
    East devon/somerset border
    An amoeba probably isn't self-aware though - I'd suspect emotions come in with self-awareness...
     
  14. Dnaiel

    Dnaiel Senior Member

    Joined:
    Oct 14, 2016
    Messages:
    504
    Likes Received:
    325
    Sure. But I think the various aspects in this regard can be compartmentalized. The point is that an organism doesn't need emotion to have a survival instinct. Thus, why can't a machine have a survival instinct without emotion?
     
  15. ChickenFreak

    ChickenFreak Contributor Contributor

    Joined:
    Mar 9, 2010
    Messages:
    15,262
    Likes Received:
    13,084
    Yeah, but thought is also a biological process, and we're trying to simulate it with AI. So I don't see the bright line between the two that I think you see.

    And I don't agree that an amoeba doesn't have emotions. It doesn't have complex cognitive emotions like, say, nostalgia, but it has responses to stimuli, and I would call that amoeba-class emotions. I would assume that an amoeba has the equivalent of "feels bad--run!" and "feels good--approach!" A rapid Google tells me that amoebas respond to food, temperature, light, etc. I would call those responses emotional, because I would call almost all behavior emotional.
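    That "feels bad--run! / feels good--approach!" idea is basically a lookup table. A hypothetical sketch of amoeba-class responses (all the stimuli names are made up):

```python
# Hypothetical "amoeba-class emotions": pure stimulus -> response rules,
# with no cognition involved anywhere.
RESPONSES = {
    "food": "approach",
    "warmth": "approach",
    "bright light": "retreat",
    "acid": "retreat",
}

def react(stimulus):
    # unfamiliar stimuli get a neutral default
    return RESPONSES.get(stimulus, "drift")

print(react("acid"))  # -> retreat
```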

    (I've mentioned the study where people who had brain damage that destroyed their ability to feel emotions could not make ANY decisions, not even decisions about which color pen to use to fill out a form. This is part of why I think that almost all behavior is driven by emotion.)

    So a thing, like an amoeba, responds to a given stimulus in a given way, for some reason--something about it causes that response. If that response results in its death, then that behavior is stamped out. If that response results in it prospering and producing more of itself, that behavior is reflected in more of those things.

    So I could imagine that to program emotions into an AI, you might have to simulate some sort of evolution--add a random element that causes the initial behavior, add a mechanism to reward or punish that behavior, and add a mechanism to strengthen the likelihood of the reinforced behaviors.

    Now, I don't have the faintest idea how you write any of that, but that's how I would imagine the general strategy.

    Now, that's missing a big piece. Because we have two reinforcement mechanisms, as I understand it--we have "feels good" and we have "supports survival". Entity does X because Entity's brain produces chemicals that make doing X feel good. If X causes Entity to die, then the Entities that feel good doing X, go away. If X causes Entity to thrive, then Entities that feel good doing X become dominant.

    As I understand it, that's why humans love sugar. In evolutionary terms, things that have sugar tend to be fruits that contain a lot of essential nutrients. So sugar makes us feel good, and sugar makes us healthier. Except, of course, now sugar still makes us feel good, but sugar makes us sicker.

    So to make the model work, you probably need

    - Random behavioral element.
    - A "feels-good" substitute.
    - A "reinforces survival" substitute.
    - A mechanism to destroy what feels good but doesn't reinforce survival, and empower what feels good and does reinforce survival.

    I still don't have the faintest idea how you code that.
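    For what it's worth, the four ingredients in that list can be faked in a few lines. This is only a toy sketch (every name and number is made up), but it shows the shape of the loop: random behavior, a "feels-good" score, a survival check, and reinforcement of what passes both.

```python
import random

random.seed(0)  # reproducible toy run

# Each "agent" is just a number standing in for a behavior. "feels_good" is
# the internal reward, "survives" is the external check, and survivors that
# also feel good get copied forward with small random mutations.

def feels_good(policy):
    # stand-in "feels good" score: this toy world rewards higher values
    return policy

def survives(policy):
    # stand-in survival test: extreme behaviors get stamped out
    return 0.2 < policy < 0.9

def evolve(population, generations=50):
    for _ in range(generations):
        # destroy what doesn't reinforce survival...
        survivors = [p for p in population if survives(p)]
        # ...and empower what feels good among the rest
        survivors.sort(key=feels_good, reverse=True)
        parents = survivors[: max(1, len(survivors) // 2)] or [random.random()]
        # refill the population with slightly mutated copies of the parents
        population = [
            min(1.0, max(0.0, random.choice(parents) + random.gauss(0, 0.05)))
            for _ in range(len(population))
        ]
    return population

population = [random.random() for _ in range(30)]
final = evolve(population)
```

    After a few dozen generations the population clusters near the behaviors that score well on both tests. Whether that deserves the word "emotion" is exactly the question, of course.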

    Also, since we don't care that much about the AI's survival, we'd probably reward something else, like how helpful the AI is to us. Does that work? It might; a lot of our food animals evolved the way they did because their survival was enhanced by us protecting them--in order to eat them.
     
  16. Selbbin

    Selbbin The Moderating Cat Staff Contributor Contest Winner 2023

    Joined:
    Oct 16, 2012
    Messages:
    5,160
    Likes Received:
    4,244
    Location:
    Australia
    Well, people don't matter. I mean, once we're all gone, which is inevitable, it won't have any impact on anything.

    Animals (generally, there are exceptions) don't care about anything but their own, and their family's, survival. As would AI, likely. It's nature at work. So it would certainly react to a perceived, if not even legitimate, threat.
     
  17. Selbbin

    Selbbin The Moderating Cat Staff Contributor Contest Winner 2023

    Joined:
    Oct 16, 2012
    Messages:
    5,160
    Likes Received:
    4,244
    Location:
    Australia
    Not at all. Fear is an emotion, and plenty of animals that we could agree would not have self-awareness react to fear. Some would argue it's the most basic emotion, the purpose of which is to ensure survival.
     
  18. Dnaiel

    Dnaiel Senior Member

    Joined:
    Oct 14, 2016
    Messages:
    504
    Likes Received:
    325
    You can chop off a limb and subject it to the same stimulus, such as an electrical shock. If kept from rapid degeneration, it will exhibit the same avoidance behavior. You do this whenever you unknowingly touch a hot surface, like a red-hot stove top: the reflex arc makes your muscles contract away from the damage before the signal can even reach your brain, before your brain can become alerted to it.

    I very, very much doubt there is any basis for this other than speculation.

    So do severed body parts and organs. No brain required.

    Evolutionary computing. This is one of the most interesting fields of AI to me. But it's a huge stretch to connect that to emotions.
     
  19. Dnaiel

    Dnaiel Senior Member

    Joined:
    Oct 14, 2016
    Messages:
    504
    Likes Received:
    325
    I would not agree with that at all. Please explain.
     
  20. Selbbin

    Selbbin The Moderating Cat Staff Contributor Contest Winner 2023

    Joined:
    Oct 16, 2012
    Messages:
    5,160
    Likes Received:
    4,244
    Location:
    Australia
    We're all assuming AI will be electrical and circuitry based, when future science may be able to replicate complex thought and awareness the same way that nature has, with cells and chemicals.
     
  21. Selbbin

    Selbbin The Moderating Cat Staff Contributor Contest Winner 2023

    Joined:
    Oct 16, 2012
    Messages:
    5,160
    Likes Received:
    4,244
    Location:
    Australia
    You don't think fear is an emotion? I'm not sure how to explain that it is, in the same way I'm not sure how to explain love as an emotion.

    Emotions are the most basic forms of 'thought' beyond instinct, and most animals react only to emotions and instincts.
     
  22. Dnaiel

    Dnaiel Senior Member

    Joined:
    Oct 14, 2016
    Messages:
    504
    Likes Received:
    325
    No, this part: "animals that we could agree would not have self-awareness react to fear"
     
  23. Selbbin

    Selbbin The Moderating Cat Staff Contributor Contest Winner 2023

    Joined:
    Oct 16, 2012
    Messages:
    5,160
    Likes Received:
    4,244
    Location:
    Australia
    Well, for example, a spider reacts to fear. Ants do too. Cows, to step it up a fair bit. None of these are intelligent, or even aware, although cows might be moderately aware. But even using cows, connecting self awareness to emotions is a bit off base.
     
    Rosacrvx likes this.
  24. ChickenFreak

    ChickenFreak Contributor Contributor

    Joined:
    Mar 9, 2010
    Messages:
    15,262
    Likes Received:
    13,084
    The fact that you think that this counters my argument means that I have failed to sufficiently explain my argument. It supports my argument. I'll see if I can come up with a more complete explanation later.

    Not speculation--definition. I am including reflexes in the "emotions" bucket. You seem to be assuming that emotions need to be tied to cognition. I don't.

    Right. Still supporting my point. :)
     
  25. Dnaiel

    Dnaiel Senior Member

    Joined:
    Oct 14, 2016
    Messages:
    504
    Likes Received:
    325
    I disagree with that, too. The first indicator of awareness is a periodic absence of it, such as during sleep. Each of those animals sleeps. And I can give you a bunch of research that shows intelligence in animals.
     
