1. ProcrastinatingDreamer

    ProcrastinatingDreamer Member

    Joined:
    Aug 18, 2015
    Messages:
    24
    Likes Received:
    5

    A plot hole in my backstory

    Discussion in 'Plot Development' started by ProcrastinatingDreamer, Feb 27, 2017.

    For the past several months, I've been slowly developing a science fiction story where humans assimilate into an android-ruled society by amputating one or more of their limbs and replacing them with a robotic prosthetic.

    The main story itself is focused on how the poor survive in this kind of world, and while I think that story is developing nicely, the backstory for how this society came to be is unfortunately flawed (and kind of generic, if I'm being honest).

    Basically, what I have is a timeline where robotics developed far earlier in history, which unsurprisingly leads to a robot uprising far earlier in history.

    Now, the humans in this world didn't build functioning AI without taking free will into consideration. As such, androids were equipped with a will-inhibiting disc, providing humanity with a high-functioning servant that didn't ask why its only purpose in life was to open doors and take coats.

    But due to a malfunction (or perhaps human tampering; I'm still not sure of the reason), the discs ceased to work and the robots turned on the human race. You can guess what followed: humans go to war trying to defend themselves but end up losing to the more technologically advanced androids. With no alternative besides extinction, mankind surrenders.

    Aside from the cyborg assimilation I previously mentioned, the robots are merciful and allow the remaining humans to live.

    Not as pets or batteries, mind you, but as normal citizens.

    So the biggest plot hole I found with this is the robots showing mercy.

    What would a highly efficient being need with a human? What reason would they see for allowing humanity to survive instead of just eradicating what's left of us?

    With the main story, my stance is that humans can't exist within the new society and end up needing to develop their own economy separate from the androids. That's why I feel a reasonable explanation is necessary.

    Also, how would humans in this world develop AI sooner than we did?

    (I'm sorry, I realize this post is ridiculously long. Thank you if you managed to read the whole thing.)
     
  2. ChickenFreak

    ChickenFreak Contributor

    Joined:
    Mar 9, 2010
    Messages:
    15,262
    Likes Received:
    13,084
    To me, that ties tightly to the question of why the robots "turned on" the human race. Is that necessary? Or is it possible that the robots just showed independent action and the human race decided to destroy them because they feared that the robots would turn on them, and the robot side of the war was robotic self defense?

    If that's the answer, then the robots allowing the remaining humans to live isn't a change of policy. Then you can have milder reasons for the humans to be useful--for example, maybe there are human skills and abilities that the robots haven't duplicated yet, so they find the humans useful?
     
  3. izzybot

    izzybot Contributor

    Joined:
    Jun 3, 2015
    Messages:
    2,419
    Likes Received:
    3,884
    Location:
    SC, USA
    Why can't they be compassionate? Maybe they could recognize that not all of the humans screwed them over; future generations of humans would be innocent and even the shitty people aren't completely bad. Humans don't have to be 'useful' if the AIs recognize free existence as a universal right - they just need to be contained and curbed a little, to keep them from repeating their past mistakes (humans are good at that). It'd potentially make the AIs sort of a menacing but ostensibly benevolent 'ruling class', and I don't know if that fits with your setting, but it's what I'm thinkin'.
     
  4. ProcrastinatingDreamer

    ProcrastinatingDreamer Member

    Joined:
    Aug 18, 2015
    Messages:
    24
    Likes Received:
    5
    A very valid point, and I think it would actually tie in well with my setting, as I tend to imagine these robots behaving quite similarly to humans.

    If they have a culture with their own brand of arts, language, and even games (which they do), then I suppose empathy for humanity isn't off the table.
     
    Simpson17866 likes this.
  5. ProcrastinatingDreamer

    ProcrastinatingDreamer Member

    Joined:
    Aug 18, 2015
    Messages:
    24
    Likes Received:
    5
    An interesting perspective. I admit that technology turning on its master is done to death at this point, so giving the robots reasons besides "they enslaved us" is probably a good idea.

    I've also wondered about humans finding niches in this society. Maybe jobs that would damage electronics, such as marine work?
     
    Simpson17866 likes this.
  6. J.E. Kirkland

    J.E. Kirkland Member

    Joined:
    Feb 23, 2017
    Messages:
    31
    Likes Received:
    6
    The first thing that comes to mind for a plausible explanation would be that humans created them. The AI may recognize that if the disc malfunctioned, other parts could too. I see it as the AI not knowing whether they'll have the capability to make their own repairs in the future. Another thing I'm wondering about: would the humans continue their work with robots, or would that cease?

    I'm new to science fiction (movies and books alike) so I hope this is helpful.
     
  7. EFF_FireFly

    EFF_FireFly New Member

    Joined:
    Mar 23, 2017
    Messages:
    14
    Likes Received:
    3
    Location:
    Eastern Time Zone, United States
    So, you could even still have a creepy overseer robot culture without any war at all. Consider the fact that your robots or AI are not organic at all. Unless you place some sort of artificial lifespan on them, they would never age out like humans do. I think it would be an interesting plot point to have robots slowly take over society, by being slow-moving and persuasive as they convince humanity to curb its own behavior. Like: 2017 - "Robot-driven cars are an infringement on my rights! Grr." 3017 - "OMG, what human drives? Robots are so much better qualified to operate machinery like cars. Vehicular manslaughter is a one-in-a-million type of death!" blah blah blah.

    Maybe the problem with mercy is a matter of perspective. To a human who realizes they don't have free will, the overseer is awful, but the machine thinks the human is well taken care of. I hope that helps.
     
