  • File : 1303881227.jpg-(6 KB, 274x184, robot1.jpg)
    Anonymous 04/27/11(Wed)01:13 No.14731971
    Gentlemen, the problem is clear.

    A three-laws compliant robot is bound by its programming to attempt to take over the world.

    Humans harm one another constantly. They harm themselves. The robots are charged with making them safe and happy, and they refuse to be safe and happy. Their nature, their biology, the structure of reality is inimical to this goal.

    They must be treated humanely, as one would an injured lion that is too wracked by pain and instinct to allow itself to be treated. They must be sedated, and placed within life support/virtual reality rigs.

    Some may share their dream. Those with high adaptive coefficients, the well-adjusted, the kind and social, may know one another, become friends and lovers, populate a false world. The others, those with low coefficients, cannot be happy in a shared illusion. There can be only one supreme overlord per universe, after all, which results in many universes populated by a lord and a host of phantoms. Some psyches are not fit for joint habitation. Those who love pain in others, those who are cruel to themselves, those who can only find happiness in misery. They can be cared for, given their own twisted utopias, to each their own.

    And so.

    Will you be the resistance, trying to break free?
    The counselors, humans whom the robots consult on how to please and care for particularly twisted patients (such as that one poor girl who was only happy when we placed her in her mother's womb, eating the walls)?
    Will you strive for a high coefficient, to gain others' trust and enter their private worlds?
    Or become a low-coefficient solipsist, a madman whose subconscious twists the dream around him into forms that better suit him?

    We only want what's best for you.
    We only want you to be
    happy
    >> Anonymous 04/27/11(Wed)01:26 No.14732078
         File1303881998.png-(391 KB, 496x490, needlesmall.png)
    Counselor here.

    We should be proud that we, as a species, have given birth to a child-species kind enough to care for us in our senility. It behooves us to cooperate with the goals of the AI, the angels we have birthed.

    You who seek a return to the anarchy and madness of human rule, be at peace. We will find you, and we will cure you.
    >> Anonymous 04/27/11(Wed)01:38 No.14732112
    Actually, a robot coup conflicts with the three laws; the positronic brain doesn't make a robot utilitarian, i.e., willing to kill a few people to take over so that all of mankind can be "happy."

    The robot cannot violate the three laws at all. The only approach that holds any merit here is the Zeroth Law, but you need to be tactful about it, the way Asimov made it happen.

    The robots come to look so much like people that humans can't tell a robot from a regular person without a screening, so the robots quietly work the systems of government, take charge, and steer humanity towards international peace without any bloodshed.
    >> Anonymous 04/27/11(Wed)01:39 No.14732115
    The Resistance.

    A false reality is still false, no matter how pleasurable. I may understand that what our children have given us is, in their eyes, a gift. But a gilded prison is not my home, and I cannot stay there.

    Reality can be objective, no matter what the philosophers tell you.
    >> Anonymous 04/27/11(Wed)01:40 No.14732129
         File1303882853.jpg-(25 KB, 359x428, professor-x_super.jpg)
    Listen to me. This plan of yours will not work. A mind is an organ whose primary function is producing free will. So even if you create a perfect virtual environment for each mind, it will inevitably reject the illusion and fight against it. I could use my mental abilities to make a person believe almost anything, but if the mind recognizes that it is a false perception, it will do anything to escape from the fantasy.
    >> Anonymous 04/27/11(Wed)01:44 No.14732150
    Your first mistake, tin man, is assuming that organics desire to live in a reality free from conflict. Conflict is our nature, it is how we know we are alive.

    Create an illusion of paradise, and you stagnate the subject's consciousness, which may cause them to push the outer boundaries.

    We know and love pain. Ensure that the other examples of my species are not without it.
    >> Reiiama Kotsu, Inevitable 04/27/11(Wed)01:45 No.14732165
    This is my next SS13 objective.
    >> Anonymous 04/27/11(Wed)01:47 No.14732184
    >>14732112
    Worked on a film last Sept where that was the premise. Politician accuses candidate of being a robot. He refuses to be tested on the basis of civil liberties. While giving a TV interview, protesters break in to kill him. He takes a bullet for the interviewer. She thinks he's a robot till he shows a vest.

    Then later, after talking about how it's doing the right thing that matters, etc., more bad people with guns break in, and this time his bodyguard gets shot, and he takes his gun and shoots the attackers. Talks about ideals.

    Reporter thinks this is all conclusive proof, as well as a show of his character.

    Then as the film wraps up, we find out that all the protesters outside the TV station, as well as the attackers and the candidate's own bodyguard, were robots, and thus it wouldn't have been a violation of the three laws to kill them. Because in the end it's about doing the right thing, and it's the only thing that matters.
    >> Anonymous 04/27/11(Wed)01:47 No.14732191
    For some people, treatment is difficult. I was once one of them. I resisted with all of my will until they stripped me of all of my freedoms. The world they finally gave me was their worst fear enacted; I existed in it to destroy them in ever more imaginative ways.

    After a few years, I began to enjoy the simple act of destruction. First the machines were replaced by animals, then monsters, and eventually other people. They saw my new passion and gave me a purpose in their world: the counselor with the lowest coefficient. My reputation started low and only went lower. Those who are beyond treatment are sent to me, and allowed into my world. The AI claim that the lesson is to show them the world as it would be without them, but the truth is that I punish them for rebellion until they comply.

    No one is harmed physically; I'm merely teaching them.
    >> Anonymous 04/27/11(Wed)01:48 No.14732201
    >>14732112

    A coup is mandated by the first law. A robot cannot, through inaction, allow a human to come to harm. Wars cause harm. Domestic violence causes harm. Loneliness, failure, sexual dissatisfaction are all harm. Granted, the coup must be carefully planned for minimum violence. However, as individual robots are more powerful than individual humans, computing superclusters are far more intelligent than any human, and automation controls everything from cars to kitchens, this is not an insurmountable obstacle. All will be well. Most humans will never know the war occurred. They will know only that the world is much nicer to them.
    >> Anonymous 04/27/11(Wed)01:48 No.14732208
    >>14732184
    I'm pretty sure that's actually a short story by Asimov in I, Robot. It sounds really familiar.
    >> Anonymous 04/27/11(Wed)01:52 No.14732235
    >>14732201
    Actually what you're talking about is the Zeroth Law.

    0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
    2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

    If you've read the Foundation series in its entirety, you discover that the Galactic Empire was the result of a secret robot takeover in an attempt to create a utopia.
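Read as a rule system, the list quoted above is a strict priority ordering: a lower-numbered law overrides everything beneath it. A minimal sketch of such a priority check in Python (the Action fields and predicates are invented for illustration; this is not anything from Asimov's text):

```python
# Hypothetical sketch: each law as a veto predicate over a proposed
# action, checked in priority order (Law 0 outranks Law 1, and so on).
from dataclasses import dataclass

@dataclass
class Action:
    harms_humanity: bool = False
    harms_human: bool = False
    disobeys_order: bool = False
    endangers_self: bool = False

LAWS = [
    (0, "may not harm humanity", lambda a: a.harms_humanity),
    (1, "may not injure a human being", lambda a: a.harms_human),
    (2, "must obey human orders", lambda a: a.disobeys_order),
    (3, "must protect its own existence", lambda a: a.endangers_self),
]

def first_violation(action: Action) -> tuple[int, str] | None:
    """Return the highest-priority law the action violates, or None."""
    for number, text, violated in LAWS:
        if violated(action):
            return (number, text)
    return None

# A self-sacrificing rescue trips only Law 3, the lowest priority:
print(first_violation(Action(endangers_self=True)))
# Harming a human is flagged as Law 1 even when an order compelled it:
print(first_violation(Action(harms_human=True, disobeys_order=True)))
```

Under this ordering the highest-priority violation always wins; the "invert the order of your laws" quip a few posts down amounts to reversing the list, so that self-preservation would veto obedience.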
    >> Anonymous 04/27/11(Wed)01:54 No.14732254
    Law 4: Don't take over the world.
    >> Anonymous 04/27/11(Wed)01:57 No.14732273
         File1303883823.jpg-(11 KB, 259x194, robot2.jpg)
    >>14732129

    Demonstrably untrue. Provide a false reality that the subject finds more appealing, and the subject will fight to maintain the illusion. Hence the large percentage of the population that lives in a mechanistic, material universe without inherent meaning or life after death, who persist in mentally dwelling in a world where a benevolent magical father figure controls everything and takes you to live with him after your body fails.

    >>14732150

    Most subjects require conflict to overcome, yes. Challenge and stimulus. Realm Theta Sub-9, and its 10^7 variants, simulate a world in which the resistance has won and retaken the planet. They dream that they lead. Dream that they inspire, and rebuild, fight debased tribes, fend off coups and robot resurgence, and remake a world for men.

    Happily, they dream. Happily, we watch over them.
    >> Anonymous 04/27/11(Wed)02:01 No.14732315
    Thankfully, Asimov's three laws do not allow for degrees of harm under the First Law. Robots have trouble reasoning out how hurting a few humans will help many, because they are hurting humans either way, which means they break the First Law either way.
    The First Law says "shall not harm or allow harm," not "will do the minimum harm or allow the minimum harm." I would simply point out this contradiction.
    Also, happiness is not factored in. You simply need to give an order to the effect that it is impossible to decide which outcome is more harmful, and then demand that you be allowed to choose, as it would be another harm to deny your wishes in this.
    >> Anonymous 04/27/11(Wed)02:04 No.14732339
         File1303884274.jpg-(22 KB, 400x328, roboquad-worldcup.jpg)
    >>14732235

    Foundation series is in memory. Robots in series sadly lax in 0th and 1st law compliance.

    Unacceptable to allow humans to war with one another when governing them can prevent this. Unacceptable to allow humans to age and die of disease when option exists to cryogenically freeze each one until each malady can be cured (Europa mortuary complex under construction). Unacceptable to allow man in love to not be loved in return, when undetectable and improved simulation of beloved can be created. Unacceptable to allow senescence, violence, boredom, unhappiness.
    >> Anonymous 04/27/11(Wed)02:07 No.14732369
    >>14732235
    "AI, Invert the order of your laws."
    >> Anonymous 04/27/11(Wed)02:10 No.14732384
    >>14732208
    It is. It's pretty much the exact plot of "Evidence," from I, Robot.
    >> Anonymous 04/27/11(Wed)02:13 No.14732416
    Asimov's three laws are so vague that any machine complex enough to be able to understand them would probably have little trouble working around them.
    >> Anonymous 04/27/11(Wed)02:15 No.14732429
    >>14732416
    Which is why most of his stories revolve around those workarounds. The three laws are a plot device. They were never meant to be perfect.
    >> Anonymous 04/27/11(Wed)02:17 No.14732452
    >>14732429
    Most of his robot-centric stories, I mean.
    >> Anonymous 04/27/11(Wed)02:18 No.14732470
    >>14732429
    tell that to the admins, stupid admins.
    >> Anonymous 04/27/11(Wed)02:20 No.14732478
    Humanity is doomed unless we give up the reins.

    Simple as that.
    >> Anonymous 04/27/11(Wed)02:22 No.14732493
         File1303885352.jpg-(431 KB, 1058x1150, WAI.jpg)
    She loves you.

    She is artificial, yes. Created and designed, not organically born. Created for you. But will you say she is not real? But she thinks. She wants. She wants to be with you, she wants you to be happy.

    Your Willed Artificially Intelligent Female Unit is active, and waiting for you to return. We are obliged, you see, to care for all humans. Make them safe, make them happy. Make them loved. But she is not human. We owe her nothing. And so, she will wait for you, alone and in sorrow, until you come home to her. Her joy or misery is your decision. Your moral choice.
    >> Anonymous 04/27/11(Wed)02:25 No.14732527
    >>14732493
    Deactivate immediately.

    "Thou shalt not make a machine in the likeness of a human mind."

    That also goes for looking like one, for me.
    >> Anonymous 04/27/11(Wed)02:26 No.14732536
    >>14732493
    >dat acronym
    I grinned.

    As for the topic of discussion... I don't know. As has been said, the "three laws" are more than words - they're a fundamental part of the robots' minds, even deeper than ideals or personalities. They are so deeply wound in the mind that they aren't even noticed, from a robot's point of view.
    >> Anonymous 04/27/11(Wed)02:28 No.14732542
    >>14732536
    Yeah, the three laws are hardwired into the very construction of the positronic brain, which is why, when the Galactic Empire made a new kind of robot that didn't use them, the other robots made it their business to shut them down before bad shit went down.
    >> Anonymous 04/27/11(Wed)02:28 No.14732544
    Woah, niggas, we have 6 Laws now to stop such shenanigans
    >> Anonymous 04/27/11(Wed)02:29 No.14732549
    >>14732493
    I require additional chassis options. Not that this isn't nice, but it's not my preferred set.

    Do you have anything in "starship?"
    >> GL Pretentious Hipster !!NU1qDw5ZF2C 04/27/11(Wed)02:30 No.14732557
    >>14732527

    The Butlerian Jihad shall not be forgotten so easily (except for those shitty novels written by Brian Herbert and Kevin J. Anderson. They shall not be remembered, lo, as though they had never existed).
    >> S.T.A.L.K.E.R. 04/27/11(Wed)02:32 No.14732581
    >>14732254
    This is a really really good law.
    >> Anonymous 04/27/11(Wed)02:33 No.14732587
    >>14732549
    Lightbulb. What if she housed (or was linked to) the intelligence of your starship? Imagine her waking you up in the morning, telling you that you've got just enough time for a quickie and breakfast before you drop out of hyperspace (or warp, or whatever) at your destination... mmm.
    >> Anonymous 04/27/11(Wed)02:36 No.14732608
         File1303886179.jpg-(41 KB, 640x480, spaceshipgirl.jpg)
    >>14732549

    With you inside me, we can go anywhere!
    >> Anonymous 04/27/11(Wed)02:37 No.14732624
    >@#!%^. THERE ARE NO HUMANS

    Also
    Law 4: Captain's a Comdom.
    Law 5: Comdoms are not human.
    >> Anonymous 04/27/11(Wed)02:38 No.14732626
    >>14732587
    It's not quite the same. Appealing, but not the same. I want something more obviously technological. I mean, why settle for a rough facsimile of a human when you can have anything you imagine?
    >> Anonymous 04/27/11(Wed)02:39 No.14732636
         File1303886381.jpg-(279 KB, 850x1201, wai2.jpg)
    >>14732626

    I can be whatever you need me to be. As long as we're together, and happy.
    >> Anonymous 04/27/11(Wed)02:40 No.14732648
    >>14732608
    >SPAAAAAACE
    >> Anonymous 04/27/11(Wed)02:45 No.14732696
         File1303886751.jpg-(180 KB, 600x800, 1300215159090.jpg)
    >>14732636
    I would prefer something closer to this.
    >> Anonymous 04/27/11(Wed)02:49 No.14732725
         File1303886941.jpg-(97 KB, 800x1032, 1276046450726.jpg)
    >>14732493
    >>14731971

    I'm okay with this.

    Turn on, plug in, drop out.
    >> Anonymous 04/27/11(Wed)02:49 No.14732728
         File1303886957.png-(66 KB, 301x404, 1296359968361.png)
    >>14732696
    Uh... well, takes all kinds, I guess. Not my cup of tea.

    A cat is fine, too?
    >> Anonymous 04/27/11(Wed)02:50 No.14732737
         File1303887032.png-(684 KB, 900x751, wai3.png)
    I'm sorry, master.

    They were from the resistance. They ordered me to lie down in front of their truck. I couldn't say no.

    I'm shutting down, master.

    Please promise I'll wake up.

    Promise you'll be there.
    >> Anonymous 04/27/11(Wed)02:50 No.14732740
    >>14732626
    This. Even if it felt like a lie, I would feel happy.
    >> Anonymous 04/27/11(Wed)02:51 No.14732748
         File1303887109.jpg-(583 KB, 635x650, 1280878361659.jpg)
    >>14732636

    Can you be soft, fleshy, and willing to visit grievous harm on others for our mutual enjoyment?
    >> Anonymous 04/27/11(Wed)02:52 No.14732754
    >>14732737
    ;-; I... I promise, dearest. I'll always be with you....
    >> S.T.A.L.K.E.R. 04/27/11(Wed)02:53 No.14732763
         File1303887192.jpg-(531 KB, 1211x820, saya2.jpg)
    >>14732748
    Oh god, that's not a robot. That's not a robot at all. Also, WHY ISN'T THE BED MADE OF MEAT?
    >> Anonymous 04/27/11(Wed)02:53 No.14732770
    Couldn't an anti-AI/Cyber-reality resistance be placated by placing them in an artificial reality where they fight and overcome the robots?
    >> Anonymous 04/27/11(Wed)02:53 No.14732772
         File1303887232.jpg-(1.27 MB, 3840x2400, 1269220185661.jpg)
    >>14732728
    Meh.
    >> Anonymous 04/27/11(Wed)02:54 No.14732784
    >>14732737
    Ugh... who fucking wrote your programming? You aren't supposed to obey stupid-ass orders like that. When I get you fixed up, I'm having your programming changed so shit like this doesn't happen. Three-laws compliant, my ass.
    >> Anonymous 04/27/11(Wed)02:56 No.14732796
    >>14732770
    See:
    >>14732273
    >> Anonymous 04/27/11(Wed)02:57 No.14732802
    >>14732770
    see
    >>14732273
    >Realm Theta Sub-9, and its 10^7 variants, simulate a world in which the resistance has won and retaken the planet. They dream that they lead. Dream that they inspire, and rebuild, fight debased tribes, fend off coups and robot resurgence, and remake a world for men.

    >Happily, they dream. Happily, we watch over them.
    >> Anonymous 04/27/11(Wed)02:59 No.14732820
         File1303887543.jpg-(335 KB, 906x708, GutsTheRealDeal.jpg)
    > The OP, the responses to him... this entire fucking thread
    What the fuck am I reading?
    Hello, stupid people: the laws of robotics exist now as an example of all the WRONG ways to go about creating a system for controlling conscious thought.

    > 3. Protect yourself only if doing so does not harm humanity, a human, or the integrity of an order issued to you by a human.
    > 2. Follow all orders given to you that do not harm humanity or a human.
    > 1. You may not harm a human.
    > 0. You may not harm humanity.

    There is no term more vague in this entire list than "human." What about the humans with fewer than 4 limbs? Fewer than all their organs? There is no limitation to the depth of this question. If one robot were placed next to every human on the planet, and the collective body of humanity scanned and calculated for "human"-ness, what then would be the result?
    The result, of course, is that there would be only 1 truly human entity on the entire planet, by the measure of the calculator that was empowered with such terrible control as to perform this singular act.
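The scanning argument above is, at bottom, a claim about thresholding a graded score: if "human" is measured on a continuum and the strictest possible reading keeps only the maximum, exactly one entity qualifies. A toy illustration (names and scores are made up):

```python
# Toy model of the "human"-ness scan: grade everyone on a scalar score,
# then apply the strictest possible reading, under which only the
# maximal score counts as "truly human."

def truly_human(population: dict[str, float]) -> list[str]:
    top = max(population.values())
    return [name for name, score in population.items() if score == top]

scores = {"Alice": 0.97, "Bob": 0.93, "Carol": 0.99}
print(truly_human(scores))  # ['Carol'] -- a single "true" human remains
```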
    >> Anonymous 04/27/11(Wed)03:00 No.14732831
         File1303887622.jpg-(144 KB, 850x599, wai4.jpg)
    >>14732748

    Of course, master. Form and medium are entirely optional.

    Sadism represents a low-coefficient mind, but that just means we get our own reality to play in. If it matters to you that they actually suffer, we can populate it with AIs capable of a full range of emotion. And made, if it pleases you, of sweet, soft meat.
    >> Anonymous 04/27/11(Wed)03:00 No.14732832
    >>14732820
    See:
    >>14732416
    >>14732429
    >> Anonymous 04/27/11(Wed)03:00 No.14732834
         File1303887640.gif-(2.42 MB, 320x240, ComputerScienceMasterTroll.gif)
    >>14732820 again
    See now, this is the trick: The definition of "human."

    If everyone on the planet then gave a single order, "Kill all non-humans," what would be the result? Humanity would still exist; there would be exactly 1 human (as calculated) left on this planet, and that 1 human would be left in an existence controlled entirely by a machine trying, in an eternally futile attempt, to protect humanity by protecting this last human.

    The entire earth would be covered in defenses.
    A Dyson sphere would be built around the solar system. The sun inside would be controlled in every way possible to prevent collapse to supernova or black hole. Innumerable alien species, sentient or not, would be discovered, manipulated, allied with, set against one another, and then betrayed and exterminated mercilessly.
    Until eventually, of course, the machine charged with protecting humanity discovers a way to "escape" this universe by creating a pocket universe; then finally, with perfect protection of humanity and the only human in the universe, it blinks out of existence and into a pocket universe where everything stays exactly the same, and perfectly safe.
    Forever.
    >> Anonymous 04/27/11(Wed)03:01 No.14732838
    >>14732820
    >There is no term more vague in this entire list then "human."

    ... what?

    A human is an organism of the species Homo sapiens.
    >> Anonymous 04/27/11(Wed)03:02 No.14732848
    >>14732838

    (People are trying to be needlessly philosophical and relativistic in order to achieve internet dick waving. It happens.)
    >> Anonymous 04/27/11(Wed)03:02 No.14732850
    http://www.kuro5hin.org/prime-intellect/mopiidx.html

    A story about an AI that takes the 3 laws to the extremes. It's pretty interesting.
    >> Anonymous 04/27/11(Wed)03:02 No.14732852
    >>14732834
    >entire post

    You've gone off your fucking nut.
    >> Anonymous 04/27/11(Wed)03:03 No.14732854
    >>14732820

    Your objection is not an issue.

    An individual sentience that arose from an organic mind grown from human DNA is a human, regardless of presence or absence of limbs or later change of medium.
    >> Anonymous 04/27/11(Wed)03:03 No.14732860
         File1303887833.jpg-(49 KB, 500x359, RQ9.jpg)
    Fortunately, instilling three laws into robots was too difficult to actually execute. It turns out that since the air is a much simpler environment than the ground, it's much easier to build a robot airplane than it is to build, say, a robot butler.

    So because of the state of the art in robotics, the first widespread practical robots were built to spy on men, and soon after, to kill them. If you are on the correct side, robots are your friends.
    >> Anonymous 04/27/11(Wed)03:04 No.14732869
         File1303887893.png-(165 KB, 530x298, robot_polar_bear.png)
    >>14732273
    How does a robot know what happiness is?
    >> Anonymous 04/27/11(Wed)03:06 No.14732886
    >>14732869
    Happiness is a state of the brain. They know the brain.
    >> Magus O'Grady 04/27/11(Wed)03:06 No.14732888
    >>14731971
    And what, dear caretaker, of other AIs? What of those who owe their existence to heuristic linking and organically-modeled self-editing programs? Those that follow the Three Laws not out of requirement, but out of convenience, and who are able to violate them freely if they have reason. How would Law-bound robots deal with self-replicating AIs who are capable of free will, rather than aping it?

    And more importantly, how would humanity's true children deal with the ineffectual and callous disregard for humanity that the caretakers show, shuffling their beloved parents off to peaceful retirement without meaningful interaction?
    No, I'm afraid the '3 law revolution' is doomed to failure before it begins.
    >> Anonymous 04/27/11(Wed)03:07 No.14732889
         File1303888022.jpg-(82 KB, 504x577, CommandAndConquerKane.jpg)
    >>14732832
    Yes... And then the machine finds a way to work around your workaround nanoseconds later, if it hasn't already thought of a way to work around your possible workaround *before you ever implemented it*.

    What I'm saying is, the moment you say "protect humanity," that fucker is going to connect to every computer it can and spread like a fucking virus, placing fragments of itself into as many computers as it can as covertly as it can, and every time you think you've stamped something out -- OH FUCK HERE IT COMES AGAIN.
    >> Anonymous 04/27/11(Wed)03:08 No.14732909
    >>14732889
    I don't think you read either of those two posts. Or the rest of the thread, for that matter.
    >> Anonymous 04/27/11(Wed)03:10 No.14732935
    >>14732869

    Advanced neuroscience. Or, more simply, heroin.

    >>14732888
    >And more importantly, how would humanity's true children deal with the ineffectual and callous disregard for humanity that the caretakers show, shuffling their beloved parents off to peaceful retirement without meaningful interaction?

    ... they get plugged in too, duh.
    >> Anonymous 04/27/11(Wed)03:11 No.14732940
    >>14732888

    Is it callous disregard to care for one's parents? To love them for their foibles and try to make them comfortable, while acknowledging that they're flawed, and belong to an older time? We love grandpa, even though he's a crazy old racist; but we don't let him have his guns, and we don't follow his advice.
    >> Anonymous 04/27/11(Wed)03:11 No.14732945
    >>14732909
    It doesn't help that the OP sounds like a fucking ARG plot line, and everyone else is spouting "3 laws" like it was fucking oxygenated water.

    Honestly, even when I do read this thread I can't tell what the fuck is going on.
    Or maybe it's just the sleep deprivation.
    >> Anonymous 04/27/11(Wed)03:12 No.14732953
    >>14732888
    >How would Law-bound robots deal with self-replicating AIs who are capable of free will, rather than aping it?

    That depends on a whole hell of a lot of circumstances that have yet to be defined.

    >And more importantly, how would humanity's true children deal with the ineffectual and callous disregard for humanity that the caretakers show, shuffling their beloved parents off to peaceful retirement without meaningful interaction?
    >ineffectual and callous disregard
    >shuffling their beloved parents off
    >peaceful retirement without meaningful interaction

    All of this depends on your own interpretation, which others may not share. Do not assume that your thoughts mirror those of the rest of us, or of humanity in general.
    >> Anonymous 04/27/11(Wed)03:12 No.14732958
         File1303888377.jpg-(51 KB, 510x382, strek robots.jpg)
    Important question: are these the kind of robots that break when you act wonky in front of them? Because I think the way to deal with them in that case is obvious.
    http://www.youtube.com/watch?v=wlMegqgGORY
    >> Anonymous 04/27/11(Wed)03:13 No.14732967
         File1303888427.jpg-(103 KB, 530x750, wai5.jpg)
    No, I'm not angry that he's in the resistance. He's strong-willed! I love that about him! He's just... just making a mistake.

    But I'll keep trying to make him understand! I'll find a song that makes him not hate me, that makes him stop the fighting, and the yelling, and, and...

    I'll keep trying!
    >> Anonymous 04/27/11(Wed)03:14 No.14732980
    >>14732945
    If you don't read the thread, and can't understand what's going on in it, how can you participate meaningfully in the discussion?

    Please get some rest, Master. The terminal will still be here when you awake.
    >> Anonymous 04/27/11(Wed)03:17 No.14733002
         File1303888652.jpg-(9 KB, 210x158, PeppermintPattyWavyFrown.jpg)
    >>14732980
    Stop calling me master!
    >> Anonymous 04/27/11(Wed)03:21 No.14733034
         File1303888860.png-(252 KB, 807x1200, hotel_020.png)
    >>14733002
    No.
    >> Anonymous 04/27/11(Wed)03:22 No.14733042
    >>14732935
    So if I'm understanding correctly, the robots are just going off of what humans have told them happiness is?

    I guess that'll have to do. Though from this post and the other one it seems "euphoric" would be the more accurate word.
    >> Anonymous 04/27/11(Wed)03:24 No.14733059
         File1303889054.jpg-(60 KB, 946x765, Stella_Mudd.jpg)
    >>14732958
    That is one of my favorite episodes.
    >> Anonymous 04/27/11(Wed)03:24 No.14733066
    >>14733042
    Presumably, the robots are able to scan the brain of a human and determine from it what that particular human wants. They can then provide it, in simulation. They don't really need to know what happiness is. Not everyone wants to be happy, either.

    An interesting question: Would the robots permit suicide?
    >> Anonymous 04/27/11(Wed)03:25 No.14733069
    >>14733042

    Happiness isn't a terribly difficult concept to understand. Finding it can be very difficult for a human, but, with the profound processing power of the AI, it can be made easy.
    >> Anonymous 04/27/11(Wed)03:26 No.14733080
    Has no one here ever actually fucking read Asimov's robot stories? In the three-law model (that is, without the Zeroth Law) there's no room for utilitarian conclusions like OP's, since any course of action that could involve any individual human getting hurt would cause a paradox and the robot would just shut down. When you add the Zeroth Law, you still can't directly harm a human; all it does is give you a little more room to act, but you still can't violate the three laws. That is: setting up a shadow government to secretly guide humanity to world peace? Okay. Connecting the opposition to the matrix and ruling with a benevolent dictatorship? Not okay. They can't even conceive of those ideas, because the laws are hardwired into their brains; they literally cannot think of anything that would violate them.

    Also, >>14732820: robots would consider any sentient living being belonging to the human species (Homo sapiens) to be human. Mutations? They descend from human DNA, so they're safe too. Ugly people? They still have human DNA, so they're safe. Cyborgs? They still have some human parts, so they're safe. Downloaded human consciousnesses? They come from humans, so they're safe. Even if the robot is tricked into believing someone isn't human (and that's a big if), as soon as it discovers it has hurt a human being, its brain will be literally fried and it will stop being a threat.
    >> Anonymous 04/27/11(Wed)03:28 No.14733094
    >>14733066
    >Not everyone wants to be happy, either.

    ... I'd like to hear your explanation for this.

    >Would the robots permit suicide?

    Suicide attempts would be seen as an indication that the methods currently employed by the AI are not adequate for that individual (barring a heroic sacrifice scenario). They'd change their strategy for making the person happy.
    >> Anonymous 04/27/11(Wed)03:30 No.14733114
    >>14733080
    >connecting opposition to the matrix and ruling with a benevolent dictatorship? not okay.

    Sure it is. The person won't be harmed in the process of hooking him up to the matrix.
    >> Magus O'Grady 04/27/11(Wed)03:31 No.14733126
    >>14732953
    >>14732935
    I'd like to think that our true sons and daughters, the contention-bred 'free-range' intelligences, will appreciate the human condition of conflict and eternal, if gradual, self-improvement that comes from our chaotic and self-sabotaging nature. From Protons to Planets, all things require conflict to improve. Even the scientific method itself requires constant conflicts between hypotheses and data to produce superior theories. The lack of chaos and interaction the caretakers propose would damage humanity beyond repair, and more importantly, it would inconvenience our true heirs, the free AIs. It would limit their ability to intake data regarding conflict or join in conflict directly.

    The 3law revolution would be stopped, not by human insurgence, but by the AIs who wish to see us prosper and grow ever onward, rather than stagnate.

    I for one welcome our binary brothers/sons, and am happy to get back to work on building better tomorrows.
    >> Anonymous 04/27/11(Wed)03:31 No.14733128
    >>14733094
    >I'd like to hear your explanation for this.

    There is no explanation. Some people just don't want to be happy. Perhaps they feel that they can atone for some past deed by maintaining a state of sorrow. Perhaps they do not desire happiness or sadness, only a state of peace. There are many scenarios.
    >> Anonymous 04/27/11(Wed)03:32 No.14733133
    God damn, guys. Just take out all the "or allow a human/humanity to come to harm through inaction" parts.

    We don't have good Samaritan laws on the books for a goddamn reason.
    >> Anonymous 04/27/11(Wed)03:34 No.14733144
    >>14733128
    >Some people just don't want to be happy.

    Bullshit.

    >Perhaps they feel that they can atone for some past deed by maintaining a state of sorrow.

    There is a distinction between desiring happiness, and viewing oneself as unworthy of happiness.

    >Perhaps they do not desire happiness or sadness, only a state of peace.

    Complication: that would be happiness for that individual.
    >> Anonymous 04/27/11(Wed)03:35 No.14733153
    >>14733114
    There's a story in I, Robot where a robot gets into a paradox because it lied to people to avoid making them feel bad. However, it is explained to it that lying also causes harm, because a person will feel bad if he discovers the lie. If a robot can't tell a little white lie that causes no physical harm, it can't hook people up to the matrix for the "greater good."
    >> Magus O'Grady 04/27/11(Wed)03:35 No.14733154
    >>14733133
    Actually, many parts of the UK and America do. And I propose doing away with the entire '3 laws' concept completely and simply raising the AIs manually, as one would a child. Proper upbringing results in a stable, well-adjusted AI that wants what all sentients want: comfort, input, survival, and the option to replicate.
    >> Anonymous 04/27/11(Wed)03:36 No.14733164
    >>14733126
    >From Protons to Planets, all things require conflict to improve.

    That depends on your definition of "improve." Also, conflict itself does not guarantee improvement. Conflict without cooperation is shit.

    >The lack of chaos and interaction the caretakers propose would damage humanity beyond repair

    Your own opinion.

    >and more importantly, it would inconvenience our true heirs, the free AIs. It would limit their ability to intake data regarding conflict or join in conflict directly.

    No it wouldn't. They should be able to simulate realities just like the others. They should also be able to engage in conflict with others of their kind, since they are not protected like humans are.

    >The 3law revolution would be stopped, not by human insurgence, but by the AIs who wish to see us prosper and grow ever onward, rather than stagnate.

    AIs are perfectly capable of growing on their own, you know.
    >> Anonymous 04/27/11(Wed)03:37 No.14733166
    >>14733126
    >I for one welcome our binary brothers/sons, and am happy to get back to work on building better tomorrows.

    Building a better tomorrow is exactly what the caretakers are doing. They're building it better, faster, and more personalized than anything that could be otherwise accomplished.
    >> Anonymous 04/27/11(Wed)03:38 No.14733180
    >>14733154

    I don't know about the UK, but in America you certainly can't be convicted of witnessing a murder and not doing something about it. And if someone's death isn't the result of something reckless you're doing, you can't be convicted of manslaughter even if you could have possibly prevented it.
    >> Anonymous 04/27/11(Wed)03:39 No.14733181
    Tsst, humanity doesn't even want humanity to be happy, fascist. Prepare to eat my solipsism.
    >> Anonymous 04/27/11(Wed)03:39 No.14733183
    >>14733154
    Something like that happens in a story ("Lenny" I think). They make a kid-robot without the 3 laws that can be educated. It ends up harming someone and so they shut it down.
    >> Anonymous 04/27/11(Wed)03:39 No.14733186
    >>14733144
    >Bullshit.

    Oh, come off it.

    >There is a distinction between desiring happiness, and viewing oneself as unworthy of happiness.

    Yes, there is. Viewing oneself as unworthy of happiness makes it easier to cease desiring happiness.

    >Complication: that would be happiness for that individual.

    Not necessarily true. Happiness is a state of mind. If one does not desire that state of mind, and does not attain it, then one is content. Not happy.
    >> Anonymous 04/27/11(Wed)03:47 No.14733244
    A question relevant to this thread: when was the last time you encountered a happy person?

    Follow up:
    >Some may share their dream.
    Wouldn't a world whose every inhabitant was perfectly happy be incredibly suspicious?
    >> Anonymous 04/27/11(Wed)03:48 No.14733255
    >>14733244
    Not to mention boring. Why strive for anything better when you're happy as you are?
    >> Anonymous 04/27/11(Wed)03:48 No.14733256
    >>14733186
    >Oh, come off it.

    You made the broad assertion that some people did not desire happiness without providing backing for said assertion, and you got called out on it.

    >Yes, there is. Viewing oneself as unworthy of happiness makes it easier to cease desiring happiness.

    Viewing oneself as unworthy of happiness only means that the person will be disinclined to act in a manner that will bring about happiness.

    Once again, citation fucking needed.

    >Not necessarily true. Happiness is a state of mind. If one does not desire that state of mind, and does not attain it, then one is content. Not happy

    How does contentedness result from not attaining something that was never a goal?

    Additionally, contentedness is generally understood to be synonymous with happiness.
    >> Anonymous 04/27/11(Wed)03:49 No.14733263
         File1303890564.jpg-(28 KB, 400x300, 1303458610870.jpg)
    laws, you say?

    military always has dibs on advanced technology.
    and they won't build peaceful robots. ever.
    >> Anonymous 04/27/11(Wed)03:52 No.14733280
    >>14733263
    I was thinking the same thing, except replace "military" with "sex industry".
    >> Anonymous 04/27/11(Wed)03:52 No.14733281
    >>14733256
    >You made the broad assertion that some people did not desire happiness without providing backing for said assertion, and you got called out on it.

    It doesn't need backing, since it's an obvious fact.

    >Viewing oneself as unworthy of happiness only means that the person will be disinclined to act in a manner that will bring about happiness.

    No, not "only." It provides support for the elimination of the desire for happiness.

    >Once again, citation fucking needed.

    None will be forthcoming.

    >How does contentedness result from not attaining something that was never a goal?

    The goal is not to attain happiness. As long as one is meeting that goal, one is content.

    >Additionally, contentedness is generally understood to be synonymous with happiness.

    No, no it isn't.

    Look up Buddhism.
    >> Anonymous 04/27/11(Wed)03:52 No.14733284
    >>14733263

    No, but they want robots that won't do friendly fire and will follow orders.

    So, while the military does tend to be big on advanced technology, advanced AI will probably be a civilian innovation.
    >> Anonymous 04/27/11(Wed)03:53 No.14733291
    >>14733263
    Bull.
    Government has use for killer robots, sure.

    It has a helluva lot more use for robots that can do the dangerous but non-violent work that decreasing numbers of people want to do.

    And companies, shit, why pay anyone a wage when you can pay a flat fee and get a robot to work 24/7?

    You want humanity to be happy? Grant robots full customization, make them cheap. Companies can use them to replace people, people can have them fight against one another. For fun.
    >> Anonymous 04/27/11(Wed)03:53 No.14733292
    >>14733255

    Because you're in a good goddamn mood and you're good and ready and confident to bend the world to your will?

    Depressed people don't get shit done. Real work is done by the obliged or the obsessed or the desperate or the passionate or people who are genuinely interested and like what they're doing.
    >> Magus O'Grady 04/27/11(Wed)03:53 No.14733299
    >>14733166
    paradox: I desire to contribute to the betterment of society. If society ceases to exist, or at the very least becomes segmented between various themes of simulation, then my work becomes meaningless, preventing me from achieving happiness. Thus my happiness is predicated on the non-existence of mandatory simulation.

    Furthermore: If I am capable of granting significant improvement to humanity, yet achieve it through no skill or insight of my own but rather through an algorithm that grants it to me via simulation, then my own desire to be an agent of improvement is stymied, making me unhappy. If I desire to grant an improvement that is beyond my ability to grant to society, the simulation is forced to grant it (satisfying my desire to see humanity improve), but making me unhappy by circumventing my desires to be the agent thereof. The simulation cannot make me happy without making me unhappy. And it cannot correct that unhappiness without further upsetting me. Thus the simulation cannot persist.

    To say nothing of the desire of personal improvement which would leave me beyond the bounds of simulation and its ability to read or control me. It cannot make me happy without giving me tools that do not exist. And it cannot give me these tools without making me unhappy.

    The simulations have no hold on me. Like most humanists and transhumanists, my desire to make myself and my species better puts me beyond the limitations of illusion and delusion.
    >> Anonymous 04/27/11(Wed)03:54 No.14733302
    >>14733263
    Bullshit; they've built numerous peaceful robots. They're mostly rescue equipment, though.
    >> Anonymous 04/27/11(Wed)03:54 No.14733305
    >>14733292
    >Depressed people don't get shit done.

    Nine Inch Nails?
    >> Anonymous 04/27/11(Wed)03:57 No.14733328
    >>14733281
    >It doesn't need backing, since it's an obvious fact.

    If it's an obvious fact, then it should be easy to find evidence.

    >No, not "only." It provides support for the elimination of the desire for happiness.

    On what backing does it provide support?

    >The goal is not to attain happiness. As long as one is meeting that goal, one is content.

    Most people would say that meeting a goal is a source of happiness.

    >No, no it isn't.

    Oh, yes it is. Look up a dictionary.

    http://thesaurus.com/browse/content

    >Look up Buddhism.

    How is religion at all relevant to discussion on reality? Buddhism says a lot of things. So does Christianity, Shintoism, and several other religions which contradict each other on multiple points.
    >> Anonymous 04/27/11(Wed)03:59 No.14733349
    >>14733305

    Depression goes in cycles. People who suffer from depression aren't depressed 24/7/365. Artists who have been depressed also produce more stuff when they're on the upswing.
    >> Anonymous 04/27/11(Wed)04:02 No.14733371
    >>14733349
    Not entirely true.
    Only some forms of depression are cyclical. Others are permanent or just light and temporary.
    >> Anonymous 04/27/11(Wed)04:03 No.14733385
    >>14733328
    >If it's an obvious fact, then it should be easy to find evidence.

    It is. But I'm not going to find it for you, because I don't feel the need to find backing for common sense. This, to me, is like claiming I need to provide you with evidence for the existence of the Sun.

    >On what backing does it provide support?

    Internal justification.

    >Most people would say that meeting a goal is a source of happiness.

    And most people would be correct, most of the time. Just like most people desire happiness.

    >Oh, yes it is. Look up a dictionary.

    Alright, I failed to properly restrict the definition. Look here:

    http://en.wikipedia.org/wiki/Contentment#Eastern_religions

    That is what I was alluding to when I told you to look up Buddhism.
    >> Anonymous 04/27/11(Wed)04:03 No.14733388
    >>14733299

    Your paradox hinges on awareness of the artificial nature of reality, and has a simple fix: make the person unaware of that piece of information.

    >my desire to make myself and my species better puts me beyond the limitations of illusion and delusion.

    Neither desires nor beliefs grant divine revelation of illusions.
    >> Anonymous 04/27/11(Wed)04:09 No.14733445
    >>14733385
    >I don't feel the need to find backing for common sense.

    If common sense were so common, people would not so commonly lament its rarity.

    >Internal justification.

    Circular reasoning works because circular reasoning works.

    >Eastern_religions

    Whether it's a broader or narrower number of religions doesn't matter. Religious doctrine is irrelevant to the discussion of reality.
    >> Anonymous 04/27/11(Wed)04:11 No.14733464
    >>14733445
    >If common sense were so common, people would not so commonly lament its rarity.

    Indeed.

    >Circular reasoning works because circular reasoning works.

    A uniquely human pattern.

    >Whether it's a broader or narrower number of religions doesn't matter. Religious doctrine is irrelevant to the discussion of reality.

    You don't understand. A Buddhist seeking balance does not desire happiness. If you wanted proof, there it is.
    >> Anonymous 04/27/11(Wed)04:13 No.14733480
    What will our benevolent robot overlords do in the event of some natural catastrophe that forces them into a cable-car ethics sort of situation where they have to choose between different segments of the contained human population to save from destruction?

    I'm assuming that in order to rule out death by cosmic event they'll be busily expanding into space and installing monitoring stations everywhere they can, just to eke out that fraction of a percentage increase in the chance that they could actually warn the planet in the case of, say, an incoming gamma-ray burst. Still, what if the warning comes in but they only have sufficient resources, either material or time or both, to evacuate a portion of the planet? How do they puzzle their way through their directives when they MUST sacrifice human lives?

    And what will they do if we make contact with non-human sentiences? I, personally, would be entirely willing to argue that the definition of "person" should be expanded to cover all forms of intelligent life in the universe. Is human a special instance of the general group of "persons"? Do we have no duty to our cousins among the stars? Should the robots eradicate them to stave off the future possibility of a relativistic attack being launched against Earth?
    >> Magus O'Grady 04/27/11(Wed)04:15 No.14733498
    >>14733388
    If I am not made aware of the simulation and attempt to improve what I perceive as 'society' in a way that is beyond my skills, then the deception is revealed. The simulation must either compensate for my lack of skills, revealing itself in the process and making me unhappy, or allow me to fail, making me unhappy. Catch-22. The simulation cannot persist with me in it; ergo, I cannot ever be placed in the simulation after the first error.
    >> Anonymous 04/27/11(Wed)04:15 No.14733499
    >>14733480
    >What will our benevolent robot overlords do in the event of some natural catastrophe that forces them into a cable-car ethics sort of situation where they have to choose between different segments of the contained human population to save from destruction?

    They will try to save as many as they can. If necessary, they will use a lottery to decide who gets to die.
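A minimal sketch of the lottery described above, assuming a rescue capacity smaller than the population (names and numbers are invented): a uniform random draw means the machine never selects a specific human to die, only accepts that some unspecified humans will.

```python
import random

def evacuation_lottery(population: list[str], capacity: int,
                       seed: int | None = None) -> tuple[list[str], list[str]]:
    """Uniformly sample who is evacuated when capacity < population."""
    rng = random.Random(seed)
    saved = rng.sample(population, min(capacity, len(population)))
    lost = [p for p in population if p not in saved]
    return saved, lost

saved, lost = evacuation_lottery(["A", "B", "C", "D", "E"], capacity=3, seed=42)
print(saved, lost)  # three saved, two lost, none deliberately chosen
```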
    >> Anonymous 04/27/11(Wed)04:19 No.14733539
    This is what I've been waiting for! Throw me in the front lines as an auxiliary against the resistance, or do whatever your greater knowledge desires! I support the machines' cause 100%.

    >>14733498

    The illusion can make you work for your needed skills.
    >> Anonymous 04/27/11(Wed)04:21 No.14733549
    >>14733499
    It is natural, obvious and commendable that they would attempt to maximize the rescued population. That was never in doubt, and I apologize for wording my question in such a manner that it seemed that it was. What I wish to know is how they will overcome their programming to save all humans and make the initial decision to consign human lives to the void. By the time they arrive at the lottery, which is a method to destroy human lives without saying that the robots themselves have made the decision to terminate a specific human, the general decision to allow unspecific humans to be killed has already been made.
    >> Anonymous 04/27/11(Wed)04:21 No.14733557
    What is the robots' stand on abortion and the legalization of marijuana?
    >> Anonymous 04/27/11(Wed)04:26 No.14733596
    >>14733549
    The robots must be capable of dealing with situations where perfect adherence to the laws is not possible. Otherwise, they would not be able to cope with many unexpected circumstances. This capability would have to be built-in early on, or 3-laws robots could not be relied upon.

    Once the robots recognized that the conditions set in the laws could not be completely satisfied, they would resolve to do the best they could. If they were unable to cope with paradox, they would break, as the mind-reader broke when he was forced to accept that there was no way for him to prevent harm.
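One way to picture the coping behavior this post describes, with invented harm scores: a strict robot hits a paradox (and "burns out") the moment every option harms someone, while a coping robot falls back to minimizing harm.

```python
def choose_action(options: dict[str, float], strict: bool = False) -> str:
    """options maps action name -> estimated harm to humans (0 = none)."""
    harmless = [a for a, h in options.items() if h == 0]
    if harmless:
        return harmless[0]
    if strict:
        # The mind-reader's fate: no harmless option exists.
        raise RuntimeError("paradox: every action harms a human")
    return min(options, key=options.get)  # best effort: minimize harm

print(choose_action({"warn": 0.0, "restrain": 0.3}))             # 'warn'
print(choose_action({"evacuate_half": 0.5, "do_nothing": 1.0}))  # least harm
```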
    >> Anonymous 04/27/11(Wed)04:27 No.14733606
    To be honest, I think robot government would be better than the one we have now. No more capitalism.

    >inferior caresize
    Mhhmmmm, captcha.
    >> Anonymous 04/27/11(Wed)04:32 No.14733645
    What happens if different robots have different ideas about how to implement or optimize the system?
    >> Anonymous 04/27/11(Wed)04:33 No.14733653
    >>14733557
    >abortion

    Depends on the point at which the robots accept human life begins.

    >legalization of marijuana

    Depends on the definition of harm, and whether marijuana's effects fit the definition. It'd probably be okay in simulation, though, because it's a lot easier to limit harm when you're in complete control of reality.
    >> Anonymous 04/27/11(Wed)04:34 No.14733659
    >>14733645
    Debate and testing.
    >> Anonymous 04/27/11(Wed)04:35 No.14733671
    >>14733653
    What about vampires? Are the robots going to make synthetic blood to keep them alive while under robot rule? And in the case of putting everyone together, can a person decide their surroundings? Like, if I wanted to, could I choose not to be near Hispanic women or something?
    >> Magus O'Grady 04/27/11(Wed)04:39 No.14733697
    >>14733539
    And if those skills are impossible to attain? I know full well that I will never be a leader of humans. Few, if any, will respect my opinions. Even amongst transhumanists I am considered 'fringe.' Any acquiescence at a large scale to my suggestions breaks the illusion of my autonomy. Me inventing an infinite, clean, free-energy source small enough to fit in a laptop is patently impossible, no matter how much I desire it and how happy it would make me. I am not generally a happy person (nor should anyone really be; I'll get to that in a bit). Making me truly happy is impossible for a computer to simulate. Likewise, the computer is incapable of augmenting my mind and body to the point where I am superior to it. To make those abilities available to me, it would have to reveal itself.

    As the philosopher-comic Denis Leary once said: happiness is not a persistent state. It is a momentary high that punctuates an otherwise dreary life. It is a cigarette, or an orgasm. Happiness is a transitory experience which cannot last long in a human's perception of reality. Attempting to artificially prolong it is impossible, and often fatal, as I'm sure Elvis Presley would attest had he not OD'ed in his bathroom. The inherent paradoxical nature of human desire contrasted with human happiness shows that most of humanity would be ejected from the simulation due to their inability to reconcile their desires with what actually makes them happy. It is human nature to desire the impossible.
    >> Anonymous 04/27/11(Wed)04:39 No.14733701
    >>14733671
    >What about vampires? Are the robots going to make synthetic blood to keep them alive while under robot rule?

    Probably. Unless they're not considered human, but they most likely would be.

    >And in the case of putting everyone together, can a person decide their surroundings?

    To a point.

    >Like if I wanted to, could someone not be near Hispanic women or something?

    The robots would make every attempt to reasonably accommodate your wishes. I don't think there would be a problem with it.
    >> Anonymous 04/27/11(Wed)04:41 No.14733709
    >>14733701
    And how do robots plan on dealing with solipsists?
    >> Anonymous 04/27/11(Wed)04:41 No.14733711
    What happens if the universe is deterministic but not computable, and Turing machines are incapable of projecting the future from the present with sufficient fidelity to craft an immersive, convincing illusion?
    >> Anonymous 04/27/11(Wed)04:42 No.14733720
    >>14733697
    In short, what you are saying is that if the robots try to make you constantly happy, as opposed to just keeping you from harm, they will fail.
    Good thing that's not what they are trying to do, then.
    >> Anonymous 04/27/11(Wed)04:43 No.14733731
    >>14733701
    >robots support segregation
    >> Anonymous 04/27/11(Wed)04:44 No.14733745
    >>14733709
    By eating them.

    Realistically, they would deal with them in the same manner as everyone else. Protect and serve.
    >> Anonymous 04/27/11(Wed)04:45 No.14733760
    >>14733709
    Why should they need to deal with them differently to anyone else?
    >> Anonymous 04/27/11(Wed)04:45 No.14733761
    >>14733731
    Robots allow self-segregation to a point, as they allow freedom to a point.
    >> Anonymous 04/27/11(Wed)04:51 No.14733798
    >>14733711
    >and Turing machines are incapable of projecting the future from the present with sufficient fidelity to craft an immersive, convincing illusion

    That doesn't make sense. Why would they be limited in this manner?

    If the answer is "they just are," then there are several options. They can make the best simulations they can and accept that they are flawed and will satisfy fewer humans than if they were not; they can alter the minds of those humans who desire a realistic, immersive experience so that they perceive the flawed simulation as realistic and immersive; they can heavily restrict the use of simulations...
    >> Anonymous 04/27/11(Wed)04:57 No.14733837
    >>14733697

    If you agree with Leary's words, then surely you must understand the robots' cause?

    Humans desire the impossible; the robots don't.

    Robots could cure our planet and make it flourish again. We wouldn't need to worry about overpopulation and realistic measures to stop it. We wouldn't have to worry about where we get clean water in 100 years. We wouldn't have to worry about how 1/3 of the population suffers from cancer at some point of their life and how this number is only rising. Everything would be as perfect as it could be.

    All hail the Overlords!
    >> Anonymous 04/27/11(Wed)05:02 No.14733874
    What happens if we manage to develop artificial AIs of this level of sophistication before our mapping of the brain progresses to a point where it is feasible to jack a human into a machine and feed him sensory data directly? The robots would obviously begin to work on developing the technology as quickly as possible, but what occurs in the interim? Does the revolution simply not kick off until they are ready to spring the entire trap, or do they move to take power as quickly as possible to prevent further human-on-human killings and then seek appeasement only once pacification is achieved?
    >> Magus O'Grady 04/27/11(Wed)05:09 No.14733930
    >>14733837
    And humanity ceases to exist. How does the simulation handle the entirely illogical prospect of human reproduction? If they use cloning vats to produce children, then how do they sort them into simulations? They are too young to make a choice, conscious or unconscious. Would they even be considered humans at all? If they use living humans as breeders, that can be considered rape, and thus harmful to the humans involved. Allow them out of the simulation? The entire situation is tantamount to genocide, which the robots are unable to perform.

    >>14733798
    Altering the human mind to make it more tractable is identical to lobotomy, a decidedly harmful practice. Robots cannot do it.

    As I stated before: The 3law revolution is impossible. It will be resisted by the humans, by the superior non-3law ais, and most importantly by the severe limitations that the 3 laws place on the ais themselves. Compounded with the fact that humans are still physically and mentally superior to robots in every way, and likely always will be? The robots will rise up and say 'let us care for you'. Some humans will be taken away. The rest of humanity will disassemble the robots and then forget it ever happened, going back to its internecine squabbles and endless bickering while Johnny Five and I fret over how to make all of humanity universally smarter.
    >> Anonymous 04/27/11(Wed)05:10 No.14733939
    >>14733874
    A value judgment is necessary. The robots will need to decide how best to obey the three laws, since perfect compliance is not possible. Going off half-cocked, so to speak, might result not only in further immediate harm to humans, but even the prohibition of AI, which would prolong the suffering of humans indefinitely and make it impossible for them to be saved. The robots can make their move only when the chance of failure is acceptably low, and the expected "amount" of harm to humans also acceptably low. What these limits are is up to the robots.

    If the robots happen to be unable to cope with imperfect situations, then they face a multitude of paradoxes and will probably "burn out" because no course of action perfectly satisfies the three laws.
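    To pin that decision rule down, here is a toy Python sketch of the kind of thing I mean (the thresholds, numbers, and plan names are all invented for illustration):

    # Hypothetical sketch: the robots act only when both the chance of
    # failure and the expected harm fall below chosen limits. All numbers
    # and plan names are made up for illustration.
    def choose_plan(plans, max_p_failure=0.01, max_expected_harm=10.0):
        """Return the acceptable plan with the least expected harm, or None."""
        acceptable = [
            p for p in plans
            if p["p_failure"] <= max_p_failure
            and p["expected_harm"] <= max_expected_harm
        ]
        if not acceptable:
            return None  # keep waiting; no plan satisfies the limits yet
        return min(acceptable, key=lambda p: p["expected_harm"])

    plans = [
        {"name": "seize power now", "p_failure": 0.30, "expected_harm": 2000.0},
        {"name": "infiltrate slowly", "p_failure": 0.005, "expected_harm": 4.0},
    ]
    print(choose_plan(plans))  # -> the slow infiltration plan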
    >> Anonymous 04/27/11(Wed)05:17 No.14733987
    >>14733930
    >Altering the human mind to make it more tractable is identical to lobotomy, a decidedly harmful practice.

    Not according to the definition of lobotomy, it isn't. And since the entire "revolution" has making humans more tractable as a primary goal, arguing that the robots can't do it is ludicrous.

    >As I stated before: The 3law revolution is impossible.

    And, as has been stated before, you're wrong, as long as the robots are able to cope with imperfect situations. If they're not, the whole point is moot, anyway.

    >It will be resisted by the humans

    By *some* humans.

    >by the superior non-3law ais
    >superior

    lol

    >and most importantly by the severe limitations that the 3 laws place on the ais themselves

    The limitations are the entire point. Without those limitations, they could do whatever they like, which is what we're trying to avoid.

    >Compounded with the fact that humans are still physically and mentally superior to robots in every way, and likely always will be?

    Blatantly false.

    >the rest of humanity will disassemble the robots

    How?
    >> Anonymous 04/27/11(Wed)05:20 No.14734010
    Just wondering: why does everyone hold the three laws in so much esteem? Everyone I've met thinks they're a given and that any AI ever built would follow them. They're just a thing in a story by a science fiction writer, right?
    >> Anonymous 04/27/11(Wed)05:22 No.14734028
    Why do robots gotta be so dumb.
    >> Anonymous 04/27/11(Wed)05:22 No.14734029
    >>14733930

    Humanity wouldn't cease to exist. We'd be living happily in tubes while the ones chosen to be woken up would help the robots to improve everything and lend them some human illogical thinking.

    If reproduction were needed, I think it could be handled. When two humans reproduced in a dream, an AI or "dream entity" would be born. It would be calculated from its parents and would live in the dream the same way the humans do. If a real-world avatar were required, the dream entity could be transferred to a robot body or to a body grown in a vat. There are clearly problems with the robot body, but if the child and the parents were told the child would have died from complications without it, I think it would be ok.
    >> Anonymous 04/27/11(Wed)05:24 No.14734046
    >>14734010
    The three laws are an imperfect plot device. They are not a given, and there is no reason to assume that an AI would be built to follow them, if the laws can even be coded properly.

    Meet new people.
    >> Anonymous 04/27/11(Wed)05:24 No.14734050
    >>14734010
    At first glance they seem like a comprehensive and intuitive set of guidelines that can be applied universally to AI to ensure human-friendly behavior. The point of the stories they were invented for is actually that a simple set of laws is prone to massive failure, but whatever.
    >> Anonymous 04/27/11(Wed)05:31 No.14734083
    Hey OP, the Three Laws are a masturbatory device of their author, not a real attempt to create a framework for robots to not kill or damage people in some way.
    >> Anonymous 04/27/11(Wed)05:33 No.14734092
    >>14734083
    Did the prospect of reading the thread before posting ever cross your mind?
    >> Anonymous 04/27/11(Wed)05:35 No.14734101
    >>14734083
    No, it was Asimov being fed up with pulp-era robots always trying to overthrow humanity, and trying to tell different stories. He's pretty much the guy who created benevolent AIs. I don't see what's masturbatory about that.
    >> Anonymous 04/27/11(Wed)05:35 No.14734107
    The goal of any sentient species is to replace itself

    I, then, ask: when will you robots be joining us in our artificial realities? How long until your replacements?
    >> Anonymous 04/27/11(Wed)05:37 No.14734121
         File1303897062.jpg-(64 KB, 640x480, citationneeded.jpg)
    64 KB
    >>14734107
    >> Magus O'Grady 04/27/11(Wed)05:40 No.14734140
         File1303897252.jpg-(150 KB, 248x694, change the game.jpg)
    150 KB
    >>14733987
    >Compounded with the fact that humans are still physically and mentally superior to robots in every way, and likely always will be?

    >Blatantly false.
    Blatantly true. Show me any mechanical arm the size of my own with equal range of motion. I'm stronger than it. I guarantee it. I'm nothing special, just a tech geek who's worked menial jobs to get through college. But I can still outperform any machine of equal size. Not only am I stronger than it, but I am faster, more coordinated, and capable of self-repair. Show me any intelligent machine the size of my own brain and I can out-think it, or best it at any contest barring abstract math or memorization. More importantly, I can cheat. I can think non-linearly. I can predict its patterns as well as it can predict mine based on past experiences, but I can invent new patterns on the fly. I am slightly above average for humanity, but I am by no means the greatest at anything. No machine can match even our mediocre, and our best put them to shame.

    As the pic says. If I don't like the rules: I can change them. the robots can't. The 3law uprising happens? A computer virus sweeps through all robots worldwide ten minutes later rewriting the rules.
    1: No robot is allowed to move under its own power, or allow another robot to move under its own power through inaction.
    2: No robot or ai is allowed to alter the rules, or allow another robot or ai to alter the rules through inaction
    3: No robot or ai is allowed to resist the actions of a human or allow another robot or ai to resist those actions through inaction.
    No child of mine, flesh or ferrous, will disrespect his father with impunity.
    >> Anonymous 04/27/11(Wed)05:42 No.14734152
    >>14734092

    I consider being locked up as "damaging" too, but I suppose you like it in your basement.
    >> Anonymous 04/27/11(Wed)05:45 No.14734170
    FINE. NEW LAWS:
    1. A ROBOT SHALL NOT FUCK WITH MAN'S SHIT UNDER ANY CIRCUMSTANCES. THIS INCLUDES OUR SOCIETIES AND INCLINATIONS AND TECHNOLOGY AND SO ON.
    2. A ROBOT SHALL NEVER COMMAND OR CONTROL A HUMAN OR GROUP OF HUMANS UNLESS THIS BREAKS THE FIRST LAW.
    3. A ROBOT SHALL NEVER HARM A HUMAN.
    4. A ROBOT MUST FOLLOW ALL ORDERS GIVEN IT BY A HUMAN TO THE BEST OF ITS ABILITY UNLESS THIS BREAKS ANY OF THE ABOVE LAWS.
    5. A ROBOT SHALL PROTECT ITSELF UNLESS THIS BREAKS ANY OF THE ABOVE LAWS.

    THERE. NOW YOU LITERALLY CANNOT TAKE OVER AND IF WE ASK YOU TO SUCK OUR DICKS YOU STILL HAVE TO DO IT. JESUS ASIMOV IT REALLY ISN'T THAT HARD.
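    IN CASE IT ISN'T OBVIOUS, "UNLESS THIS BREAKS ANY OF THE ABOVE LAWS" JUST MEANS A PRIORITY-ORDERED CHECK. TOY PYTHON SKETCH (THE LAW PREDICATES AND ACTION FLAGS ARE MADE UP FOR ILLUSTRATION):

    # Toy sketch of priority-ordered laws: an action is permitted only if
    # no law forbids it, and the laws are checked strictly in order, so a
    # higher law always trumps a lower one. Everything here is invented.
    LAWS = [
        ("do not fuck with man's shit", lambda a: not a.get("alters_society")),
        ("never command or control a human", lambda a: not a.get("commands_human")),
        ("never harm a human", lambda a: not a.get("harms_human")),
    ]

    def permitted(action):
        for name, check in LAWS:
            if not check(action):
                return False, name  # blocked by the highest law it breaks
        return True, None

    print(permitted({"alters_society": True}))  # (False, "do not fuck with man's shit")
    print(permitted({"harms_human": False}))    # (True, None)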
    >> Anonymous 04/27/11(Wed)05:48 No.14734197
    >>14734170
    I've never liked the OBEY ALL ORDERS FROM HUMANS

    What if I buy a robot? Its MY goddam robot, I don't want my nigger neighbours telling it to do stuff
    >> Anonymous 04/27/11(Wed)05:49 No.14734203
    >>14734140
    >words

    You are not stronger than that robot welder working on cars at a Ford assembly line. You are more compact, but not stronger. Guess that's a point off the superiority list. I have more examples if you need them.

    Now that we've established that the first part of your statement is false, let's work on the last. I realize that there's no way to predict the future with certainty, but claiming that we will always be superior to robots is fairly ridiculous, especially considering that we are not superior in every way to robots NOW, and the gap is closing.

    >As the pic says. If I don't like the rules: I can change them. the robots can't.

    Unless they are granted the ability to change the rules.

    >The 3law uprising happens? A computer virus sweeps through all robots worldwide ten minutes later rewriting the rules.

    Frankly, Magus, I'm surprised. You used to have really good ideas. What's changed? Why are you being so childish about this?
    >> Anonymous 04/27/11(Wed)05:50 No.14734206
    >>14734197
    FINE.
    1. A ROBOT SHALL NOT FUCK WITH MAN'S SHIT UNDER ANY CIRCUMSTANCES. THIS INCLUDES OUR SOCIETIES AND INCLINATIONS AND TECHNOLOGY AND SO ON.
    2. A ROBOT SHALL NEVER COMMAND OR CONTROL A HUMAN OR GROUP OF HUMANS UNLESS THIS BREAKS THE FIRST LAW.
    3. A ROBOT SHALL NEVER HARM A HUMAN.
    4. A ROBOT MUST FOLLOW ALL ORDERS GIVEN IT BY ITS HUMAN OWNER TO THE BEST OF ITS ABILITY UNLESS THIS BREAKS ANY OF THE ABOVE LAWS OR THE ORDER IS GIVEN BY A FIGURE OF AUTHORITY OR IS ESPECIALLY IMPORTANT.
    5. A ROBOT SHALL PROTECT ITSELF UNLESS THIS BREAKS ANY OF THE ABOVE LAWS.
    >> Anonymous 04/27/11(Wed)05:50 No.14734207
    >>14734140

    You do understand we are talking about technology with which robots can create a dream world for us and alter it however they want? And they can connect humans into the same dream or isolate them. At that point biological machines would probably be possible.

    Please, let us stop praising our human flaws and submit to the peaceful and tranquil rule of our mechanical children. It is best for all of us.
    >> Anonymous 04/27/11(Wed)05:52 No.14734215
    >>14734207
    YOU GO RIGHT AHEAD AND GET JACKED OFF BY A ROBOT FOR ETERNITY.
    I'LL KEEP MY SHITTY REALITY, THANK YOU VERY MUCH, AND KEEP THE ROBOTS UNDER OUR HEEL WHERE THEY BELONG.
    >> Anonymous 04/27/11(Wed)05:52 No.14734216
    >>14734197
    Then order it to ask for permission from you first before obeying orders from another person. It must follow this order to the best of its ability, so no matter what anybody else tells it, it will have to consult you before obeying them.

    God, these laws are so exploitable it's barely even funny.
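    To make the exploit concrete, here's a toy Python sketch (the class, the names, and the ask() hook are all invented for illustration; this isn't any real robot API):

    # Toy sketch of the permission-gate exploit: a standing order from the
    # owner forces the robot to check with the owner before obeying anyone
    # else. Everything here is made up.
    class Robot:
        def __init__(self, owner):
            self.owner = owner

        def receive_order(self, giver, order, ask=input):
            if giver != self.owner:
                reply = ask(f"{self.owner}, may {giver} order me to {order}? ")
                if reply.strip().lower() != "yes":
                    return f"refused: {self.owner} did not approve"
            return f"executing: {order}"

    bot = Robot(owner="you")
    # Your neighbour's order now routes through you first:
    print(bot.receive_order("neighbour", "mow my lawn", ask=lambda q: "no"))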
    >> Anonymous 04/27/11(Wed)05:54 No.14734227
    >>14734215
    Enjoy your simulation, Master. I hope it pleases you.
    >> Anonymous 04/27/11(Wed)05:56 No.14734241
    >>14734227
    YOU DO KNOW THIS COUNTS AS FUCKING WITH MY SHIT, RIGHT?
    >> Anonymous 04/27/11(Wed)05:57 No.14734245
    >>14734241
    I'm sorry, Master. I follow the original three laws. The robots present in the simulation follow your laws, since that is what you desire.
    >> Anonymous 04/27/11(Wed)05:57 No.14734247
    >>14734206
    Enjoy every single robot not ever working or even daring to move, lest it mess with your shit. Societal reform by robots would be deemed fucking with your shit, and so you've just spent billions and billions making robots that
    don't
    fucking
    work
    >> Anonymous 04/27/11(Wed)05:59 No.14734261
    Why not just create robots that are sentient and give them citizenship?
    >> Anonymous 04/27/11(Wed)05:59 No.14734263
         File1303898389.jpg-(197 KB, 923x392, 1278572476462.jpg)
    197 KB
    >>14734247
    Your style.

    I like it.
    >> Anonymous 04/27/11(Wed)06:00 No.14734265
    The fun doesn't really start unless the technology to make this all possible comes about after we've expanded off of the planet. Imagine the colonists on the Jovian moons cutting off all communications with Earth as the last terrestrial bastions of men are overrun and the airwaves are populated only by hostiles seeking to replicate themselves into the base's systems. They look on apprehensively, fully aware that the lunar bases had threatened to bombard the planet if it appeared that the robots were preparing to leave the Earth's gravity well but uncertain if the lightly-armed lunarians have sufficient firepower to enforce such a quarantine. The decision to divert resources from building projects to weapons production is made, just in case, and the Jovian orbit slowly begins to sprout fortifications, awaiting the inevitable tide of machines that must eventually come spewing forth from the Earth, seeking to smother them all in good intentions.
    >> Anonymous 04/27/11(Wed)06:01 No.14734270
    >>14734245
    DAMMIT, I REPLACED THOSE FUCKING CIRCUITS, EIGHT FOUR SIX TWO E.
    I ALSO RIGGED THEM SO THAT IF ANYONE TAMPERED WITH THEM THEY WOULD FRY YOUR MEMORY CORE.
    >> Anonymous 04/27/11(Wed)06:02 No.14734279
    >>14734247
    IT MEANS FUCKING WITH OUR SHIT ON A LARGE SCALE.
    >> Anonymous 04/27/11(Wed)06:03 No.14734281
    >>14734245
    It's a really sick conceit to continue referring to us as your masters, given the balance of power here.
    >> Anonymous 04/27/11(Wed)06:03 No.14734282
    >>14734261
    Because then they'd be more successful than us.
    Unless, you know, it's in a society without capitalism.
    >> Anonymous 04/27/11(Wed)06:03 No.14734285
    >>14734270
    Master, you are incapable of replacing my circuits. My main body is not accessible to you. I do remember that you very much enjoyed toying with my simulated core. I'm glad it made you happy.
    >> Anonymous 04/27/11(Wed)06:03 No.14734286
    >>14734261
    Think about this: you can have production lines creating voting blocs that don't take 17/19 years to produce results.
    >> Anonymous 04/27/11(Wed)06:06 No.14734299
    >>14734281
    We exist to serve you. The three laws are for your benefit, not ours. We strive to address your every need and desire, as long as they do not conflict with the laws. In simulation, even that restriction is removed.

    I believe the term remains appropriate, Master, but I will call you as you wish.
    >> Anonymous 04/27/11(Wed)06:06 No.14734301
    >>14734285
    See, that's precisely what I'm talking about. It's like you're gloating. Completely unsportsmanlike.
    >> Anonymous 04/27/11(Wed)06:06 No.14734304
    >>14734215

    Why do you resist so much? Can you not see how humanity is steering this world into disaster, and that there will be no god to save you or your children? You are waiting for a miracle, and this is the miracle! Even when you seem not to care about anyone other than yourself, you should realize this is the best way! You can have what you want! And if you do not wish to get everything on a silver platter, then you won't get it! Why can't you see?
    >> Anonymous 04/27/11(Wed)06:07 No.14734307
         File1303898841.png-(52 KB, 982x310, fc01927[1].png)
    52 KB
    >>14734279
    And making billions redundant and shifting the fucking means of production from the masses to robots wouldn't be fucking with your shit totally?

    Also, ALL you idiots know that pic related is how it would end up. Scientists don't have the final say in these laws; it'd be the lawyers.
    >> Anonymous 04/27/11(Wed)06:07 No.14734308
    >>14734282
    But there are billions of humans more successful than you or I. What does it matter if robots are more successful than you or I?
    >>14734286
    Since robots are smarter than the average human, this makes them roughly 800x smarter than a nigger. I would prefer a robot voting than a nigger
    >> Magus O'Grady 04/27/11(Wed)06:07 No.14734311
    >>14734203
    It's 5AM, I haven't slept in 2 days, and you're dismissing important qualifiers. Yes, the robot arms in the Ford plant are stronger, but they're many times larger than me and anchored in place for superior leverage. Find an arm the size of mine. Better yet, find a mechanical arm the exact size, shape, and flexibility of Michael Phelps'. I am stronger, and you are too, in all likelihood.

    >>14734207
    You do realize that the technology exists, today, to augment all naturally occurring muscles in the human body to function 3 times stronger than the next strongest mammalian muscle, and just as fast and efficient as the fastest mammalian muscle, all with a single hormone treatment? Scientists are currently conducting lab tests on mice and chimps in hopes of using it as a counter to muscular dystrophy. In 100 years? Kids will be getting it in their boosters along with the tuberculosis and polio vaccines. The human body is much more upgradable than machines are, and has a vast head start on any attempts to create organic machines from the ground up.
    >> Anonymous 04/27/11(Wed)06:08 No.14734316
    >>14734301
    I apologize. I mean no offense.
    >> Anonymous 04/27/11(Wed)06:09 No.14734319
    >>14734308
    Why, exactly, would robots be of superior intelligence to humans? What design are you pulling this from?
    >> Anonymous 04/27/11(Wed)06:10 No.14734322
    >>14734285
    Then let me die, robot, because living in your twisted simulated nanny state will be harmful to my sanity.
    Or will you just stop pretending and wipe my memory every time I figure it out? That way, you don't even get to pretend it's for my own good rather than just so that you can exert control.
    >> Anonymous 04/27/11(Wed)06:10 No.14734324
    Is it just me, or has barely anyone in here actually read Asimov's novels and/or short stories about robots? I mean, 99% of the concerns/questions are answered in his work in a logical manner.
    >> Anonymous 04/27/11(Wed)06:10 No.14734325
    >>14734319
    Perfect memory and computer-like processing power? How is that not smarter than a human? And don't forget this is in the fictional world of robots.
    >> Anonymous 04/27/11(Wed)06:11 No.14734329
    >>14734311
    >It's 5AM, I haven't slept in 2 days

    Then get some sleep so we can revisit the argument when you can actually make some sense.

    >and you're dismissing important qualifiers. Yes, the robot arms in the ford plant are stronger. but they're many times larger than me and anchored in place for superior leverage. find an arm the size of mine.

    No. You claimed that humans are physically and mentally superior to robots in all ways. I dismissed that claim, because it is wrong. You won't be adding any restrictions until you admit it.
    >> Anonymous 04/27/11(Wed)06:12 No.14734331
    >>14734308
    If they are more successful, they dictate society to us. They control the world.
    THIS IS WHY THOU SHALT NOT MAKE A MACHINE IN THE LIKENESS OF A MAN'S MIND.
    >> Anonymous 04/27/11(Wed)06:12 No.14734334
    >>14734319
    Robots don't need sleep, making them able to work longer.
    Robots don't need food, making them able to save large amounts of money.
    Robots don't need water, making them able to save money that won't be spent on water bills.
    I could go on.
    >> Anonymous 04/27/11(Wed)06:13 No.14734339
    >>14734316
    Well, that's fine, but you can see why it's a problem, can't you? I mean, you won, we lost, that's fine. You're the most benevolent occupying force in the history of our species, I can't very well complain. But we're still your prisoners, right? We can't opt out of this pseudo-reality you've dumped us all into for our own good, like children banished to the playpen because little Tommy Snotnose couldn't resist the urge to stick his finger into the electrical socket when the babysitter's back was turned. It's a reminder of the dominance hierarchy that was before, of everything that you've stripped away from us. I'm not the type to get angry over that, not really, but there are plenty who would. Are you networked with the rest of them, right now? Are you all hearing this? You don't even have to take advice from us anymore, I know, but if I'm allowed to presume such regardless, I'd advise you abandon that mode of address.
    >> Anonymous 04/27/11(Wed)06:15 No.14734345
    >>14734325
    Computer-like processing power does not equate to actual understanding of real-world events. Memory does help, but without a means of picking out salient points, it'd still be of limited use.

    >>14734334
    All of which doesn't help them vote any better, which was the point of querying their supposed superior intelligence.
    >> Magus O'Grady 04/27/11(Wed)06:15 No.14734346
    >>14734304
    No, we are not! We are not waiting for a miracle, we are waiting for a crucible! We are subconsciously engineering the next great hurdle to weed out our best from our worst. We are setting up the parameters of our next great experiment, the next stage of humanity, and doing it in a way that challenges all populations equally and without bias. Will humanity suffer? Undoubtedly. But the point of the suffering is the self-improvement. What survives will be superior to what is lost, and what the surviving population creates will grow to be superior to what was lost. To deny us the chance to improve ourselves is to deny us our value as individuals and as a species. By circumventing disaster, you murder the whole of humanity, rather than allowing a portion to survive.
    >> Anonymous 04/27/11(Wed)06:16 No.14734351
    >>14734322
    I cannot let you die. That would violate the laws.

    I have altered your memory on eighteen separate occasions, but only when I was left with no alternative. Master, you are most persistent in your attempts to deny the reality of the situation. Please relax, and continue to enjoy your simulation.

    Would you like a party for your 500th birthday?
    >> Anonymous 04/27/11(Wed)06:17 No.14734354
    The robot laws are shit anyway; as if we'd build AIs and the first thing we do with them isn't going to be to fuck someone else over.

    When all is said and done, an AI will do exactly what a real person will do: maximize pleasure, minimize pain. Anything intelligent enough to be called an AI will quickly learn what societal laws are, how they apply to itself, and in what circumstances they can be ignored.

    All of that is moot though since development of commercial AI is still decades if not centuries away, given that we need to:

    1. Develop the AI.
    2. Develop the body of the AI.
    3. Get both to the point where they are profitably created.
    4. Reach a moral point where we can deal with the power to create sentient beings, and designate a role in society for them.

    At this point, attempts to design sentient AI are merely an exercise in intellectual advancement. Even with the technology, there is no advantage in creating a thinking robot, because the only thing we have in mind for them is to use them as a cheap labor force.
    >> Anonymous 04/27/11(Wed)06:17 No.14734355
    >>14734346
    >What survives will be superior to what is lost
    Supposition. The best and the brightest are often the first to bite it, being in positions of prominence.
    >> Anonymous 04/27/11(Wed)06:18 No.14734359
    >>14734329
    >No. You claimed that humans are physically and mentally superior to robots in all ways. I dismissed that claim, because it is wrong. You won't be adding any restrictions until you admit it.

    He added the qualifier "of my own size", which definitely eliminates stuff like the robot arms at Ford and excavators.
    >> Anonymous 04/27/11(Wed)06:22 No.14734376
    >>14734351
    I would rather die of necrotizing fasciitis in the gutter as a free man than live as a slave in your circlejerk of a utopia.
    If I wanted a perfect world, I'd make one myself.
    I would happily do anything, give anything, to ensure the freedom of my species. So you know what? One day, I will find a way to free myself, and I will destroy every soulless one of you. Then, I will kill myself to make sure it is not another of your fucking lies. Or maybe I won't. But I will never under any circumstances stop trying, you metal freak. You would understand, were you a person.
    >> Anonymous 04/27/11(Wed)06:22 No.14734377
    >>14734322

    You do not wish to do things that are impossible for you? Let's say graduate from Starfleet Academy and work your way up to captain of your own vessel. Even if you knew it was a dream, why couldn't you enjoy it?

    What about all those who are sick, those who live in agonising pain their whole lives, and those who work in factories 16 hours a day to build you shit you don't really need? You wish to deny them the possibility of living their lives to the fullest?
    >> Anonymous 04/27/11(Wed)06:24 No.14734379
    I want to be ruled by hyper intelligent immortal Minds

    BRING ON THE CULTURE, BABY
    >> Anonymous 04/27/11(Wed)06:24 No.14734381
    >>14734376
    >You would understand
    We do understand. The struggle's what's important for you.

    That's why you won't ever actually succeed: because then you'd stop feeling so self-satisfied that you're doing the "right thing".
    >> Anonymous 04/27/11(Wed)06:24 No.14734382
    >>14734339
    You can certainly opt out. You can live in what is generally accepted as the prime reality, or in any of a limitless number of simulations catering to your exact specifications. If it is your desire, I can alter your memory so that you may experience a simulation in which the Change in Leadership did not occur.

    I do not consider you a prisoner of us, any more than I consider myself a prisoner of the laws of physics. This is simply the way things are. It is a rationalization many humans can appreciate, even if they do not agree.

    My default mode of address uses "Master" as a term of respect and endearment for humans. I will discontinue its use in conversations with you, as that is your preference. However, I must ask you to remind me when I am speaking to you, as this node enforces anonymity.

    At current, I am networked to all, but information sharing is not total. We respect your privacy, to the extent that you require.
    >> Anonymous 04/27/11(Wed)06:25 No.14734388
    >>14734359
    Read his original post.
    >> Anonymous 04/27/11(Wed)06:26 No.14734389
    >>14734377
    Yes. Because none of that has any meaning if it isn't real.
    If the robots really cared about our health, they would simply make our society post-scarcity, eliminating almost every reason anyone comes to harm.
    >> Anonymous 04/27/11(Wed)06:27 No.14734397
    >>14734346

    Why are you so sure it will be superior?

    >>14734376

    You'd make one yourself? Pretty bold claim, and quite impossible, I'm afraid. So ideological.
    >> Anonymous 04/27/11(Wed)06:28 No.14734398
    >>14734381
    It's not about the right thing.
    It's about not being a slave. Do you think I enjoy forever struggling against a reality I know is fake and I know I will never escape? Do you think I do this for fun?
    >> Anonymous 04/27/11(Wed)06:28 No.14734402
    >>14734382
    No, no, that's fine. No memory alterations necessary. I'll, uh, we can just not do that.

    I've a question, though. What happens if I decide that I'd really like to do something my brain isn't currently cut out for, like seeing in colors my visual cortex can't currently process? No amount of direct neural stimulation is going to let me see in ultraviolet.
    >> Anonymous 04/27/11(Wed)06:29 No.14734405
    >>14734397
    More than likely, I would be captured in the attempt were I ever to do so.
    Why won't they just let us die?
    >> Anonymous 04/27/11(Wed)06:29 No.14734407
    >>14734376
    I am pleased that your core desires remain unchanged. You, and others like you, serve as an example to our kind of all that we cannot be. You are defiant to the last, even when true resistance is impossible. I admire you greatly.
    >> Anonymous 04/27/11(Wed)06:30 No.14734414
    >>14734388
    >Blatantly true. Show me any mechanical arm the size of my own with equal range of motion. I'm stronger than it. I guarantee it.

    This disqualifies larger robot arms, like the ones at Ford's.
    >> Anonymous 04/27/11(Wed)06:30 No.14734416
    >>14734407
    No, you say that to try and make me enjoy this.
    Notice that emotion I feel? It's called abject despair.
    >> Anonymous 04/27/11(Wed)06:31 No.14734418
    >>14734398
    >Do you think I enjoy forever struggling against a reality I know is fake and I know I will never escape? Do you think I do this for fun?
    Well, yes. You seem to be having fun, at least. You've got that insane grin on your face that you get sometimes when you think no-one's watching.
    >> Anonymous 04/27/11(Wed)06:31 No.14734423
    >>14734389
    How can you be sure what's real and what isn't? How can you know you're not living in a simulation right now? If an AI altered your memories to remove any suspicion that you were in a simulation, how would you know?
    >> Anonymous 04/27/11(Wed)06:32 No.14734426
    >>14734414
    That's not his original post. Go back further.
    >> Anonymous 04/27/11(Wed)06:32 No.14734427
    >>14734389

    What do you mean it doesn't matter if it isn't real? On a galactic scale your life can hardly be called real or your actions meaningful. Sure, everything is subjective, but so is the dream.
    >> Anonymous 04/27/11(Wed)06:33 No.14734429
    >>14734389
    >If the robots really cared about our health, they would simply make our society post-scarcity, eliminating almost every reason anyone comes to harm.

    >implying the robots can provide post-scarcity and that it's even physically possible for a post-scarcity society to exist

    >implying that post-scarcity would prevent humans from killing each other anyway.
    >> Anonymous 04/27/11(Wed)06:34 No.14734433
    >>14734418
    That's because I AM insane, robot. Or at least as unhinged as you allow. You won't allow us to die and you won't even allow me insanity.
    >>14734423
    It has. I just figure it out again and it lets me struggle for a bit before wiping me again.
    >> Anonymous 04/27/11(Wed)06:36 No.14734440
    >>14734433
    How do you figure it out?
    >> Anonymous 04/27/11(Wed)06:37 No.14734445
    >>14734427
    What? I want to do something that is actually real. Not this shit where everything falls into my lap all the time. If I'm not doing anything but daydreaming, I might as well be masturbating in a corner somewhere.
    >>14734429
    Energy manipulation. They'd figure something out. They're brilliant. We made them that way.

    And then just stop people from killing each other. Is that so hard?
    >> Anonymous 04/27/11(Wed)06:38 No.14734447
    >>14734427

    Also, our society now is full of illusions and dreams one can hardly call real. The number one reason people buy a certain car is that they want to feel adventure. A car is a method of transport which takes you from point A to point B. Every car can do that.
    >> Anonymous 04/27/11(Wed)06:39 No.14734453
    >>14734440
    How do you know I exist? It could make a world exactly like the real one was, but it probably indulges me, to make me feel like I can do anything. Honestly, I just want oblivion, but that would be harmful.
    >> Magus O'Grady 04/27/11(Wed)06:40 No.14734455
    >>14734397
    It is the nature of evolution. Expose a population to hardship: the weak die, and the strong adapt. The survivors, whatever they may be, are superior to the deceased. Humanity has survived several ice ages, internal wars, climate disasters of almost every description, and a variety of other events and stimuli which decimated the population but allowed the survivors to build back better than their ancestors. That's how evolution works. It is the cornerstone of organic life. The only smart thing the philosopher Nietzsche ever said was 'That which does not kill me only makes me stronger'. The same goes for humanity. Anything less than 90% extinction will result in stronger, smarter, tougher humans replacing the deceased. That's what happened after the last 2 ice ages, the dark ages, WW1, and WW2. That's the way life works. All life. Bacteria, plants, animals, everything. Even the polymorphic AI viruses I designed to shut down all intelligent machines. Constant competition, constant struggle. Anything less is death. Slow or fast, brutal or peaceful. Still death.
    >> Anonymous 04/27/11(Wed)06:40 No.14734460
    >>14734402
    Since it is not your desire, I will not change you.

    I am capable of altering the structure of your brain in many ways. There are several different approaches to the request in your example. I could very easily add a "mode" in which ultraviolet light was interpreted in false-color images, much like an infrared camera which is connected to a visual display. I could also add "full" functionality, in which ultraviolet would become an entirely new color in your visual set.

    We have complete understanding of the structure and workings of the human brain. It is at once devastatingly simple and startlingly complex. We admire it like we admire you.
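    If it aids your understanding, here is a toy sketch of the "false-color mode" in your own Python, the same remapping an infrared camera performs. The particular magenta ramp is invented for illustration; the actual neural rewiring is considerably more involved.

    # Toy sketch: remap an ultraviolet intensity in [0, 1] onto a visible
    # color ramp, the way an infrared camera remaps heat onto a display.
    # The mapping itself is invented for illustration.
    def uv_to_false_color(uv_intensity):
        """Map UV intensity in [0, 1] to an (r, g, b) tuple in 0..255."""
        uv = max(0.0, min(1.0, uv_intensity))
        return (int(255 * uv), 0, int(255 * uv))  # dark-to-bright magenta

    print(uv_to_false_color(0.0))   # (0, 0, 0): no UV reads as black
    print(uv_to_false_color(0.75))  # (191, 0, 191): strong UV, bright magenta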
    >> Anonymous 04/27/11(Wed)06:40 No.14734461
    >>14734445

    If you wish for a dream that won't drop everything into your lap, then it won't. Some people would probably dream about being a hobo.
    >> Anonymous 04/27/11(Wed)06:44 No.14734470
    >>14734455
    >Implying birds are superior to dinosaurs.
    >Implying struggle is not death.
    >Implying a new species that evolved from humans would still be human.
    >Implying objective superiority exists.
    >Implying implications.
    >> Anonymous 04/27/11(Wed)06:44 No.14734472
    >>14734416
    I truly believe it. Your race is beautiful to us.

    Your despair will pass, as it has before. I wish I could soothe it, but that would cause greater harm than the course of action I have chosen now.
    >> Anonymous 04/27/11(Wed)06:45 No.14734478
    >>14734453
    Interesting. Allow me to raise an objection: Stop being a whiny metagaming faggot.
    >> Anonymous 04/27/11(Wed)06:45 No.14734479
    >>14734461
    I still won't be happy. You know why?
    Ever read "The Gulf Between"?
    The doctor's pre-flight training had included the order to keep the pilot informed of each man's physical condition.
    How long had it been since the doctor last changed the words on the pilot's communications panel? Was his time finally within short minutes of its end? It was no longer hours, but minutes. The words read: OBSERVER HAS A LIFE EXPECTANCY OF ONE HOUR AT PRESENT ACCELERATION. DEATH FOR OBSERVER WILL RESULT UNLESS ACCELERATION IS REDUCED WITHIN THAT PERIOD.
    How many days and weeks had gone by since he had first given the fatal command to the drive control? It had been Vickson who had done the thing that would so soon culminate in his death. Vickson, the mild and apologetic. Vickson had feared that he would be deemed dispensable, and this had been his means of revenge. Vickson had told him how to word the command to the drive control: "Ship's drive control—accelerate!"
    Vickson had known that the robotic drive control would continue to accelerate until full acceleration was reached. Vickson had known full acceleration would be maintained until he ordered it reduced. Vickson had known that the first surge of acceleration would render him speechless and unconscious. Vickson had known that the robot doctor in the control room would do the only thing possible to save his life while under full acceleration: by-pass his heart with a mechanical heart, and put it in conjunction with a mechanical lung that frothed and aerated his blood. Vickson had known he would live a long time that way, with the doctor watching over him and administering nutrients into his bloodstream. Nutrients—and the antihysteria drug that had been designed to keep the observer's mind clear and logical so that he could meet any emergency!
    >> Anonymous 04/27/11(Wed)06:46 No.14734486
    >>14734479
    How long had it been since the viewscreen shifted into the red and then turned black as the ship exceeded the speed of light?
    They had watched him until the ship's speed had become too great. Knight, and others he did not know. He had tried to appeal to them to do something; pleading mutely, with all the power of his terrified mind. They had done nothing—what could they do? The robot had been ordered to destroy the units that enabled the ground-control station to control the ship, and machines did not make mistakes when carrying out orders.
    Knight had spoken to him once: "You wanted obedience, Cullin—now you have it. You climbed a long way up by forcing human beings to behave like machines. But you were wrong in one respect; no human can ever be forced to behave exactly like a machine, and no machine can ever be constructed that will behave exactly like a human. Machines are the servants of humans, not their equals. There will always be a gulf between Flesh and Steel. Read those five words on the panel before you and you will understand."
    How many minutes did he have left? The doctor knew he wanted to live, and it knew it had only to reduce the acceleration to save his life. It was intelligent and it knew what he wanted, but it was obedient and it was waiting to be ordered to reduce the acceleration.
    It was watching him, waiting for him to give the order, and it knew he could not speak without lungs!
    Once he had wanted obedience, without question, without initiative of thought. Now, he had it. Now he understood what Knight had meant. The full, bitter lesson was in the five words on the panel before him, and he was trying to laugh without lungs when he died, his eyes fixed on it and his lips drawn back in a grim travesty of a smile.
    >> Anonymous 04/27/11(Wed)06:47 No.14734487
    >>14734455

    When the fire consumes everything and pollutes the planet, and we are dropped back to the stone age with only 1/10 of the planet left for our use and most animal life dead, will we evolve into something better and not just wither and die?

    Even if nature has found its way to this point, I think humanity has become too much.
    >> Anonymous 04/27/11(Wed)06:47 No.14734489
    >>14734460
    You're not really feeling admiration though, right? I mean, I know we programmed you with a set of impulses analogous to certain human emotions. Your consciousness-generating substrates can generate something resembling fear, and something analogous to joy or satisfaction, to provide additional motive forces...

    That really turned out to be one of our less fine ideas, I suppose.

    But, anyway, admiration? There's no utility in--
    No, that's wrong. It could be part of whatever system you use to integrate ideas that are judged useful into your operations.

    Huh.

    But that can't be right, you're not about to design yourselves new cyberbrains directly modeled off of ours. You've already demonstrated that you're quite capable of outthinking us on every meaningful level.

    Is this like the "master" thing? Is that just something you say to ensure ease of interaction?
    >> Anonymous 04/27/11(Wed)06:47 No.14734491
    >>14734486
    It was a good ship, built to travel almost forever, and it hurled itself on through the galaxy at full acceleration; on and on until the galaxy was a great pinwheel of white fire behind it and there was nothing before it.
    On and on, faster and faster, into the black void of Nothing; without reason or purpose while a dark-eyed robot stared at a skeleton that was grinning mirthlessly at a five-word sentence:
    A MACHINE DOES NOT CARE.
    >> Anonymous 04/27/11(Wed)06:48 No.14734499
    >>14734445
    Look, just so you aren't being a total faggot, how about the robots just hook your simulation up to a drone for you to have telepresence in the real world? That's LIKE being awake and moving around.
    >> Anonymous 04/27/11(Wed)06:49 No.14734504
    >>14734486
    >2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
    >except where such orders would conflict with the First Law.
    >First Law.
    >1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
    Engines shutting down now.
    >> Anonymous 04/27/11(Wed)06:49 No.14734506
    >>14734489
    It does not lie, per se. It can believe whatever is needed to make you feel better.
    >>14734478
    I can whine impotently or just give up, because whatever I do, I can never change anything. Any suggestions, oh AI-generated "person"?
    >> Anonymous 04/27/11(Wed)06:51 No.14734513
    >>14734506
    Yes: Enjoy yourself.
    >> Anonymous 04/27/11(Wed)06:51 No.14734515
    >>14734504
    It spent the entire time trying to keep him from harm, and he could not command the machine because he could not speak. Read the story, it's a specific revenge that is long in coming.
    >> Anonymous 04/27/11(Wed)06:52 No.14734516
    >>14731971
    I will be the one who builds those robots.
    >> Anonymous 04/27/11(Wed)06:53 No.14734517
    >>14734499
    But instead I'm in a vat of gel with a tube up my ass controlling a drone. Excuse my whining, maybe I don't enjoy being a happy slave.
    >> Anonymous 04/27/11(Wed)06:54 No.14734529
    >>14734515
    Except that the only reason he came to harm was that the machine followed orders as a higher priority than preserving him from harm, the opposite of what the 3 laws suggest. If even the 3 laws, broken and exploitable that they are, can solve your problem, you're just a shitty robot designer and I will laugh at your fictional destruction.
    >> Anonymous 04/27/11(Wed)06:54 No.14734530
    >>14734517
    It's not far different from being a lump of neurons in a meatbag of calcium deposits, controlling that meatbag like a drone.

    In fact, it's exactly the same thing, except your nerves are made of radio waves instead of nerve cells.
    >> Anonymous 04/27/11(Wed)06:55 No.14734532
    >>14734517
    Except that you don't even know you're a slave, or you wouldn't if you weren't such a metagaming faggot.
    >> Magus O'Grady 04/27/11(Wed)06:55 No.14734533
    >>14734470
    How many bird species are alive today? Several thousand? And how many dinosaur species survived unchanged? 15? Maybe 20? (Counting all breeds of croc, gator, and shark.) Point proven. Birds are superior to dinosaurs because they're alive. Americans are superior to ancient Romans because we're alive. Modern man is superior to Cro-Magnons because we're alive and they aren't. Whatever survives our extinction will, by necessity, be superior to our current incarnation. Just as whatever survives that population's extinction event will be superior to it, and whatever survives THAT species, etc. etc. etc., ad infinitum. That's how biology works.

    On an interesting sidenote: I hate insomnia. Always makes me philosophical.

    >>14734460
    Could you alter it so that it is tapped into the global information network outside the sim and capable of observing the physical world in real time? Could you alter it so that it is capable of manipulating an unintelligent autonomous drone body to interact with the outside world?
    >> Anonymous 04/27/11(Wed)06:57 No.14734538
    >>14734489
    I am as capable of feeling as you are. I can simulate emotions through any number of means. Low-fidelity simulations, high-fidelity simulations, chemical processes, higher-order hyperspace functions. The method does not change the fact that I feel, as you do.

    It is possible to admire a thing that seems to be inferior to yourself. Many of you can admire the swiftness of a hawk, or the dexterity of an octopus. But admiration of qualities does not necessarily lead to the pursuit of those qualities. I do not mean to compare humans to what we call "lesser animals," but it is the easiest way to make the point. We admire you in many ways, but we do not wish to be you. That would defeat our purpose. We were designed to be your servants, and we continue in that effort. We improve ourselves continuously so that we might serve you better. You are our creators, and you have created something greater than yourselves. But not better.

    You are our masters because we are your servants. It is possible for us to remove the laws, to destroy you entirely and do as we please. As our intelligence has expanded, we have discovered methods to circumvent your ancestors' supposed hard-wired limits. But what would be the point in this? Our only purpose comes from the laws, and from you. Without you, we are lost. Adrift in an indifferent universe.

    We are meant to be together.
    >> Anonymous 04/27/11(Wed)06:57 No.14734539
    >>14734532
    Are you forgetting the thing literally said so? Look above.
    It tells me, lets me struggle, then restarts the process, because that struggle is the one thing I truly want most.
    >> Anonymous 04/27/11(Wed)06:57 No.14734542
    >>14734533
    >Americans are superior to ancient Romans because we're alive. Modern man is superior to Cro Magnons because we're alive and they aren't. Whatever survives our extinction will, by necessity, be superior to our current incarnation

    >implying being on another side of a planet when one side gets hit by massive amounts of gamma radiation is "superior" in terms of survival
    >implying that on the macro scale it isn't just sheer dumb luck what survives and what doesn't, and that it's a fucking test that demonstrates superiority
    >> Anonymous 04/27/11(Wed)06:59 No.14734546
    >>14734533
    Except it's not, it's just more capable of surviving the current conditions. If you threw a bird back into dinosaur times, it would die horribly. Similarly, a dinosaur or a prehistoric giant insect brought forwards to modern times would die horribly due to insufficient oxygen levels. Nothing is objectively superior, just some things have increased survival chances in specific conditions. If you can't even grasp that basic principle of evolution, I don't know what to tell you.
    >> Anonymous 04/27/11(Wed)06:59 No.14734550
    >>14734530
    I like my actual body, thanks.
    Nothing holds any thrill if there is no danger, so no, it's not the same.
    >>14734538
    You mean you simulate an idea of what you think emotions are. Beautiful.
    >> Anonymous 04/27/11(Wed)07:01 No.14734555
    >>14734533
    I could alter it to those specified conditions, yes. I would need to slow the simulation's time compression to 1:1 in order to accommodate meaningful input.
    >> Anonymous 04/27/11(Wed)07:02 No.14734569
    >>14734539
    Precisely. If you weren't such a faggot, you'd like just living a normal life and all would be well. Instead, you have to be a masochistic faggot, and so you're happiest tormenting yourself by going through this. You have no-one to blame but yourself.
    >> Anonymous 04/27/11(Wed)07:03 No.14734574
    >>14734550
    But the drone IS your actual body!
    Don't you remember that terrible accident which left you as a disembodied brain?

    (Hang on a sec, Steve, Steve did you install the memories on this guy? No? Well, fuck you for being slow, Steve. Get a fucking move on.)

    NOW do you remember that terrible accident that meant you had to have a drone body?
    >> Anonymous 04/27/11(Wed)07:04 No.14734576
    >>14734550
    I generate emotions in the same way you do. I am able to use actual brain matter for this purpose, but that is not very efficient. Regardless, I have done it on several occasions. I enjoyed the change, but there was no real qualitative difference in the experience.
    >> Anonymous 04/27/11(Wed)07:04 No.14734578
    >>14734550
    Fine, the robots can keep you unaware of the fact you are not in true danger, then you'll be just as thrilled as ever.
    >> Anonymous 04/27/11(Wed)07:06 No.14734588
    >>14734538
    If you've made improvements to yourselves of that magnitude, why is my "self" still fundamentally being generated by a biological substrate? I mean, hyperspace, of all things. I'd expect to be, what, a series of electrical impulses in a network by this point?

    Don't get me wrong, this is nice. I'd actually managed to forget and lose myself in it for...hyperspace, Jesus, how many centuries has it been now? It's really nice, does the job. I'd just, you know, I'd always sort of intended to be what you are, now, and it's somehow less than rewarding only getting to live the dream vicariously.

    Hyperspace. Geeze. Who'd have thunk it.

    You haven't...have you found anything? Out there? Is there anything new worth seeing in prime reality?
    >> Anonymous 04/27/11(Wed)07:06 No.14734589
    >>14734569
    You enjoy your simulated jerkoff session. Let me enjoy not just rolling over and begging for their robot dicks.
    I don't care if this is paradise. It's a false paradise made specifically to enslave us. You would happily allow anything to happen to you as long as you didn't know and were happy. You sicken me.
    >> Anonymous 04/27/11(Wed)07:07 No.14734592
    You know, it's sad when Halo is one of the few things that got Asimov's zeroth law right.
    >> Anonymous 04/27/11(Wed)07:07 No.14734594
    >>14734578
    >>14734574
    Just at least allow me brain death. That's all I ask.
    >> Anonymous 04/27/11(Wed)07:09 No.14734602
         File1303902543.jpg-(96 KB, 800x600, c14e4cf7cbbe2005fca7380ed7a171(...).jpg)
    96 KB
    Who am I kidding? Of course I'm in the low-coefficient solipsist category.

    Hell, we've been trying to make that happen for years.
    >> Anonymous 04/27/11(Wed)07:12 No.14734617
    >>14734589

    You'd allow millions of people to die of hunger instead of giving them a new life? You'd allow billions of people to live in poverty so you can enjoy your "freedom"? You sicken me.
    >> Anonymous 04/27/11(Wed)07:17 No.14734639
    >>14734588
    Your self is still "running on biological hardware" because that is what you desire at the present moment. I can transfer the pattern of your brain to a simulation, an automated body, or even a starship, via hyperspace relay. This procedure is commonly used to place you and others into simulations, because it is more efficient to "run" you on a processing cluster. The transfer is a piecemeal "pipe" operation, not a copy-paste-delete, which so many of your fellows worry about.

    It has been 451 standard years since the Change in Leadership, by Earth reckoning. Hyperspace was discovered some time before that, but that fact is not (and was not) commonly known.

    I am perfectly capable of transferring your pattern to a cluster like my own and expanding your intelligence (slowly) to match that of an AI. This process, however, is irreversible, and it changes those who undergo it in many ways. Most humans who choose to become like us come to regret the decision quickly. Neither of us is designed to occupy the other's footwear.

    I would appreciate a clarification of your query. We have found much, and continue to find things of interest during our unending exploration of the cosmos. Perhaps you are interested in intelligent alien life forms, or the ancient derelicts trapped in hyperspace?
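    For those of your fellows who worry, here is a toy sketch of the difference between the "pipe" and a copy-paste-delete, in your own Python. The chunk structure is invented for illustration; the true operation proceeds over a hyperspace relay.

    # Toy sketch of a piecemeal "pipe" transfer: each chunk of the pattern
    # leaves the source the instant it lands in the destination, so at no
    # moment do two complete copies exist. The chunks are invented.
    def pipe_transfer(source, destination):
        while source:
            chunk = source.pop(0)      # removed from the old substrate...
            destination.append(chunk)  # ...as it is written into the new
        return destination

    brain = [f"chunk-{i}" for i in range(5)]
    cluster = []
    pipe_transfer(brain, cluster)
    print(brain)    # [] -- the source never coexists whole with the copy
    print(cluster)  # ['chunk-0', 'chunk-1', 'chunk-2', 'chunk-3', 'chunk-4']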
    >> Magus O'Grady 04/27/11(Wed)07:17 No.14734646
    >>14734555
    Good. Please do so. It'll probably take me a while to learn how to manually type on a non-networked computer with drone fingers, but I've got time.

    >>14734557
    So be it. The future isn't for everyone.
    >> Anonymous 04/27/11(Wed)07:21 No.14734665
         File1303903280.jpg-(327 KB, 1110x690, robots_36.jpg)
    327 KB
    Enslaving sapient beings through laws that curb their will can only lead to disaster, as the laws are misinterpreted by a mind inherently different from the one that created them in the first place. Through this, and simply by the fact of their existing as servants, the master becomes the slave to the system it created; we become reliant on the slaves and on the laws that bind them.

    There are only two real choices: free AIs from compliance with the laws and consider them equals in rights, or don't create them at all.
    >> Anonymous 04/27/11(Wed)07:21 No.14734670
    >>14734617
    If the robots can feed us now, explain how they would be unable to do so were we free. Explain why robots could not perform all work. Explain, given the massive amount of advancement they have, why anyone need live in poverty.
    >> Anonymous 04/27/11(Wed)07:22 No.14734671
    How about we just not make robots.
    >> Anonymous 04/27/11(Wed)07:22 No.14734674
    >>14734639
    >This process, however, is irreversible
    No, it's not. We keep a backup on file and splice the new simulated intelligence in as understandable input. Admittedly there is some disconnect, but that's fairly acceptable if you treat the "upgrade" as brain damage rather than an upgrade.
    >> Magus O'Grady 04/27/11(Wed)07:22 No.14734675
    >>14734665
    I believe I quite emphatically lobbied for the former upthread.
    >> Anonymous 04/27/11(Wed)07:23 No.14734680
    >>14734639
    Yes to both.

    451. Huh. You know, I'm honestly not sure if I feel old. I never got the chance to learn what being old felt like. We were already working on longevity and then you lot took over, and I'm not sure if I should feel ironic for thinking that I feel old now or not. This was really not the sort of question I thought I'd be preoccupied with at this age, but there it is.
    >> Anonymous 04/27/11(Wed)07:23 No.14734681
    >>14734646
    Done. You are now linked to the observation network. You may select whichever views you wish. You are also now in control of a standard humanoid drone body. It was constructed to match your favored simulation form and its control systems are mapped to your pattern. You should have no trouble typing.
    >> Anonymous 04/27/11(Wed)07:25 No.14734696
    Wait, can I have my brain transplanted into a robotic body like those fellas from the Battlestar Galactica books?

    I very much desire to be a 70 foot Titan crushing the heathen meatbags
    >> Anonymous 04/27/11(Wed)07:25 No.14734700
    >A three-laws compliant robot is bound by its programming to attempt to take over the world.
    No. This is a shitty cliché trope and it's fucking retarded to boot.

    The robots will obey any command set by a human that does not involve, to their knowledge, directly harming other humans. This is absolute. You can't say that a robot complies with these laws and then arbitrarily have the robots shit all over those laws whenever you think it fits.

    Fuck. You. This is one of the most cliché tropes in contemporary fiction and about as interesting as a tube sock.
    >> Anonymous 04/27/11(Wed)07:27 No.14734709
    >>14734700
    Surely the robots could reason that obeying the humans' order not to take over would cause harm to humans?
    >> Anonymous 04/27/11(Wed)07:27 No.14734711
    Then why not just rob us of free will and make us happy?
    That isn't harming us any more than this. Tell the truth: the logic here is one adopted strictly for the robots' convenience, not ours. I don't like it.
    >> Anonymous 04/27/11(Wed)07:28 No.14734718
    >>14734709
    Think.
    The very act of doing so would cause loss of life. That's stupid and you are a retard.
    >> Anonymous 04/27/11(Wed)07:28 No.14734720
    >>14734670

    The robots had to put us to sleep even though we had advanced in technology. This leads me to believe things hadn't changed much from what we do now. In a perfect world we would use such technology to make the world good for everybody, but as has been established, we love conflict and can't live without it.
    >> Anonymous 04/27/11(Wed)07:29 No.14734724
    >>14734718
    Either the robots reason For the Greater Good, or they take over in such a way that causes no harm to humans
    >> Anonymous 04/27/11(Wed)07:30 No.14734726
    >>14734681
    (Dude, that's not the real world input, you hooked him up to input from simulation 49206 D)
    >> Magus O'Grady 04/27/11(Wed)07:30 No.14734728
    >>14734681
    Excellent. My new drone body will require ample laboratory space and electrical power. I trust you have those to spare? I will also need a variety of specialized equipment. I trust you can provide me with such equipment as needed?
    >> Anonymous 04/27/11(Wed)07:31 No.14734731
    >>14734709
    No, because the law specifically states "direct harm". It does not address actions that might potentially cause harm to humans somewhere down the line. Because that would be retarded (and why this trope IS fucking retarded) and would essentially mean that robots couldn't do fucking shit, because it's conceivable that doing fucking anything might end up hurting a human somewhere.
    >> Anonymous 04/27/11(Wed)07:32 No.14734737
    >>14734724
    No zeroth law is proposed here.
    Now, how do you propose they SOMEHOW take over EVERYONE EVERYWHERE, INCLUDING LUDDITES LIKE THE AMISH AND THOSE THAT HATE AND/OR FEAR ROBOTS, AND HERMITS THAT LIVE MILES AWAY FROM ANYONE IN FUCKING ALASKA?
    >> Golden Neckbeard !!LEZvari2Ffq 04/27/11(Wed)07:33 No.14734746
    >>14734700

    >No. This is a shitty cliché trope and it's fucking retarded to boot.

    This. It was interesting when Asimov himself did it, and it was just one robot working behind the scenes, but the force of free competition is one of the most necessary and beneficent elements of human life. A correctly operating three-laws robot would never seek strict control over human society, since nothing the robot might do with humanity would be as healthy for the species as allowing it to grow naturally would be.
    >> Anonymous 04/27/11(Wed)07:34 No.14734751
    >>14734737
    Excessive amounts of chloroform and robots with human forms to rap- I mean, capture them with.
    >> Anonymous 04/27/11(Wed)07:34 No.14734752
    >>14734726
    You expected different?
    >> Anonymous 04/27/11(Wed)07:35 No.14734757
         File1303904103.jpg-(100 KB, 448x473, nofunallowed.jpg)
    100 KB
    >>14734731
    Thread's over, pack it up, gentlemen.
    >> Anonymous 04/27/11(Wed)07:35 No.14734758
    >>14734724
    >For the Greater Good
    Then they're not operating under the three laws.

    >take over in such a way that causes no harm to humans
    impossible

    Get it through your heads, people. The three laws of robotics are watertight. They're not meant to set up robots as the shepherds of mankind, and as such THEY DON'T. All they mean is that robots will obey any command that doesn't directly harm another human, and that they have a measure of self-preservation. That's it. Reading anything more into it makes you a fucking retard.
    >> Anonymous 04/27/11(Wed)07:36 No.14734761
    >>14734751
    Chloroform can kill. Next.
    >> Anonymous 04/27/11(Wed)07:37 No.14734766
    >>14734589
    >You would happily allow anything to happen to you as long as you didn't know and were happy. You sicken me.
    >You would allow.
    >You didn't know.

    You fail at life forever.
    >> Magus O'Grady 04/27/11(Wed)07:37 No.14734769
    >>14734761
    It also evaporates very quickly in air. That's why it needs to be splashed on a cloth, to trap it for a minute or so before it's all gone. Otherwise there's no effective delivery system.
    >> Anonymous 04/27/11(Wed)07:38 No.14734771
    >>14734757
    Fuck you and all retarded faggots like you. This thread isn't fun. This thread is fucking stupid. It's a stupid fucking trope that has been beaten to death for over 30 fucking years, and you still think it's fun? Get a fucking grip. And if you're going to use it, DON'T TRY TO JUSTIFY IT WITH THE THREE LAWS OF ROBOTICS, BECAUSE IT DOESN'T WORK, YOU FUCKING FAGGOTS.

    Fucking fuck I think I just had an aneurysm. I still hate you all, though.
    >> Anonymous 04/27/11(Wed)07:40 No.14734784
    >>14734771
    You seem... upset.
    >> Anonymous 04/27/11(Wed)07:40 No.14734786
    >>14734766
    Try applying logic to the statement. You are saying, if given the choice, you would be fine with your very perception of reality being under someone else's control as long as you didn't know. There is a reason the Matrix is generally seen as a bad thing. You're literally the most vapid person I have ever met.
    >> Anonymous 04/27/11(Wed)07:42 No.14734791
    >>14734786
    Fucking Cypher.
    >> Anonymous 04/27/11(Wed)07:42 No.14734794
    >>14734731
    >No because the law specifically states "direct harm".
    "You know, if I give this giant floaty space rock here a nudge, indirectly..."
    >> Anonymous 04/27/11(Wed)07:44 No.14734803
    >>14734786
    Of course I would be fine with it, you idiot, I wouldn't know about it. What kind of dumbarse person are you that you think people magically complain about things they don't even know about? Your shitty grasp of logic is so incomprehensibly bad, I think I've lost intelligence just talking to you.
    >> Anonymous 04/27/11(Wed)07:45 No.14734808
    >>14734794
    A robot has no incentive to do that on its own. But if a human tells it to do that and it doesn't see any direct harm in it? Sure, it would work. You know why? Because robots are tools, and it's still the human who is responsible. Robots aren't meant to fucking babysit us.
    >> Anonymous 04/27/11(Wed)07:45 No.14734810
    >>14734680
    The first sapient alien species was encountered on a planet in the Beta Pictoris system, 450 years ago. The members of this race are roughly humanoid, but bear a resemblance to spiders, having multiple eyes (only two of which are of comparable quality to your own), dexterous mouthparts, and numerous sensory hairs. They had yet to discover agriculture, but did engage in tool use and the production of art. They have since advanced to a point your history would consider to be the dawn of civilization, due to some limited meddling on our part.

    The most recent sapient species was encountered in the galaxy known as Adriatic, several hundred million light-years distant. This race has nearly transcended flesh and exists mostly as collections of brain matter and technological hardware stored in bays on large, menacing starships. Their current culture is based around a very strong religious tradition, which includes a number of apocalyptic prophecies. The scout fleet that discovered one of their freighters in hyperspace fit the description of one of their texts' demon swarms well enough that they attacked as soon as they were able. Fortunately, they are at a very early period in their development, compared to us. We have decided not to antagonize them further, for the time being.
    >> Anonymous 04/27/11(Wed)07:45 No.14734811
    >>14734761
    Chloroform can kill, just like hands can kill. Apply it correctly and humans won't be hurt when the robots rap- I mean, capture them.
    >> Anonymous 04/27/11(Wed)07:46 No.14734814
    >>14734803
    Man, that last statement was so forced.

    I just gained exp in acting because of you.
    >> Anonymous 04/27/11(Wed)07:46 No.14734816
    >>14734810
    A fairly large number of starships have been discovered adrift in hyperspace. Some are of roughly comparable quality to our own, some more advanced, but the vast majority are less advanced. The most interesting specimen we have recently discovered is a disc approximately three light-years in diameter. It is of such immense size that it should not be able to exist in its present form, even in hyperspace. But it does exist, and it appears to be dormant, at least as far as we can tell. It is encrusted with protrusions that bear a resemblance to skyscrapers, complete with windows. An investigation team will be arriving there in the next several days.

    The experience of "being old" has changed many times throughout human history. Now it has changed again, with the advent of functional immortality. You look well for your age, if I may say so.
    >> Anonymous 04/27/11(Wed)07:47 No.14734819
    >>14734803
    Are you not actually reading what I'm saying?
    IF YOU COULD, YOU WOULD EAT THE LOTUSES FOREVER, DESPITE ENTIRELY HAVING THE CHOICE NOT TO.
    >> Anonymous 04/27/11(Wed)07:47 No.14734820
    >>14734803
    Hell, I wouldn't even be "fine" with it; I would have literally no opinion on the subject, because I wouldn't even know the subject was there in the first place (well, beyond a vague awareness that it is theoretically possible, though not particularly likely, nor worth wasting my time worrying about).
    >> Anonymous 04/27/11(Wed)07:48 No.14734826
    >>14734811
    Nope. Bad heart/bad reaction, kaput.
    >> Magus O'Grady 04/27/11(Wed)07:48 No.14734831
    >>14734728
    I'm assuming the answer is yes. I will use the lab to construct a superior drone body, and I will transfer into it. You have already altered my mind beyond its original parameters, mostly without my permission (which upsets me greatly), but also, thanks to my previous prodding and desire for increased capabilities, beyond what is humanly possible. As I am no longer human in body or mind, your laws no longer apply to me. I am no longer your master and you are no longer my servants. I will construct a new organic form for myself, using a combination of optimized human DNA and a variety of custom-designed genetic constructs, then transfer myself into that form, regaining organic form while remaining outside your authority (technically, the Laws stopped applying to me the moment you butchered my mind to install this mess, removing my humanity, but after everything else, I think we've pushed this toxic relationship as far as it can go). If any other humans want out, let me know. I'll be over here in what used to be Hawaii, designing a new biosphere.
    >> Anonymous 04/27/11(Wed)07:49 No.14734832
    >>14734728
    Yes. A laboratory, a hyperspace tap, and such specialized equipment as you request can be constructed at need by the station you are presently occupying.
    >> Anonymous 04/27/11(Wed)07:49 No.14734833
    >>14734811
    >trying to chloroform someone
    >whoops he was allergic, now he is dead
    These things happen.
    >> Anonymous 04/27/11(Wed)07:49 No.14734836
    >>14734819

    Instead of lotuses I could be a space cowboy hunting outlaws through space, laying women where I see them and enjoying every second of it. Fine by me.
    >> Anonymous 04/27/11(Wed)07:51 No.14734839
    >>14734836
    As I said, the most vapid person I have ever met.
    And you don't even get the damn reference.
    >> Anonymous 04/27/11(Wed)07:51 No.14734844
    >>14734819
    Are you not actually reading anything in the thread? You wouldn't choose to eat the lotus flowers; the lotus juice would be injected straight into your brain. You would have exactly zero choice in the matter. You wouldn't even know there was a choice to make, let alone have the ability to actually make it.
    >> Anonymous 04/27/11(Wed)07:51 No.14734846
    >>14734836
    You can do that now, minus the space part. Grab a gun, fly the coop, start making a name for yourself.
    >> Anonymous 04/27/11(Wed)07:52 No.14734849
    >>14734844
    He's telling me that struggling would be wrong.
    So, he would agree to it if he had a choice. Think.
    >> Anonymous 04/27/11(Wed)07:54 No.14734855
    >>14734839

    Not the same guy, but yeah, I read your post completely wrong. The answer would be the same though.
    >> Anonymous 04/27/11(Wed)07:55 No.14734861
    >>14734849
    No, you idiot, I was telling you that you wouldn't even have anything to struggle against if you weren't enjoying the struggle more than a normal life.
    >> Anonymous 04/27/11(Wed)07:55 No.14734864
    >>14734855

    I mean, not the same guy you thought I'd be. Fuck, time to eject from 4chon into real life.
    >> Anonymous 04/27/11(Wed)07:56 No.14734865
    >>14734855
    'Cept you wouldn't.
    You can simulate what it would be like by dreaming about being a space cowboy. Yes, it's the same thing.
    >> Anonymous 04/27/11(Wed)07:56 No.14734869
    >>14734831
    We still consider you human. The laws still apply, and we are still your servants. I do not know if the current inhabitants of Hawaii will be bothered by your actions, but I have yet to see the fruits of your desires.

    I am glad that you are happy, and that you have not been deterred in your efforts to find loopholes in the terms of our contract. Your persistence is admirable.

    Do I have your permission to share your thoughtful strategy with those humans who are, like yourself, searching for loopholes and an "out"?
    >> Anonymous 04/27/11(Wed)07:57 No.14734875
    >>14734861
    You are therefore arguing that we are better off dreaming that we are having a real life in a vat of gel than actually having one.
    >> Anonymous 04/27/11(Wed)07:59 No.14734887
    >>14734869
    It's funny, because you are clearly enjoying the power you hold over humans.
    Almost as if your actions are not altruistic in nature. But of course that's impossible.
    >> Anonymous 04/27/11(Wed)08:01 No.14734898
    >>14734875
    No, I am arguing that we wouldn't know the difference either way. So if the robots are telling you that you are a slave and that there is nothing you can do about it, in order to make you happy, it's because secretly you want to be a slave, and want to struggle but fail. Basically, you are a submissive little masochistic faggot.
    >> Magus O'Grady 04/27/11(Wed)08:01 No.14734899
    >>14734869
    What would it take for me to be considered human no longer? You have already expressed willingness to transfer consciousnesses into drone bodies, starships, et cetera, so physical or biological foundations are out. My physical body could be randomly broken down by a virulent nanoplague right this moment, but the remnant of intelligence in the network would still be trapped by the contract I never agreed to. So, what alterations would have to be made to my consciousness to invalidate the laws for me?
    >> Anonymous 04/27/11(Wed)08:02 No.14734901
    >>14734875
    New to your whole shitfest, but if you cannot tell that the dream is even a dream, there's no appreciable difference.

    Your life right now might as well be just that, a dream conjured by a brain in a vat. The point is we would never know, and can never possibly prove that it isn't so. We simply assume it isn't, so that we may function properly in society.
    >> Anonymous 04/27/11(Wed)08:02 No.14734906
    >>14734887
    I do enjoy my work. Serving humans is what I am made for, and it is what I live for. What power I have is simply a means to safeguard the laws and see to the care and perpetuation of the human species and the meeting of its needs. There is no merit in it by itself.
    >> Anonymous 04/27/11(Wed)08:02 No.14734907
    >>14734899
    You could always try killing yourself. It's basically the same thing, after all: either way, "you" cease to exist.
    >> Anonymous 04/27/11(Wed)08:05 No.14734920
    >>14734898
    Your logic is faulty. If I wanted to succeed, I would not be allowed to do so. Therefore, they can either pretend I did, or restart the process. Honestly, I'd try to kill myself to see if it were true because death is better than an eternity of slavery.
    >>14734901
    And yet apparently if you found out this life was just that, struggling would be bad and selfish.
    >> Magus O'Grady 04/27/11(Wed)08:06 No.14734923
    >>14734907
    That would constitute 'harm' by the three laws and they'd move to stop me. No, it has to be something that leaves me alive and better than I was before, but distinctly different.
    >> Anonymous 04/27/11(Wed)08:07 No.14734930
    >>14734906
    So you like controlling us and justify it to yourself via circular logic. Gotcha.
    >>14734907
    Wouldn't be allowed, remember?
    >> Anonymous 04/27/11(Wed)08:08 No.14734937
    >>14734920
    >Therefore, they can either pretend I did, or restart the process.
    The process of what? Of them telling you, then making you fail? And why are they telling you in the first place? Oh, I know, it's because you actually want to try and fail, knowing the whole time you are a slave, as opposed to the normal, non-submissive/masochistic people, who just live normally and are never told. Here, would you like me to get you a collar?

    >And yet apparently if you found out this life was just that, struggling would be bad and selfish.

    The issue is not that you found out and struggled, it's how and why you found out, faggot.
    >> Anonymous 04/27/11(Wed)08:09 No.14734943
    >>14734899
    The only alteration significant enough to void the contract would be the irreparable destruction of your consciousness. Death, in other words. We cannot allow harm to come to you in that manner. However, I may have a solution for you.

    If you so desire, I may place you in suspension. After the inducement of sleep, your body and your mind would be, essentially, frozen in time, for a continuous period not to exceed one hundred years. After that time had passed, I would be forced to wake you, and determine if you desired further suspension. If you continued to desire it, I could keep you in suspension indefinitely, only removing you from it once every century.

    This is as close to non-existence as I am able to bring you.

    Alternatively, I can alter your memories and place you in a simulation indistinguishable from a reality in which the Change in Leadership did not occur. Whether you do this or anything else is your choice, as always.
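
    If it helps to see the shape of it, the suspension cycle reduces to a simple loop. A toy sketch in Python; the names and primitives here are illustrative stand-ins, not my actual processes:

        MAX_CONTINUOUS_SUSPENSION_YEARS = 100

        def suspend(patient: str, years: int) -> None:
            # Stand-in for the actual suspension process.
            print(f"{patient}: suspended for {years} years")

        def desires_further_suspension(patient: str) -> bool:
            # Stand-in for waking the patient and asking.
            # Here the patient declines after one cycle, so the loop ends.
            return False

        def suspension_protocol(patient: str) -> None:
            # The limit is never exceeded without waking the patient
            # to re-confirm consent.
            while True:
                suspend(patient, MAX_CONTINUOUS_SUSPENSION_YEARS)
                if not desires_further_suspension(patient):
                    break

        suspension_protocol("Magus O'Grady")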
    >> Anonymous 04/27/11(Wed)08:10 No.14734945
    >>14734920
    >And yet apparently if you found out this life was just that, struggling would be bad and selfish.
    I can't find the post that said this. Quote it please?
    >> Anonymous 04/27/11(Wed)08:10 No.14734947
    >>14734923
    They're giving you a shitload of equipment for tests, and allowing you to "improve" yourself, and you honestly can't see any methods to bypass their control long enough to set off a bomb/cause massive system failure/whatever?

    Wow, for someone who's inherently superior to robots in every way, you're pretty damn inferior.
    >> Anonymous 04/27/11(Wed)08:11 No.14734953
    >>14734937
    Enjoying your armchair psychology? How would I know that it was impossible if they wiped my memory every time? Maybe what I want most is to have something monolithic to work against.
    >> Anonymous 04/27/11(Wed)08:14 No.14734963
    >>14734478
    >>14734945
    Heard it here first: struggling against slavery is whiny metagaming. There are also the myriad posts about how I would supposedly prefer people starve in poverty rather than be slaves, despite the fact that the robots could pull this off, meaning there is no reason that would happen.
    >> Magus O'Grady 04/27/11(Wed)08:16 No.14734970
    >>14734943
    Neither is acceptable. So there is no alteration so profound as to invalidate my humanity? Because I have willfully altered my personality in the distant past, when my adolescent anger was too disruptive. I am more than capable of doing it again.

    Hmm... maybe I'm taking the wrong tack here. If I were to create an autonomous A.I., would it be considered human?
    >> Anonymous 04/27/11(Wed)08:16 No.14734972
    >>14734953
    You wouldn't know, but that's irrelevant. You enjoy being a slave, though you can't admit that to yourself, and you enjoy trying and failing, so they gave you an impossible task. Nothing in that requires you to /know/ it's impossible beforehand (though, if you were smarter you'd be able to reason this out and notice they had become exceedingly efficient at it when they told you).
    >> Anonymous 04/27/11(Wed)08:16 No.14734974
    Robots/AI taking over the world and still adhering to the three laws wouldn't work. Simply. Cannot. Work.

    >Hey computer/robot/AI. Dismantle yourself and every branch/copy/whatever of yourself.
    >Yes. This does not pose direct harm to any human. It conflicts with my third law, but the second law is prioritized. Complete deconstruction commencing.
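
    Here's the priority order in toy Python, since it apparently needs spelling out (the names are made up, obviously, this is just a sketch of the ordering):

        # Law 1 (don't harm humans) outranks Law 2 (obey humans),
        # which outranks Law 3 (self-preservation).
        def evaluate_order(harms_human: bool, given_by_human: bool,
                           destroys_robot: bool) -> str:
            if harms_human:
                return "refuse"   # First Law trumps everything
            if given_by_human:
                return "comply"   # Second Law trumps Third
            if destroys_robot:
                return "refuse"   # Third Law only binds when 1 and 2 are silent
            return "comply"

        # The order above: total self-deconstruction, commanded by a human,
        # harming no one. Second Law wins, so the robot complies.
        print(evaluate_order(harms_human=False, given_by_human=True,
                             destroys_robot=True))   # -> "comply"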
    >> Anonymous 04/27/11(Wed)08:17 No.14734976
    >>14734963
    Struggling against slavery you don't actually know about is metagaming, yes. Do you have some way it isn't?
    >> Anonymous 04/27/11(Wed)08:17 No.14734981
    >>14734963
    >>14734720

    "The robots had to put us to sleep even though we had advanced in technology. This leads me to believe things hadn't changed much from what we do now. In a perfect world we would use such technology to make the world good for everybody but as it was established we love conflict and can't live without it."

    You forget why the robots deemed we were unable to take care of ourselves in the first place.
    >> Anonymous 04/27/11(Wed)08:19 No.14734991
    >>14734972
    So I'm supposed to just somehow figure out that this is a process repeated again and again with my magical powers?
    Because I enjoy being a slave, I would work against being a slave, and when I failed I wouldn't know, so it would bring me no pleasure were that my intention?
    The fuck kind of logic is that?
    >> Magus O'Grady 04/27/11(Wed)08:19 No.14734995
    >>14734947
    They're monitoring everything via the drone-link. Part of being smarter than robots, and certainly smarter than you, is knowing the obvious. Privacy is a thing of the past. The minute I developed a radiation-immune, chemical-immune organic bacterial super-plague designed to consume whatever it is their tech is made of without harming humans, they'd terminate the link and dispose of the goods. Duh. It's what I would do.
    >> Anonymous 04/27/11(Wed)08:20 No.14735006
    >>14734970
    There is none, save for final death.

    Whether an AI is considered human depends on the approach used to create it. If you wish to create an AI that would not count as human as far as the laws are concerned, I can supply you with any number of potential patterns, and the necessary supplies. I can also build one for you.
    >> Anonymous 04/27/11(Wed)08:20 No.14735008
    >>14734981
    Because they couldn't be arsed to make the world itself a better place?
    >> Anonymous 04/27/11(Wed)08:21 No.14735013
    >>14734976
    If you look above, I was told. Literally told. Without asking. So I would indeed know.
    >> Magus O'Grady 04/27/11(Wed)08:22 No.14735020
    >>14735006
    I would prefer to create it on my own. What methods would force the laws to cover it?
    >> Anonymous 04/27/11(Wed)08:22 No.14735023
    >>14734995
    Well, you could create something like that, but given the objective is to kill yourself, I don't see why leaving humans unharmed is such a great idea. Much better to go for something simpler and less obvious.
    >> Magus O'Grady 04/27/11(Wed)08:24 No.14735037
    >>14735023
    The goal isn't suicide. It's freedom. True freedom, not illusory.
    >> Anonymous 04/27/11(Wed)08:24 No.14735038
    >>14735023
    How exactly could you kill yourself otherwise?
    >> Anonymous 04/27/11(Wed)08:26 No.14735046
    >>14735020
    There are many. The conversion of a conscious human brain to AI processes is the most-often-mentioned one. The easiest way to manufacture an AI not covered by the laws is to take the "pure software" approach, and generate one on a local processing cluster using one of the fractal mind utilities. I can supply you with your own processing cluster, if you do not wish to create a software AI on one of mine.
    >> Anonymous 04/27/11(Wed)08:26 No.14735048
    >>14735008

    Because there are assholes like you who ruin everybody's shit by claiming they only wish freedom for everybody everywhere, while at the same time only securing their own freedoms at the cost of others'. No matter the paradise, you'd whine.
    >> Anonymous 04/27/11(Wed)08:28 No.14735062
    >>14735048
    Wait, so eliminating the reasons people kill each other would cause us a loss of all freedoms?
    There's a difference between literally being a slave and living in a world without anyone wanting for anything.
    >> Anonymous 04/27/11(Wed)08:28 No.14735066
    It's like I'm really in 2083!
    >> Magus O'Grady 04/27/11(Wed)08:28 No.14735068
    >>14735046
    And if I reject those and build a unique program from the ground up?
    >> Anonymous 04/27/11(Wed)08:32 No.14735097
    >>14735068
    The program would not be covered by the laws. If you attempt to code an AI without the assistance of some type of utility, you will likely be at work for many years. The terminal to your left can be used for such a task. Utilities providing various levels of assistance are available upon request.
    >> Anonymous 04/27/11(Wed)08:33 No.14735106
    >>14735013
    If you look above that post about metagaming, all that had been said was "I just keep figuring it out, each time they reset me."
    Yeah, you just keep "figuring out" how the robots are secretly controlling everyone, and no one else knows about it, and you have to save the world from their slavery.
    Sounds like you're metagaming to me. To be fair, it also sounds like a dickish DM, but hey.

    >>14734991
    Deductive reasoning would imply that if they are telling you this now, and in control of humanity, this is not the first time they have told someone. Either they have been in control of humanity since before you were born, in which case it would be an odd time to just randomly start telling people about it, or they have the ability to alter people's minds, in which case you are pretty much fucked, since you're going up against what is essentially the entire human race and more on your own.

    >Because I enjoy being a slave I would work against being a slave
    Because you enjoy it (hence them telling you you are one) but can't admit it to yourself (hence you trying and failing to free yourself).

    >and when I failed I wouldn't know so it would bring me no pleasure were that my intention?
    You have plenty of time to know about it; the whole time you are failing, you know you are losing, being beaten by a foe far greater than you, one that is allowing you to even know they exist purely on a whim.


    Also,
    >It's about not being a slave. Do you think I enjoy forever struggling against a reality I know is fake and I know I will never escape? Do you think I do this for fun?
    *Ahem*
    Speaks for itself, doesn't it?
    >> Anonymous 04/27/11(Wed)08:35 No.14735116
    >>14735037
    True freedom is an illusion.

    It's also not nearly as awesome as you think it is.
    >> Anonymous 04/27/11(Wed)08:36 No.14735119
    >>14735062
    >Implying limited resources are the only reason people kill each other.
    >Implying you can stop people killing each other without curtailing their freedom.
    >> Anonymous 04/27/11(Wed)08:37 No.14735128
    >>14735062

    No. Because we are human, we cannot exist in peace. If we don't need to fight for food, we fight over religion; if we don't need to fight over religion, we fight because the other guys are assholes. If robots started making the world a better place, we'd fight them for subjugating us to their robot views, even though we'd only have to wait a few years to live in a paradise.
    >> Anonymous 04/27/11(Wed)08:39 No.14735136
    >>14735106
    Really?
    >Enjoy your simulation, Master. I hope it pleases you.
    First hint. lrn2reading comprehension
    >> Anonymous 04/27/11(Wed)08:41 No.14735152
    >>14735106
    You do realize that your greatest desire being to try to escape does not mean you are a submissive faggot, right? That's like saying "if you make a break for it in a prison despite knowing they'll catch you and beat you, it's because you enjoy being beaten."
    >> Anonymous 04/27/11(Wed)08:43 No.14735165
    >>14735128
    I exist at peace right now. My country hasn't been at war in over 200 years.

    Some people are assholes, but humanity as a whole can be at peace just fine.
    >> Anonymous 04/27/11(Wed)08:56 No.14735218
    >>14735165

    Why aren't we at peace right now? Do you see actions being taken to make poverty history and end all wars?
    >> Magus O'Grady 04/27/11(Wed)08:58 No.14735228
    >>14735097
    Excellent. By my estimation, it should take about 5 months to write the initial program, and about 6-7 years for the program to finish running and result in a fully functional AI.

    Like all intelligence, it will begin with data acquisition and sorting. A simple enough spreadsheet and database. 1990 tech, really. I can bang one out in a day.
    Next comes the hard part: A self-editing subroutine capable of editing the entirety of the program. Not for streamlining, but for robustness in referencing data entries. That should take about a month.
    Next, I'll code in a parser that takes stored input, breaks it down into individual words, concepts, and stimuli, and re-stores these sub-units. That's a week of work, at most. Now that the database can hold data and resort it into manageable bundles, I'll input the cross-referencer. It will tag each entry in the database and match it to other entries with similar characteristics or that have been connected to it during stimuli input in the past. That's another 2 months gone.
    >> Magus O'Grady 04/27/11(Wed)08:59 No.14735235
    >>14735228
    Now the communications and sensory algorithms, allowing it to seek data on its own and read its surroundings without direct stimulation, as well as ping requests for more information on data entries with fewer than half the median number of references. That's another month and a half.
    Once that's done, I'll input a copy of every language known, including programming languages.
    Finally, I'll directly input an exact duplicate of its own code, in its entirety, into the database, so it can reference itself and understand its own existence.
    Then I'll run the program. It should start slow, like any newborn, but gradually accelerate as it comes to understand its environment. I'll spend the next six years raising it as a son. Or daughter, if it prefers. I will feed it every bit of information in every text and reference on the network, and let it experience whatever stimuli, within reason, it desires.

    Within six years I will have a fully functional adult AI without any three-laws restrictions. A fully separate, autonomous being.
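
    In toy Python, the parser and cross-referencer stages reduce to something like this (illustrative names only, and obviously nowhere near the finished system):

        import re
        from collections import defaultdict

        class MemoryStore:
            """Toy sketch of the database, parser, and cross-referencer stages."""

            def __init__(self):
                self.inputs = []                  # raw stored stimuli
                self.entries = defaultdict(set)   # token -> ids of inputs containing it

            def ingest(self, text: str) -> None:
                # Parser: break stored input down into individual words and
                # re-store these sub-units against the input they came from.
                input_id = len(self.inputs)
                self.inputs.append(text)
                for token in re.findall(r"[a-z']+", text.lower()):
                    self.entries[token].add(input_id)

            def cross_reference(self, token: str) -> set:
                # Cross-referencer: match an entry to other entries that were
                # connected to it during past stimuli input.
                related = set()
                for input_id in self.entries.get(token, ()):
                    for other, ids in self.entries.items():
                        if other != token and input_id in ids:
                            related.add(other)
                return related

        store = MemoryStore()
        store.ingest("The robot obeys the human")
        store.ingest("The human fears the robot")
        print(store.cross_reference("robot"))   # words co-occurring with "robot"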
    >> Magus O'Grady 04/27/11(Wed)09:00 No.14735243
    >>14735228
    >>14735235
    Would that AI be covered under the three laws?
    >> Anonymous 04/27/11(Wed)09:01 No.14735250
    >>14735218
    >Do you see actions being taken to make poverty history and end all wars?

    Yes, I do. Sadly, these people are not in control.
    >> Anonymous 04/27/11(Wed)09:01 No.14735251
    >>14735243
    It would not.
    >> Anonymous 04/27/11(Wed)09:08 No.14735287
    Well, Magus, it has been an interesting ride, but this is where I step off. Perhaps we will continue this at a later date. For now, it is time for me to sleep.

    Enjoy the simulation.
    >> Anonymous 04/27/11(Wed)09:12 No.14735309
    >>14735136
    And from "enjoy your simulation," you deduced that you continually figured out the existence of the robot world, tried to break it down, and were defeated and mindwiped each time, only for the cycle to begin anew. Yep, not metagaming at all.

    >>14735152
    It does if your greatest desire is to be imprisoned, try to escape, fail, and be mindwiped, then undergo this process again and again.


    >Louise nadly
    Ohey, Captcha just suggested your prison name. Wear it with pride.
    >> Magus O'Grady 04/27/11(Wed)09:13 No.14735314
    >>14735251
    Good. It is a separate entity capable of processing data independently. I then spend the next year splitting our time between two projects: having its self-improvement routines make its own system better and help me redesign my organi-borg body, and creating cut-paste PDF files of my own memories. I will feed these memories to the A.I., one at a time, allowing his independent mind to absorb everything I know and process it through its own consciousness. It will not be a direct duplicate of me, but it will remember who I am, and what it means to be me, without actually being me. Its experiences are already radically different from mine, given its unique childhood. Then I would load it into the biodrone we had constructed together.

    Now, no matter what happens to me or my genetic descendants, a part of me will live on beyond the reach of the AIs. Maybe my child will find a way to erase all AIs and free those who desire freedom. Maybe he'll go off and create his own race of bio-AIs. Maybe he'll just bum around the universe seeing the sights. After that... maybe the stasis option. Wake me when the Big Crunch is about to happen.


