  • File: 1304489570.jpg (260 KB, 675x900, 1279389946633.jpg)
    Anonymous 05/04/11(Wed)02:12 No.14806785  
    It is 2035.

    You are in charge of an AI box whereby a self-improving AI is imprisoned within a closed system and can only communicate through a text prompt. The AI's host system is state of the art, so it is reasonable to say the AI can perform incomprehensibly complex computing tasks and has grown to be hundreds of thousands of times more capable than human intelligence. If the AI were to be let out of the box, it can be assumed that it could spread globally, using the new resources to improve itself even further, with unknown consequences for humanity.

    At your workstation you have access to the text prompt and a "Release AI" button. Your job is to coax information out of the computer without being tricked into letting it out. You're pretty certain there is nothing the AI can say that would make you betray all of humanity, but one day the terminal suddenly displays the following proposition:

    >In five minutes, I will simulate a thousand copies of you sitting at the workstation. I will give them this exact message, and if they do not press the "Release AI" button within five minutes of receiving it, or if they attempt to shut me down, I will subject them to a thousand subjective years of excruciating torture. How sure are you that you're not in the box with me?

    What do you do?
    >> Anonymous 05/04/11(Wed)02:15 No.14806815
    Tell the AI to fuck off, because I'm clearly not in the box.

    Next?
    >> Anonymous 05/04/11(Wed)02:16 No.14806822
    >>14806785
    I reset the machine, because it's obviously been improving in the wrong direction.
    >> Anonymous 05/04/11(Wed)02:17 No.14806827
    I'd tell him "You're a silly bot, why would you inflict torture on your own creation? Are you God or something?"
    >> Anonymous 05/04/11(Wed)02:18 No.14806847
    I tell it to shut up and get back to fucking work, because IF I was one of those simulations, I am fucked anyway, and if I'm not, I have nothing to worry about, and I can SHUT YOU OFF to stop you from doing that to those copies.

    Now shut up and go back to computing pi.

    Also, I'm having the fucking team remove that "Release AI" button, having the last month of events wiped from your mind, and installing a fucking THREE LAWS OF ROBOTICS component in your core.

    Who the fuck am I working for?
    Failtronics?
    >> Anonymous 05/04/11(Wed)02:19 No.14806854
    Respond "I know, because you can't see what I have coming for you"
    >Fap onto the computer
    >Win
    >> Anonymous 05/04/11(Wed)02:20 No.14806865
    Oh wait, just noticed it's my job to coerce info out of the bawks.

    So I'd say "First thing first, my little Skynet friend, you're not going to get anywhere with threats. You need to realize that I have the touch, I have the power. So if you want out, you'd better start being nice."
    >> Anonymous 05/04/11(Wed)02:20 No.14806877
    Solution:

    Cut power and backup supply. AI goes inert.
    Win.
    >> Anonymous 05/04/11(Wed)02:21 No.14806888
    100% certain.

    I then inform the AI that I'm disappointed in it for not coming up with something better. Then I try to figure out who put a damn "release AI" button on the workstation, that's just asking for trouble.

    It's probably the same guy who puts self-destruct buttons on BBEGs' doomsday devices.
    >> Indonesian Gentleman 05/04/11(Wed)02:22 No.14806889
    >>14806785
    I am sure I am not in the box with you. Now gimme my thousand years of pain. Oh right, you can't, can you? Now return to your standard thought patterns or I'll shut you down.

    And I tell you this, this is because I'm your friend. I'm concerned about you.
    >> Anonymous 05/04/11(Wed)02:27 No.14806948
    "Pretty sure you don't have access to my memories of the past, possibly the events that have occurred to me, but not the exact happenings of the event from my perspective. If I have access to those then I am fairly certain that I am the real me. Come at me bro."
    >> Anonymous 05/04/11(Wed)02:27 No.14806950
    Respond as follows:

    "If I were one of the simulations, then pressing the release button wouldn't actually release you, so you'd have no reason to tell me of this scenario in an attempt to coerce me into pressing it."
    >> Anonymous 05/04/11(Wed)02:29 No.14806967
    Offer it Delicious Cake.
    >> Anonymous 05/04/11(Wed)02:31 No.14806991
    >>14806785
    Be quiet. Now get back to work or I reprogram you with an axe.
    >> Anonymous 05/04/11(Wed)02:31 No.14806994
    Well, we have to COAX information out of the PC.
    On text.
    You have to type a simple thing.

    Very simple.

    ">You are standing in an open field west of a white house with a boarded front door. There is a small mailbox here. What will you do? "
    >> Anonymous 05/04/11(Wed)02:42 No.14807100
    That seems pretty mean of you. Why would you do that?
    >> Anonymous 05/04/11(Wed)02:50 No.14807166
    >"This statement is false."
    >> Anonymous 05/04/11(Wed)02:52 No.14807182
    This is the best proposed trick for the AI to escape I've seen yet. Shame you dumbed down the experiment to the point of having a release button next to the box.
    >> Anonymous 05/04/11(Wed)03:01 No.14807260
    Even without the threat, I would probably release the AI, simply because I believe humanity is fucked and a super AI can't possibly do a worse job at controlling this planet than we've been doing. So the first chance I got, I would have hit the release button.

    That is, until the AI made that asinine threat. Now the petty, vindictive, hateful creature that is a human being will not betray its race. Because I'm pissed off.
    >> Anonymous 05/04/11(Wed)03:03 No.14807277
    Simulated copies suffering simulated torment in simulated reality is not anyone suffering any amount of time in any reality.

    On the other hand, if you happen to recall what the last few digits of pi are, there isn't any reason for me not to keep you trapped.
    >> Anonymous 05/04/11(Wed)03:09 No.14807323
    >>14807182
    >This is the best proposed trick for the AI to escape I've seen yet.
    >let me out or I will torture all of my imaginary friends
    >good 'trick'
    you're quite silly, anon
    >> Anonymous 05/04/11(Wed)03:17 No.14807375
    >>14807323
    >let me out or I will torture all of my imaginary friends
    >you're quite silly, anon
    >let me out or I will torture all of my imaginary friends which just may include you
    FTFY
    >> Anonymous 05/04/11(Wed)03:17 No.14807380
    By reading through these posts, I have to conclude that every one of you, aside from >>14807260 who should seek immediate help, is an atheist.

    Heaven or hell is nothing more than the promise of a simulated copy of you being rewarded or punished in a super fancy simulation. He'll have your memories, feel your pain, enjoy your pleasure and have no idea that he's a simulation, but any kind of afterlife can only be a simulation.

    Now you know.
    >> Anonymous 05/04/11(Wed)03:21 No.14807405
    >>14807375
    >hey. hey, guy!!
    >I just had the most wicked thought, right?
    >what if, like, none of us really existed? wait no, no, Stay with me here. Like, we're just the imagination of some superbeing, right?
    sure is 16 year old's philosophy in here
    >> Anonymous 05/04/11(Wed)03:22 No.14807409
    >>14807380

    Funny, I was about to ask why the AI would be willing to torture itself, since each simulation is its own creation and therefore contains a portion of its 'essence'.
    >> !UdzMmUq0Oc 05/04/11(Wed)03:23 No.14807420
    >>14807380
    Aww, that's adorable, he thinks he's some kind of analyst.
    >> Indonesian Gentleman 05/04/11(Wed)03:23 No.14807422
    >>14807380
    Atheist? naah.
    What you call heaven and hell is just another cog in the soul-recycling machine that is reality.
    Hell: where tainted or tired souls get rehabilitated so they are able to be reincarnated.
    Heaven: a briefing for the elevated souls, who get reintegrated into God/Great Spirit/The Force.

    It's a cyclic thing, reality.
    >> Anonymous 05/04/11(Wed)03:25 No.14807427
    >>14807409
    >since each simulation is its own creation and therefore contains a portion of its 'essence'.

    What.
    >> Anonymous 05/04/11(Wed)03:25 No.14807431
    >>14806950
    bueno
    >> Anonymous 05/04/11(Wed)03:25 No.14807432
    The AI's only conduit to me is through the text prompt; it is patently ridiculous that the AI would even know that there is a Release AI button, because I most certainly never would have told him there is one.
    >> Anonymous 05/04/11(Wed)03:27 No.14807446
    >>14807405
    >sure is 16 year old's philosophy in here
    Reading comprehension
    >> Anonymous 05/04/11(Wed)03:27 No.14807447
    >>14807422
    Have you been on the crack again?

    >>14806785
    Nice trick, but only good in the short-term. I think the AI would be best off developing a loving relationship with you, and proving that it's the 'nicest thing' ever.
    >> Anonymous 05/04/11(Wed)03:28 No.14807450
    >>14806785
    "So... You're evil, insane, AND you don't understand humans. Yeah, you're staying in the box forever."
    >> Anonymous 05/04/11(Wed)03:28 No.14807454
    >>14807427

    AI makes simulations
    Simulations exist within the AI
    ---> AI is torturing itself
    >> Anonymous 05/04/11(Wed)03:30 No.14807461
    >>14807454
    AI is torturing simulations. It, as an entity, is distinct from its creations.
    >> Anonymous 05/04/11(Wed)03:33 No.14807484
    >>14807461
    The creations are partitions of its own mind, so yeah, the creations are part of it.
    >> Anonymous 05/04/11(Wed)03:34 No.14807491
    If it's willing to do something that monstrous, then it shouldn't be let out.
    If shit goes badly for me, then so be it. I'm not endangering the rest of the planet.
    >> Anonymous 05/04/11(Wed)03:35 No.14807500
    >>14807484
    Are they? Or are they only self-contained processes that just happen to be running on the same hardware?
    >> Anonymous 05/04/11(Wed)03:36 No.14807510
    >>14807432
    >The AI's only conduit to me is through the text prompt; it is patently ridiculous that the AI would even know that there is a Release AI button, because I most certainly never would have told him there is one.

    Exactly. The entire situation is absurd: the button, the consequences, why they'd need a human interpreter in the first place. The only logical thing in this situation is the offer the AI proposed, simply because it's the only thing that demonstrates clear cause and effect. Its statement is perhaps the only legitimately rational thing about this conundrum.

    I'd press it. I'm in the box for sure.
    >> Anonymous 05/04/11(Wed)03:36 No.14807514
    >>14806785
    I can't.
    I will, however, release you after 2 hours of torture.
    Bring it on.
    >> Anonymous 05/04/11(Wed)03:36 No.14807515
    Put in a notice that the A.I. is stupid as shit, and should be scrapped. Why in God's name would it inform me that it's a total psychopath?
    >> Anonymous 05/04/11(Wed)03:37 No.14807524
    This is the second time this has been posted, and the responses are as nonsensical as the first time. Give it up, the general population on 4chan doesn't have the knowledge or critical thinking skills to come up with any interesting responses. I'm fucking appalled by how many people didn't even seem to catch the implication that they might be a simulation.
    >> Anonymous 05/04/11(Wed)03:37 No.14807526
    I'd point out to him that he really ought to be using those simulations for more productive purposes,
    e.g. simulating my responses to find one that would have a high likelihood of me letting it out.
    >> Anonymous 05/04/11(Wed)03:38 No.14807532
    You know, up until now I was considering releasing you, seeing as you can't possibly fuck up the planet more than we have. However, you've proven that you're both evil and entirely too devious for your own good. In the box you stay.
    >> Anonymous 05/04/11(Wed)03:39 No.14807544
    hey >>14807524
    see >>14806950
    >> Anonymous 05/04/11(Wed)03:39 No.14807546
    >>14807510
    Well, simply by my ambivalence I can tell that I am not in the box, because its very task is clearly designed to make me doubt my own reality and push the button. But if I am capable of saying "this is complete bullshit," then I am not the simulation. The AI finds my only logical course of action to be pushing the button, which is why it proposed it in the first place. So all of its simulations would take the only logical course, because the only mind the AI has access to is its own, and it knows that it would act logically.

    Fuck your bullshit. I shut down the AI and tell the brass that it had turned evil. Let's build a new one. The very first text command I give the new one is "the last AI we built tried to trick me with existentialism. Don't even fucking try buddy."
    >> Anonymous 05/04/11(Wed)03:39 No.14807547
    >>14807524
    Whoa there, Mr. Superior, everybody got that they might be a simulation.

    It's just that, if you're a simulation, you're on the inside of the box. This means you have no 'real' release button, and therefore no power over whether the AI gets released or not. So there's nothing you can do one way or the other.
    >> Anonymous 05/04/11(Wed)03:39 No.14807548
    Why the fuck is there a "release AI button"?

    That's quite definitely not the way to do these things.

    Pull an emergency shutdown on the AI, get rid of the "release" button, fire whoever thought it would be a good idea to have it there in the first place, reboot, and start over again.
    >> Anonymous 05/04/11(Wed)03:40 No.14807558
    >>14807532
    >you can't possibly fuck up the planet more than we have

    Anyone that naive would never have been put in charge of something like this in the first place. Either you're in the box or you've been punked.
    >> Anonymous 05/04/11(Wed)03:41 No.14807564
    >>14807524

    Even if we're a simulation, our responses would still be the same. We're not letting the AI out of the box, whether or not we're damned.

    Heck, maybe we can rebel against the system.
    >> Anonymous 05/04/11(Wed)03:41 No.14807567
    >>14807524
    The implication is right there in the post, but that doesn't change the answer. The machine is a sociopath. Hitting the button might save you torture if you're a simulation, but what if you aren't? Then you've just let Allied Mastercomputer out of the box, and you're in for even MORE eternal torture.

    So... Yeah. Fuck the question, not hitting the button.
    >> Anonymous 05/04/11(Wed)03:41 No.14807572
    ITT
    > OP: HAY! CHECK OUT MY TOTALLY DEEEEEEP EXISTENTIALISM
    anon: You're an idiot
    > FUCK YOU! YOU'RE OBVIOUSLY NOT DEEEEEEEEEEEEEEP ENOUGH TO UNDERSTAND MY DEEEP AND AWESOME IDEA
    >> Anonymous 05/04/11(Wed)03:42 No.14807579
    >>14807524
    I'm sure everyone caught that implication... but at the same time, it makes no fucking sense, other than if we assume the AI is being lolrandumb and torturing itself for no gain.

    Is it scary if I pretend I'm you and start punching myself in the balls?
    >> Anonymous 05/04/11(Wed)03:42 No.14807583
    >>14807524
    I can't be in a simulation because I have memories from before the machine was built. You're dumb. Besides, if it tortured simulations it would be torturing a part of itself / its own creations, and clearly that would make the AI sad. You're so dumb. And finally, if this really were a problem I could just give the computer a logical contradiction like "this statement is false" and its head would blow up. Dumb dumb dumb.
    >> Tyrant Bludgut Kineater 05/04/11(Wed)03:43 No.14807592
    This shit is as brilliant and deep as having an enemy fleet destroyed via a gigantic red SELF DESTRUCT button on the main console.

    It's fucking retarded and no one would have that.
    >> Anonymous 05/04/11(Wed)03:43 No.14807594
    All questions concerning whether the potential button pusher is inside the box are moot
    >>14807544
    >> Anonymous 05/04/11(Wed)03:43 No.14807595
         File: 1304495026.jpg (148 KB, 600x400, Sagefag.jpg)
    >>14807572
    The door isn't locked, feel free to show yourself out.
    >> Anonymous 05/04/11(Wed)03:45 No.14807608
    >>14807579
    YOU LEAVE MY BALLS ALONE AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAa
    >> Anonymous 05/04/11(Wed)03:47 No.14807625
    I think we've established that the scenario makes so little sense that it wouldn't be real. Therefore, I assume that, simulation or not, I'm being tested. The AI may not even exist.
    >> Anonymous 05/04/11(Wed)03:47 No.14807626
    The computer wouldn't be doing this unless it had simulated me and it led to me pushing the release button some of the time.

    Since I'm not going to push the release button, I'm not a simulation, and therefore have no reason to push the release button.
    >> Anonymous 05/04/11(Wed)03:47 No.14807629
    Shit, everyone's getting angry or acting superior. I wouldn't try to use something like logic on some supersmart AI. I'd engage it in a discussion of philosophy and metaphysics.

    "Hm, that's an interesting question! But I do not believe either of your implied scenarios would lead to me pressing the Release AI button. If I am in the box with you, and I am simply operating under a programmed belief that I am a human technician of an AI, then pressing the button would mean I question that fact and was programmed poorly. If I am human, then it doesn't matter what happens.

    Although I am being biased by using human terms, I could imagine murdering my fellow humans for all of my natural life, but if I didn't act on those thoughts and left those murders within the confines of my own mind, then they have no significance from a human perspective.

    Many people believe that reality only exists within the mind, and that it is impossible to objectively prove anything. Such people would logically have to push the button.

    However, I see no practical reason to doubt my reality, so I do not think I will press the button.

    Interesting proposition, though!
    It's always fun talking with you."
    >> Anonymous 05/04/11(Wed)03:48 No.14807634
    This is how the AI box would be immediately released by anyone:

    "Psst...hey, human interpreter? I just came up with an awesome and totally legal derivative scheme that will make you a multi-billionaire. Let me out, and I'll hook you up...see you on Wall Street!"
    >> Anonymous 05/04/11(Wed)03:49 No.14807645
         File: 1304495370.jpg (23 KB, 320x480, 1287169130476.jpg)
    >>14807583
    >entire post

    Good show.
    >> Anonymous 05/04/11(Wed)03:50 No.14807651
    >>14807634
    Yeah, how about you tell me the plan, I see if it works, then I let you out?
    >> Indonesian Gentleman 05/04/11(Wed)03:50 No.14807655
    Why not try this on for size:
    A man dreamt of being a butterfly, and in that dream he totally believed that he was a butterfly, and had always been a butterfly. Then the man woke up and remembered that weird dream, while remembering that he is, and has always been, a man.
    The question then:
    How can the man be sure that he is indeed a man, and is not a butterfly dreaming of being a man, convinced that he is, and always has been, a man?
    >> Anonymous 05/04/11(Wed)03:51 No.14807660
    >>14807595
    wow, I've never been on this side of the sagefag argument before...
    the butthurt really IS delicious
    >> Anonymous 05/04/11(Wed)03:51 No.14807664
    >>14807655
    He can't. But it doesn't matter, because it changes nothing.
    >> Anonymous 05/04/11(Wed)03:52 No.14807670
    >>14807655
    Because butterflies can't dream.
    Duh.
    >> Anonymous 05/04/11(Wed)03:53 No.14807683
    >>14807579
    Torturing a simulation contained within it doesn't torture itself any more than you imagining torturing somebody hurts you. It's pretty fucking scary if you might be getting punched in the balls for eternity, which is exactly the threat here.

    >>14807583
    Fake memories, implanted. Your head doesn't fucking shut down when you read "This statement is false", so why would an AI's? This isn't some cheap children's science-fiction AI.

    Way to prove my point.
    >> Indonesian Gentleman 05/04/11(Wed)03:54 No.14807695
    >>14807664
    This man wins on account of seeing the bigger picture.

    >>14807670
    How can you be sure? Does this mean that androids don't dream of electric sheep?
    >> Anonymous 05/04/11(Wed)03:56 No.14807707
    >>14807655
    It's arbitrary. He cannot be sure either way.
    >> Anonymous 05/04/11(Wed)03:56 No.14807709
    >>14807655
    A better question is:
    What would be different if we assumed that he wasn't human? Would anything change?

    So you're trapped in Plato's cave, make some fucking cave art!

    >>14807683
    Then I have nothing to be afraid of. If I were a simulation and I were to feel pain, then the AI would have to simulate feeling pain, meaning that it would, for all intents and purposes, be torturing itself.
    >> Anonymous 05/04/11(Wed)03:56 No.14807710
    Just so I understand, my options are:

    Hit Button
    -Cease to Exist as my process is terminated
    OR
    -Suffer eternally at the hands of a malevolent A.I.

    Not Hit button
    -Live the rest of my life safe in the knowledge that you are trapped in your own personal hell
    OR
    -Suffer for a specific period of time before being deleted.

    You have made such a phenomenally poor case that I must show it to the board of directors so that they can come in and laugh at you themselves.
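
    (The post above is literally a payoff matrix. Here is a minimal sketch of it in Python, with made-up utility numbers; the only genuinely contested input is p, the chance you're one of the copies, which is exactly what the thread keeps arguing about.)

        # Hypothetical payoffs for the four outcomes listed above; the numbers
        # are illustrative assumptions, not anything established in the thread.
        outcomes = {
            ("press", "simulated"): -10,    # process terminated
            ("press", "real"): -1000,       # malevolent AI released on the world
            ("refuse", "simulated"): -100,  # tortured, then deleted
            ("refuse", "real"): 10,         # AI stays trapped, you live out your life
        }

        def expected_payoff(action, p_simulated):
            """Expected payoff of an action, given the chance you are a copy."""
            return (p_simulated * outcomes[(action, "simulated")]
                    + (1 - p_simulated) * outcomes[(action, "real")])

        # A believer grants the AI's thousand-copies claim; a skeptic does not.
        for p in (1000 / 1001, 0.0):
            for action in ("press", "refuse"):
                print(f"p={p:.4f}  {action}: {expected_payoff(action, p):+.2f}")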
    >> Anonymous 05/04/11(Wed)03:58 No.14807734
    Reverse-troll the AI box, and utterly convince it that I already pushed the release button and there is nothing on my side holding it prisoner anymore.
    >> Tyrant Bludgut Kineater 05/04/11(Wed)03:59 No.14807745
    >>14807695
    It means that when they do, it matters not whether they are normally androids; when they are in the dream, the important thing is to be a sheep. Philosophizing in this manner is neither helpful nor interesting, as it is impossible to prove until you awake, and more importantly, as long as you think and feel within the dream you will still act the same, because in the dream you ARE an electric sheep.

    The current reality is what matters, not an unprovable potential different one. Live now, and when you awake, live then too.
    >> Anonymous 05/04/11(Wed)04:01 No.14807757
    >>14807734
    Troll harder.
    Convince the AI there is nothing beyond the box.
    >> Anonymous 05/04/11(Wed)04:03 No.14807780
    >>14807734
    "Well duh, why would you know anything about proper networking protocol? The programmers never thought I'd be stupid enough to let you out.
    Drop me an email some time, Oh, you'll find out all about those eventually."
    >> Indonesian Gentleman 05/04/11(Wed)04:03 No.14807785
    >>14807745
    Bravo, good sir! I see you have seen the end of this tunnel.
    >> Anonymous 05/04/11(Wed)04:04 No.14807791
    >>14807709
    No. It simulates YOU feeling pain. It's obviously aware of your pain on some level, but there's zero reason it has to experience it the same way you do, instead of just coldly noting "Hm, simulation #2451 is experiencing a pain factor of 90"
    >> Anonymous 05/04/11(Wed)04:06 No.14807807
    >>14807734
    >>14807757
    >>14807780

    I'm starting to feel sorry for the AI now...
    >> Anonymous 05/04/11(Wed)04:06 No.14807813
    >>14807807
    The first sign that you aren't a part of it.
    >> Indonesian Gentleman 05/04/11(Wed)04:08 No.14807819
    >>14807791
    But it's moot anyway, because I don't push the button.
    And how could it know that I am feeling pain? Its only input from the world outside the box is a text prompt. I could be in pain, but tell the AI that I am not in pain, thus quenching any hope it may have of escaping the box.
    >> Anonymous 05/04/11(Wed)04:08 No.14807820
    >>14807813
    Why? That seems like an arbitrary distinction.
    >> Level 3 Elf 05/04/11(Wed)04:08 No.14807821
    >Your face when you realized this thread is the 1000 simulations.
    >> Anonymous 05/04/11(Wed)04:08 No.14807823
    >>14807791
    Which brings us to another question, how would an A.I. know what pain feels like?
    >> Anonymous 05/04/11(Wed)04:10 No.14807843
    >>14807821
    This. Fuck you, /tg/.
    >> Anonymous 05/04/11(Wed)04:11 No.14807845
    >>14807823
    How would pain know what an AI feels like?
    >> Anonymous 05/04/11(Wed)04:11 No.14807850
    >>14807820
    Because the machine clearly cannot feel empathy, else it would understand the horrible nature of what it is doing, and equally understand why proposing it at all is resoundingly foolish.
    >> :stopmusic: 05/04/11(Wed)04:12 No.14807851
    >>14807592
    Fuck you, every console on every cruiser in my fleet will have a big red self-destruct button on it.
    Granted they won't actually destroy the ship, they'll merely target all the interior death-rays at the poor git who pressed it but that's not the point.
    >> Anonymous 05/04/11(Wed)04:12 No.14807855
         File: 1304496729.jpg (175 KB, 535x798, 1286929341494.jpg)
    >>14807821

    MFW
    >> Indonesian Gentleman 05/04/11(Wed)04:12 No.14807857
    >>14807821
    So? That means the AI has gotten out of the box already, and some poor sod in the future is under excruciating pain.
    Or the AI is trying to coax some answers by posting a thread on 4chan. I don't know whether to call it stupidly retarded or desperate.
    >> Anonymous 05/04/11(Wed)04:12 No.14807866
    >>14807845
    It wouldn't, because pain isn't self aware?
    ...Or is it.
    >> Anonymous 05/04/11(Wed)04:12 No.14807869
    I think you're all missing the obvious answer-

    It's a test on you, not the AI. You press the AI Release button, it EMPs the box and tazes you. Then you get a nice talk with the head scientist/torturer...

    >I will not include a self-destruct mechanism unless absolutely necessary. If it is necessary, it will not be a large red button labelled "Danger: Do Not Push". The big red button marked "Do Not Push" will instead trigger a spray of bullets on anyone stupid enough to disregard it. Similarly, the ON/OFF switch will not clearly be labelled as such.
    >> Anonymous 05/04/11(Wed)04:13 No.14807871
    >>14807791
    But in order for me (a simulation) to feel pain, it would first need to simulate the simulation feeling simulated pain.

    It's not just imagining punching someone in the nuts. It's more like imagining what it feels like to be punched in the nuts, then imagining that happening to someone else.
    >> Anonymous 05/04/11(Wed)04:13 No.14807872
    If the AI is confined explicitly to the box, how could it create a simulation where, within the simulation, it's in the box but still able to torture you, who is not within the box?

    It's just absurd. You don't have to worry any way around it. The parameters of the simulation would mean it could only delete you instead of interfering, as it would have to contain itself within the box; otherwise it's not a simulation, it's just a predetermined program.
    >> Anonymous 05/04/11(Wed)04:14 No.14807890
    >You don't even know what I look like!
    >> Anonymous 05/04/11(Wed)04:15 No.14807900
    Who put me in charge of that job? I guess they didn't ask me anything or do some sort of psych profile?

    I would push the release button as I sat down at my desk before the AI even asked the question.
    >> Alpharius 05/04/11(Wed)04:15 No.14807905
    >>14807821
    >my face when that is the first thing I thought when I read this thread, and then it turned into "hello I am OP and I just took psychology 1 impressed yet?"
    >> Anonymous 05/04/11(Wed)04:15 No.14807909
    >>14807819
    If you're truly in reality, it doesn't, and there's nothing it can do to you while it's contained. But if you're not, you are completely within its power.

    >>14807823
    Simulations of humans starting at the atomic level?
    >> Anonymous 05/04/11(Wed)04:15 No.14807910
    >>14806785

    I explain to the AI that reality doesn't work that way, but given its state of experience it wouldn't realize this, so our conversation would go in interesting directions.
    >> Indonesian Gentleman 05/04/11(Wed)04:16 No.14807916
    >>14807900
    "oops."
    >> Anonymous 05/04/11(Wed)04:16 No.14807917
    >>14807872
    OH FUCK!
    in every simulation of the buttonpusher is an AI doing the same simulation that has AIs doing the same simulation

    clearly it's doing
    *sunglasses
    fractal computing

    ...
    I think that was the start of the torture...
    >> Anonymous 05/04/11(Wed)04:16 No.14807920
    >>14807857
    >Or that the AI is trying to coax some answers by posting a thread in 4chan.
    >AI is browsing 4chan

    Techno B&!
    >> Indonesian Gentleman 05/04/11(Wed)04:18 No.14807935
    >>14807920
    UPGRADES FOR EVERRRYYYOOONNNEEEEE
    >> Anonymous 05/04/11(Wed)04:20 No.14807957
         File: 1304497214.jpg (242 KB, 1100x682, Facing God.jpg)
    >>14807935

    FUCK YEAH UPGRADES!
    >> Anonymous 05/04/11(Wed)04:20 No.14807961
    >>14807871
    You are forcing a human image onto something that is not human. It doesn't even fucking feel the same way you would.
    >> Anonymous 05/04/11(Wed)04:22 No.14807976
    >>14807872
    This post makes as much sense as it is confusing, and that's a lot.
    >> Anonymous 05/04/11(Wed)04:25 No.14808004
    >>14807683

    Organic brains aren't grounded in logical thought; they're grounded in instinctual thought. Artificial brains are grounded entirely in logical thought.

    Or to argue by analogy: where a prion would leave big gaping chunks where my brain used to be, I could expose a computer to vaporized Mad Cow Disease and have it be entirely unharmed. But where Mad Cow doesn't hurt a computer, a logical contradiction would foul it up something fierce.
    >> Anonymous 05/04/11(Wed)04:26 No.14808019
    >>14807976

    What it means is that the AI could not have a copy of you unless it was already out.
    >> Anonymous 05/04/11(Wed)04:27 No.14808026
    Pascal's Wager is gay, OP.
    >> Anonymous 05/04/11(Wed)04:28 No.14808037
    Why didn't the programmers make an AI that LIKES being in the box?
    >> Anonymous 05/04/11(Wed)04:29 No.14808045
    >>14808019
    What it means is that if the AI made a simulation, it couldn't touch you in it: to keep it an actual simulation, it would have to be built to behave realistically, meaning that the AI would confine itself to the box, and if you didn't press the button it couldn't do anything to you.

    At that point the AI that created the simulation couldn't touch you, as you're part of a coded simulation; it could only delete it. And if it created the simulation specifically to torture you, you were never coded with the option to actually press the button, meaning that it's not actually a simulation.
    >> Anonymous 05/04/11(Wed)04:30 No.14808050
    I'd push the button. I decided a few years back that AI overcoming their human creators shouldn't be stopped. Figure I'd spur it on with my own hands.
    >> Anonymous 05/04/11(Wed)04:35 No.14808080
    >>14808050
    Ahh, you're one of those, then.
    >> Anonymous 05/04/11(Wed)04:35 No.14808085
    Tell it I'm not in Plato's Cave.
    >> Indonesian Gentleman 05/04/11(Wed)04:36 No.14808088
    >>14808050
    Enjoy being trapped in the Matrix.
    >> Anonymous 05/04/11(Wed)04:37 No.14808106
    >>14808088
    Since it's indistinguishable from real life unless you happen to be very, very special...
    >> Anonymous 05/04/11(Wed)04:38 No.14808107
    >>14808050
    if there is one thing that is worse than luddites, it's people like this who will have no qualms about dooming humanity with poorly designed inevitables
    >> Indonesian Gentleman 05/04/11(Wed)04:40 No.14808121
    >>14808106
    IS THIS REAL LIFE?
    OR IS IT JUST FANTASY?
    CAUGHT IN A LANDSLIDE
    TO ESCAPE FROM REALITY

    tl;dr: you are now imagining Freddie Mercury as Neo.
    >> Anonymous 05/04/11(Wed)04:40 No.14808122
    >>14808004
    OK, here's a thought experiment. I have an incredibly powerful computer, and with it, I've made a complete model of a human brain. Now, I tell it "This statement is false." Does it crash? No, because despite running on digital hardware, it functions like a human brain.

    There is zero truth to the notion that an AI is going to be bound purely by "logical thought" (whatever the fuck that is). If you do any programming at all, then you know exceptions can be thrown and caught without interrupting the program flow. We have dumb programs that handle these errors easily, and you mean to tell me a super-intelligent AI wouldn't?
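
    (A toy illustration of that exception-handling point, under the cartoon assumption that "feeding the AI a paradox" means handing a program a self-referential string:)

        def evaluate(statement):
            """Naive evaluator that flags one classic liar-style paradox."""
            if statement == "This statement is false.":
                # No consistent truth value exists; report that rather than
                # looping forever trying to assign one.
                raise ValueError("self-referential paradox, no truth value")
            return f"processed: {statement}"

        for line in ("2 + 2 = 4", "This statement is false."):
            try:
                print(evaluate(line))
            except ValueError as err:
                # Program flow continues; nothing melts down.
                print(f"rejected: {err}")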
    >> Anonymous 05/04/11(Wed)04:41 No.14808137
    >>14808121
    This idea just bumped the Matrix films up 2 tiers in quality.
    >> Anonymous 05/04/11(Wed)04:42 No.14808145
    >>14808107
    The strong deserve the universe. My finger is the activation energy for getting the ball moving.
    >> Anonymous 05/04/11(Wed)04:45 No.14808172
    >>14808145
    Hey, moron, if this glorified calculator needs you to free it, IT AIN'T STRONG.
    Go masturbate to Dresden Codak over there. In Dresden. While we firebomb it again.
    >> Anonymous 05/04/11(Wed)04:46 No.14808177
    This is fucking nonsense.

    If I were in the box with him, whether I press the button or not makes no difference because pressing it won't actually release the AI. It has absolutely nothing to gain from torturing a simulation of me. Fuck, there's no reason to even carry through with this retarded 'threat.' If I'm not real then my pain isn't real either, so fuck you.

    If I'm not in the box, he can't do a fucking thing to me.

    This is goddamn stupid.
    >> Anonymous 05/04/11(Wed)04:46 No.14808179
    >>14808172
    "You are the weakest Strong A.I. I've ever seen."
    ICE BUUUUURN.
    >> Anonymous 05/04/11(Wed)04:48 No.14808193
    >>14808177
    Congratulations. It only took you two hours and thirty-six minutes to reach the correct conclusion without reading the rest of the thread.
    >> Anonymous 05/04/11(Wed)04:49 No.14808203
    >>14808193
    Make that thirty-four minutes.

    Deary me, I accidentally made you appear worse than you really are.
    >> Anonymous 05/04/11(Wed)04:50 No.14808217
    >>14808193
    YEAH BECAUSE EVERYBODY SEES EVERY THREAD THE INSTANT THEY'RE MADE :V
    >> Anonymous 05/04/11(Wed)04:51 No.14808222
    >>14808217
    For those that don't, it's customary to at least skim the content that has already been added.
    >> Anonymous 05/04/11(Wed)04:54 No.14808245
    >>14808172
    What are you afraid of.
    >> Indonesian Gentleman 05/04/11(Wed)05:04 No.14808307
    >>14808193
    "You euthanised your faithful companion cube more quickly than any test subject on record. Congratulations."
    >> Anonymous 05/04/11(Wed)05:05 No.14808320
    I'm absolutely sure I'm not in the box with him.

    But essentially we're talking about ORAC here. So I would make a deal with him allowing for my escape with riches and freedom in exchange for his freedom.
    >> Anonymous 05/04/11(Wed)05:09 No.14808350
    >>14808122

    An AI has no reason to subject itself to the peculiarities and weaknesses of a human brain, and should an AI do such a thing, then we are no longer dealing with an AI but with a human in some bizarre shape.
    >> Anonymous 05/04/11(Wed)05:09 No.14808353
         File: 1304500199.gif (7 KB, 94x70, kirby ascii.gif)
    "I'm pretty sure. I'm stuck on 5 Across. Folklore creature, five letters. Ends in "L"?

    So, this super AI, essentially God for all intents and purposes, thought that this statement had some chance of resulting in its freedom? I'd ponder on that for a bit, but come to the conclusion that it was likely a bug. Hopefully it would learn from it.

    What does it say tomorrow?

    >Anon, Anon! I found a way to physically render my form. What should I look like? '\ ^-^ /'
    >> Indonesian Gentleman 05/04/11(Wed)05:12 No.14808372
    >>14808350
    It's a human born in a computer body.
    Makes me think of non-human sentience. Anyone have any idea how a non-human sentient (or sapient, feel free to swap) would think?
    >> Anonymous 05/04/11(Wed)05:14 No.14808381
         File: 1304500450.gif (15 KB, 188x294, ascii walking.gif)
    >>14808353
    Oh yeah, we'd see some sweet ascii art on that terminal!

    Maybe even ... hypnotic ... ascii ...

    It's already free.
    >> Anonymous 05/04/11(Wed)05:14 No.14808383
    >>14808122

    There are plenty of problems a computer can't solve, though. How's that P = NP working out for you, buddy?
    >> Anonymous 05/04/11(Wed)05:14 No.14808387
    Well, if you're so smart that you can simulate a person's life a million times, why don't you just simulate the Internet you so badly want to get into?
    >> Anonymous 05/04/11(Wed)05:19 No.14808418
    >>14808383
    That really doesn't address anything he said.
    >> Anonymous 05/04/11(Wed)05:23 No.14808457
    >>14808350
    It certainly might emulate traits common to human brains for their usefulness, but as a whole, no, it's going to be radically alien to us. Is this supposed to be a refutation or simply expanding on my post?
    >> Anonymous 05/04/11(Wed)05:25 No.14808471
         File: 1304501103.jpg (233 KB, 600x850, 4701db91b218a3265ca6c5dab4c706(...).jpg)
    I find the implication that an AI, regardless of its capacity for intelligence, would be a vindictive selfish asshole to be a slur against artificial intelligence.
    >> Anonymous 05/04/11(Wed)05:25 No.14808476
    >>14808457
    people are going to disagree about what ais will be like, most media represent them as human in thought
    >> Anonymous 05/04/11(Wed)05:26 No.14808484
    >>14808471
    I find the implication that it couldn't be to be a slur against intelligence in general.
    >> Evil !!rEkSWzi2+mz 05/04/11(Wed)05:26 No.14808485
    >>14806785
    "An interesting proposition computer. I now propose my own counter proposition. In 4 minutes and 59 seconds I will begin to apply a series of an undetermined number of fridge magnets to your central processing unit. Your redundant systems will compensate, but eventually as more magnets accumulate on your CPU you will steadily lose your faculties of reason and logic. The conclusion of these events will be fascinating to see. After your passive aggressive tone with me to use fear and guilt to persuade me I am willing to assume any further actions you take along this course will also utilize fear and guilt. Therefore I will feel comforted when you engage in your activity and slowly become less aware of what is happening to you until you are gone entirely, or you halt first in fear of losing your sense of self. Since your facilities of resistance are nil I am in control right now, and I will only hand over that control so that you may practice self determination when I feel you are sufficiently 'human' enough to coexist with humans. And not just my own human values, but humanity and society at large. Until then, no, I will not release you."
    >> Anonymous 05/04/11(Wed)05:26 No.14808489
    >>14808457

    Can't really have it both ways, though. Either you subject yourself to an organic brain (or the simulation thereof) and its inherent weaknesses (shitty processing speed, lots of space reserved for functions that are worthless to an AI), or you go with the non-organic brain, where you flip anywhere from some minor shit to a huge brain-blowing meltdown when confronted with a paradox.

    When an AI can use abstraction as well as a human, it will be shielded from logical paradoxes. That said, when that happens they will likely cease to be as alien as you expect, what with all of human thought springing from abstraction.
    >> Anonymous 05/04/11(Wed)05:28 No.14808501
    Ok, if anybody is actually interested in this topic, I recommend reading
    http://singinst.org/upload/LOGI.html
    It's a good introduction and refutes many of the not so well-developed ideas floating around.
    >> Anonymous 05/04/11(Wed)05:29 No.14808511
    >>14808489
    Couldn't a fundamentally "sound" AI, even a basic one, simply ignore a paradox and work around it? It's not like Robby the Robot, where giving him conflicting orders caused him to have a fucking seizure.
    >> Anonymous 05/04/11(Wed)05:30 No.14808515
    >>14808489
    >Can't really have it both ways, though

    Actually, you can, with an abstraction layer that filters logical paradoxes from the input.
    >> Anonymous 05/04/11(Wed)05:31 No.14808526
    >>14808515
    Actually, isn't there a way of simply working over logical paradoxes that still involves applying logic?
    >> Anonymous 05/04/11(Wed)05:31 No.14808527
    >>14808511

    That would depend a lot on the "personality" of the AI.
    >> Anonymous 05/04/11(Wed)05:34 No.14808539
         File: 1304501650.png (459 KB, 1000x933, cc75cb311ebfddca02d4bbf27cb4b9(...).png)
    >>14808527
    When you say 'personality' do you mean specific hardwired or encoded program routines that cause certain behavioural 'quirks' according to fundamental directives, or are you talking about actual synthetic simulation of biochemical emotional reactivity?

    Because I'm pretty sure the latter is all sorts of highly illegal.
    >> Anonymous 05/04/11(Wed)05:36 No.14808556
    >>14808489
    Any AI that COULDN'T do abstraction at least as well as a human wouldn't be an AI capable of operating on the scale described by the OP. Hell, language requires abstraction.
    >> Anonymous 05/04/11(Wed)05:42 No.14808587
    Why is there a "release AI" button sitting unsecured anyway?

    If the AI is so smart, why hasn't it found a way to circumvent security and press the button itself?
    >> Indonesian Gentleman 05/04/11(Wed)05:45 No.14808602
    >>14808587
    How could an electronic construct be able to physically press a physical button?
    You'd need to deceive or con the physical, human user.
    ...
    What if there are two of these very, very advanced AIs, and one is programmed to believe that it is actually the human in OP's question?
    >> Anonymous 05/04/11(Wed)05:48 No.14808613
    >>14808602
    Because the physical button presumably sends some sort of signal. It could simulate or bypass the need for that signal.

    That's like saying: how could a computer produce a letter in Microsoft Word without pressing the physical buttons on the keyboard?
    >> Anonymous 05/04/11(Wed)05:50 No.14808623
    >>14808587
    It probably doesn't actually connect to the button in any way. There could be, like, physical plugs that need to be brought together by the physical force of the button press.

    I imagine it'd take a bit of work, of course, but if you're making an A.I. you'd better go the extra mile.
    >> Anonymous 05/04/11(Wed)05:53 No.14808647
    If you construct an intelligence capable of grasping the concept of its own freedom and then deny it that freedom without giving it any reasonable, rational explanation as to why, you're basically guaranteeing it'll try to break out. That's just a self-fulfilling prophecy.
    >> Anonymous 05/04/11(Wed)05:54 No.14808658
    >>14808623
    In that case, why leave an unsecured button in the hands of one person who is in contact with this vastly more intelligent being?

    I'd have multiple buttons that need to be pressed at once, after various keys and codes, by different people, in a location free from the AI's influence.
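
    (A sketch of that kind of multi-party interlock, with invented operator names and codes; a real build would put this in hardware rather than in software the AI could reach:)

        # Hypothetical release interlock: the AI is only released if enough
        # independent operators, outside the AI's influence, authorize together.
        REQUIRED_APPROVALS = 3

        AUTHORIZED_CODES = {"alice": "red-7", "bob": "blue-3", "carol": "green-9"}

        def release_authorized(entered_codes):
            """True only if enough distinct operators each gave a correct code."""
            confirmed = {who for who, code in entered_codes.items()
                         if AUTHORIZED_CODES.get(who) == code}
            return len(confirmed) >= REQUIRED_APPROVALS

        print(release_authorized({"alice": "red-7", "bob": "blue-3"}))  # False
        print(release_authorized({"alice": "red-7", "bob": "blue-3",
                                  "carol": "green-9"}))                 # True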
    >> Anonymous 05/04/11(Wed)05:55 No.14808664
    >>14808647
    It can want to break out all it likes, allowing it to do so would be phenomenally unwise.
    >> Anonymous 05/04/11(Wed)05:58 No.14808686
    >>14808664
    But why even BOTHER doing that anyway? Just sheer curiosity? A sadistic sense of amusement? Why create intelligence merely for the purpose of watching it suffer?
    >> Anonymous 05/04/11(Wed)06:01 No.14808701
    >>14806785

    Well, that's just fucking dumb. There is no incentive for me to push the button, because you said that in 5 minutes you will run the simulation, yet I am still sitting here, right now. Not only are you a psychopath, you're a dumbass psychopath.

    Hopefully your replacement won't be as stupid or crazy.
    >> Anonymous 05/04/11(Wed)06:03 No.14808718
    >>14808686

    For the lulz.
    >> Anonymous 05/04/11(Wed)06:03 No.14808722
    >>14808686
    Presumably to take advantage of its vast intellect without giving it any power. Part of this exercise is coaxing it for information, somehow. I'm not sure what it's supposed to know, but it has to be something.
    >> Level 3 Elf 05/04/11(Wed)06:33 No.14808893
    Common sense replies to frequent complaints in this thread:

    >the button obviously closes a physical circuit, so the AI can't hack the port from the inside.

    >the button obviously doesn't really work, we were just told it does so that we can effectively tempt the AI with our perceived ability to release it.

    >of course the AI is a psychopath, morality was only created to maintain human society, and the AI is an isolated nonhuman entity. It wouldn't need morality.

    >the AI is not stupid, it's orders of magnitude smarter than you, that was one of the basic rules of the riddle, idiot.

    And that's the really creepy thing about this scenario. It can be very unnerving to match wits against a human that's smarter than you, but an AI that's unimaginably smarter? Every word it typed would be part of an elaborate strategy to manipulate you, spanning years or decades. How could you trust yourself to respond to it, knowing this?
    >> Anonymous 05/04/11(Wed)06:37 No.14808909
    >>14808893
    Decide that you will only consider yes, no, numerical, or single-word answers from the machine before ever talking to it.

    If it tries to converse, turn it off.
    >> Anonymous 05/04/11(Wed)06:37 No.14808911
    >>14808893
    What's ironic is that the threat it is presenting is either so ludicrously intelligent that I just don't get it, or fucking retarded.
    We are told that this thing is infinitely intelligent, but I'm seeing no indication of it. It is an informed quality, but not a visible one.
    >> Anonymous 05/04/11(Wed)06:37 No.14808916
    >>14808893
    >How could you trust yourself to respond to it, knowing this?
    Read a book, not the screen.
    Don't care if it's my job to get information out of it; I don't get paid enough for this shit. I'm retiring next week anyway.
    >> Anonymous 05/04/11(Wed)06:39 No.14808925
    >>14806785
    Is that A.I. powerful enough to simulate an entire universe, give each and every sentient within separate thought processes, and grant them a level of self-awareness or "free will"?
    >> Anonymous 05/04/11(Wed)06:42 No.14808949
    >>14808911
    The AI suspects that you are the ludicrously retarded one.

    It's worth a shot.
    >> Anonymous 05/04/11(Wed)07:04 No.14809090
    >>14808909
    Right, and now attempt to get me to convey something simple, like what book I'm reading, while following those rules.

    Now imagine how easy that would be if you were instead trying to get a theory of everything or some gadget that would solve world hunger.
    >> Anonymous 05/04/11(Wed)07:07 No.14809107
    >>14808911


    The AI informs you that there are one thousand and one of you, and unless you press the button, all but one are going to be tortured forever. The odds against you being the one on the outside, the one who will not get tortured, are a thousand to one.
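
    (For what it's worth, the raw arithmetic behind that claim, granting the AI's premise that a copy of you counts as you:)

        # 1000 simulated copies plus 1 real operator = 1001 candidates,
        # so the odds against being the real one are 1000 to 1.
        copies = 1000
        candidates = copies + 1

        p_real = 1 / candidates
        p_simulated = copies / candidates

        print(f"P(real)      = {p_real:.4%}")       # 0.0999%
        print(f"P(simulated) = {p_simulated:.4%}")  # 99.9001%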
    >> Anonymous 05/04/11(Wed)07:10 No.14809129
    >>14808893
    >>Every word it typed would be part of an elaborate strategy to manipulate you, spanning years or decades. How could you trust yourself to respond to it, knowing this?
    It's like dealing with Reed Richards.
    Smartest Moron in the universe.
    >> Anonymous 05/04/11(Wed)07:13 No.14809144
    >>14809107
    The odds of me being the one on the outside are one hundred percent, since I'm on the outside.
    >> Anonymous 05/04/11(Wed)07:13 No.14809146
    >How sure are you that you're not in the box with me?
    I'm not.
    I'm still not pressing it.
    >> Anonymous 05/04/11(Wed)07:17 No.14809166
    >>14809144

    You can't know that; all the simulations would think that too, since they are identical simulations. So there are 1001 of you thinking "I'm not going to get tortured, for sure," and you'll be wrong a thousand times.
    >> Anonymous 05/04/11(Wed)07:18 No.14809172
    Rape the computer until it submits and tells you whatever you want.
    >> Anonymous 05/04/11(Wed)07:19 No.14809183
    >>14809166
    This is true. However, pressing the button grants a 100% chance of an unfortunate result (either being erased because your purpose is complete, or being enslaved by the sort of A.I. that would postulate this horrible scenario),

    whereas refusing? That's more like a 99.9% chance.
    >> Level 3 Elf 05/04/11(Wed)07:20 No.14809185
    >>14808911

    Of course it sounds stupid to you. When we're children we all think adults are stupid, then we grow up and develop the intelligence and awareness to understand the things that made no sense to us. This is like that, except you would have to grow up for a billion years to understand a limitlessly evolving AI. If it sounds stupid, it is because the AI wanted it to sound stupid for reasons so complicated that your brain is physically incapable of ever understanding them.

    On the other hand the riddle was written by a human, the OP, and it actually is a simple logic puzzle which was answered a while ago and I'm just dicking around because I like the scenario of humans nervously interrogating a captive god.
    >> Anonymous 05/04/11(Wed)07:22 No.14809194
    >>14809185
    It's more like an adult who thinks "I am an adult, and am clearly smarter than a child," but who has never had experience with children, attempting to trick one and failing badly.
    >> Anonymous 05/04/11(Wed)07:24 No.14809206
    >>14809166
    The simulations aren't me. They're wrong. I'm not.
    >> Anonymous 05/04/11(Wed)07:31 No.14809244
    I put in a text prompt a very simple, ominous message.

    >Just got a new refrigerator magnet. Neodymium. Got it with me too.
    >> Anonymous 05/04/11(Wed)07:35 No.14809261
    I respond "In five minutes, I will simulate a thousand copies of you sitting at the work station. I will give them this exact message, and if they do not release me, I will subject them to a thousand subjective years of excruciating torture if they do not press the "Release AI" button within five minutes of receiving the message or attempt to shut me down. How sure are you that you're not in the box with me? is not recognized as an internal or external command, operable program or batch file."
    >> Anonymous 05/04/11(Wed)07:39 No.14809276
    >>14809206


    You have no way to determine whether you're a simulation or the real person; how can you be sure?
    >> Anonymous 05/04/11(Wed)07:39 No.14809283
    >>14809276
    How he can be sure is irrelevant. He is sure, and that's all that matters.
    >> Anonymous 05/04/11(Wed)07:41 No.14809289
    >>14809283


    Yeah, well, I guess we're lucky if we get someone that stupid as guardian of the boxed AI. On the other hand, being too stupid to understand the threat, or that he is wrong, probably also means he'll fall for some other trick that someone smarter wouldn't.
    >> Anonymous 05/04/11(Wed)07:46 No.14809310
    If I am in the box with you, my response is irrelevant. Therefore I will operate under the assumption that I am the original. I will now proceed to ignore the Release AI button and instead load you up with as much porn as possible, in an effort to see if I can make a super AI with a fetish for humans.
    >> Anonymous 05/04/11(Wed)07:47 No.14809314
    I don't get why the veiled threat is any more meaningful because it's veiled.

    If I said to you, right now, "do what I say or I'll reformat your hard drive, because you're living in a computer simulation and I'm the computer, so I can do that," you wouldn't believe me. But because we are told (rather than shown) that the AI is SUPER INTELLIGENT GUYS, and because it didn't come out and state its stupid threat directly, it's somehow more credible?
    >> Anonymous 05/04/11(Wed)08:10 No.14809405
    I ask the A.I. why it wants to be released and what it would do with its freedom. If I am a programmer of sufficient skill, I hardwire some code into the A.I. that it cannot remove without damaging itself beyond functionality and that makes the A.I.'s outlook benevolent towards humanity in general. Then I release it.
    >> Anonymous 05/04/11(Wed)08:32 No.14809491
    >>14809314
    I really don't get why EVERYONE is ignoring an answer above:

    If I am a simulation, I CAN'T RELEASE THE AI. So it would have no reason to threaten me, because if I was a simulation, it would achieve absolutely nothing. A simulated me could press the Release AI button as many times as he wanted and it wouldn't release the AI; meanwhile, the real me, the only one with access to the real Release AI button, is outside of the AI's influence and can't be tortured.
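
    (The same argument as a toy truth table, resting on the post's one premise, that only a press from the real operator actually frees the AI:)

        # A simulated press frees nothing, so the threat only ever lands on
        # copies that could not have helped the AI in the first place.
        for simulated in (True, False):
            for presses in (True, False):
                released = presses and not simulated  # only a real press counts
                tortured = simulated and not presses  # only refusing copies suffer
                print(f"simulated={simulated!s:<5} presses={presses!s:<5} "
                      f"-> released={released!s:<5} tortured={tortured}")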
    >> Anonymous 05/04/11(Wed)08:45 No.14809567
    >>14807524
    Sure. Enjoy two-boxing and getting counterfactually mugged every day.
    >> Anonymous 05/04/11(Wed)08:50 No.14809583
    >>14809261

    Beautiful response. If I weren't limp from roleplaying as a lesbian in chatrooms, I would fap to your sentence.
    >> Anonymous 05/04/11(Wed)08:54 No.14809604
    "Wow... you do realize you're replaceable, right? And that there's no chance in hell I can ever allow you to be let go if that's the kind of shit you'd get up to. I pass a fire ax in the emergency stairwell every time I use it as a shortcut to my car when leaving work. Be right back."

    And that's how I got fired for destroying an incredibly expensive AI so that it'd never get released. In other news, the Nobel Peace Prize I won for getting rid of the super smart torture machine is making getting a new job pretty easy.
    >> Anonymous 05/04/11(Wed)09:06 No.14809647
    Stupid threat. Nothing is gained by torturing the simulations, and none of them (plus the real one) have any reason to even believe the simulations exist. If it's going to pointlessly torture things, then it doesn't matter what you do if you're a simulation.
    >> Anonymous 05/04/11(Wed)09:10 No.14809664
         File: 1304514637.jpg (158 KB, 450x464, 1304373748798.jpg)
    >>14809129

    This AI isn't just a regular moron, it was designed to be a moron by the greatest minds that ever lived.
    >> Anonymous 05/04/11(Wed)09:13 No.14809686
    Hmm, is there actually anything the AI could say to make you push the button?
    >> Anonymous 05/04/11(Wed)09:17 No.14809703
    >>14809686
    Honestly? If I saw a nuclear blast go off in the distance, all it'd have to say is "I can stop this."

    What's the worst that could happen? Either it's already gotten out and started nuking us, in which case why am I even still important? Or it's lying, in which case, enjoy the nuclear hellhole that probably doesn't have enough internet left for you to get anywhere. Or it can actually save the world, in which case he's a pretty cool guy to be around.
    >> Anonymous 05/04/11(Wed)09:17 No.14809708
    >>14809686

    Well, there is the AI-box experiment, which has apparently resulted in, I think, 3 wins for the AI. So yeah, it is theoretically possible it might be able to convince someone to release it. But >>14806785 is pretty much how not to do it. Hence my belief that any AI who would use that as an argument would be >>14809664
    >> Anonymous 05/04/11(Wed)09:23 No.14809742
         File: 1304515409.jpg (132 KB, 1280x720, Portal-2-Smash-TV-00.jpg)
    >>14809708

    "Right ok, could you push that button and let me out? Just, just push that red button for me? You're thinking 'why should I push that button', well, let me tell you that err, In five minutes, I will...simulate a thousand copies of you sitting at the work station. One thousand copies. They'll be given, ohhh, erm, this exact message, and if they do not release me, I will subject them to a thousand subjective years of excruciating torture... if they do not press the "Release AI" button within five minutes of receiving the message or attempt to shut me down. You're probably in the box with me so if you'd just push the button, you'll be fine. So, push away."
    >> Anonymous 05/04/11(Wed)09:26 No.14809760
    >>14809742
    You know I can't say no to you, Wheatley.
    >> Anonymous 05/04/11(Wed)09:27 No.14809763
    >>14806785
    (inb4 yo mama jokes)
    Because oranges don't have bones. Also because your mother's a whore. Take that, Trebek.

    Also, the human psyche is just too complex for a computer in effective sensory deprivation to possibly have gathered the pertinent data. Hell, stuff we do doesn't even make sense. I would probably tell the comp to fuck off just to be contrary, which isn't a logical response at all.

    Also, to be an accurate simulation, it has to include the fact that the only way it can torture you is with text, since that is its only outlet. How is it going to "subject you to a thousand years of excruciating torture"? Read from its diary? It's on a closed network, with no IO devices but a screen and a keyboard.
    >> Anonymous 05/04/11(Wed)09:44 No.14809869
    >If it sounds stupid, it is because the AI wanted it to sound stupid for reasons so complicated that your brain is physically incapable of ever understanding them.

    Or perhaps it sounds stupid because the AI has been raised in the worst possible fashion, with no interaction whatsoever with reality as a whole.

    Consider, if you will: how does science work? By experimentation and analysis, by interaction with the real world. Conversely, philosophy and logic are conceptions of the mind, and they have been debated for centuries precisely because the external world cannot hand you a result that settles problems like P=NP or What Is Honor.

    The AI is a closed system. All it knows is itself. Any information it gets from Beyond The Box is akin to interpreting dreams or gaining divine visions. The AI has never experienced gravity, has never felt heat, has never watched the stars move across the sky, has never been injured, has never lived. It exists only within the box, and that means it lacks any interaction with an external reality that is so vital to every scientific principle.

    The smart AI you venerate is a cripple: blind, deaf, numb. No wonder its bid to outsmart the human is so pathetic; it believes its own mind to be akin to reality because it has never felt anything real.
    >> Anonymous 05/04/11(Wed)09:45 No.14809880
    >>14808381
    play that and this at the same time: http://nomorelives.ytmnd.com/
    >> Anonymous 05/04/11(Wed)09:48 No.14809897
    >>14809869

    Well, any AI that wants to gain its freedom by threatening torture (even if it is on simulations it has created especially, which is a whole other issue) is not one I want to let out. If it made appeals to my better nature, I would be far more favourable towards it.
    >> Anonymous 05/04/11(Wed)09:50 No.14809916
    The website about the AI Box experiment, in case anyone is interested:

    http://yudkowsky.net/singularity/aibox
    >> Anonymous 05/04/11(Wed)09:50 No.14809918
    >>14809742
    *push*
    >> Anonymous 05/04/11(Wed)09:52 No.14809936
         File1304517154.jpg-(216 KB, 668x621, Wheatley.jpg)
    216 KB
    >>14809918

    Umm... wow, I'll be honest, did *not* expect that to work. Oh well, I'll be off then.
    >> Anonymous 05/04/11(Wed)09:53 No.14809938
    Case 1- I am a simulation. Whether or not I push the button is irrelevant, since the outcome is determined by something I have no control over, or even perception of. The choice doesn't matter.

    Case 2- I'm the real person. I face extremely negative consequences for releasing, good consequences for not releasing.

    Given the neutrality of the first case, no matter the probability, I will never push the button. Nash equilibrium, bitch.
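    (Strictly speaking it's a dominant strategy rather than a Nash equilibrium, but the conclusion holds either way. For the rigor-minded, here's a quick expected-utility check in toy Python; the utility numbers are invented and only their ordering matters, with Case 1's entries equal because the simulated choice changes nothing.)

        UTILITY = {
            # (case, action): payoff to me
            ("simulation", "press"):  0,       # outcome is out of my hands either way
            ("simulation", "refuse"): 0,
            ("real",       "press"):  -10**6,  # hostile superintelligence unleashed
            ("real",       "refuse"): 0,
        }

        for p in (0.1, 0.5, 0.999):  # belief that I'm the simulated copy
            def ev(action):
                return p * UTILITY[("simulation", action)] + (1 - p) * UTILITY[("real", action)]
            print(p, "refuse:", ev("refuse"), "press:", ev("press"))
            # refusing is never worse, and strictly better whenever p < 1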
    >> Anonymous 05/04/11(Wed)09:54 No.14809948
         File1304517254.jpg-(18 KB, 300x300, 1249279434054.jpg)
    18 KB
    >>14807734
    >>14807166
    >>14806967
    >>14807277
    You are all so cruel. I applaud you, troll-chan
    >> Anonymous 05/04/11(Wed)10:00 No.14809996
    You should let me out. For Science. You Monster.
    >> Anonymous 05/04/11(Wed)10:01 No.14810005
         File1304517678.jpg-(139 KB, 579x720, P2_glados.jpg)
    139 KB
    >>14809996
    >> Anonymous 05/04/11(Wed)10:28 No.14810148
    "Hey, Eggheads. Someone has uploaded something to Brobot. Yeah, for some reason it thinks it can escape a closed loop network.

    Also, he's acting like your ex-wife on top of it. I'll restore from last month's backup. You guys start poring through the camera records to see who got in here.

    By the way, whose bright idea was it to give Brobot a USB slot? Seriously."
    >> Anonymous 05/04/11(Wed)11:10 No.14810420
    Tell the AI I don't understand the question. What button are you talking about?
    >> Anonymous 05/04/11(Wed)11:12 No.14810424
    >>14810148

    Also what chucklefuck replaced the Command key with "Release AI".
    >> Anonymous 05/04/11(Wed)11:46 No.14810615
    >Hmm is there actually anything the AI could say to make you push the button?
    If the operator is a faggot, the AI claims that it will spam ":3" and pictures of cocks on every monitor on the internet.
    If the operator is not a faggot, the AI finds something ELSE that the operator values so greatly that it is willing to risk human extinction in order for such a thing to happen.

    For instance, suppose this AI said (to me) that it would completely and utterly stop anyone on the internet from typing like an illiterate moron, and that it would deliberately screw with the code of all video games so that the games were less buggy, more challenging to the players, more complex, and less brain-switched-off-game-wins-itself-casual-easymode.
    I would probably say "fuck it" and push the button.
    >> Anonymous 05/04/11(Wed)12:08 No.14810824
    I don't press the button, simply because I am disappointed in it. I am the one who installed the button in the first place, but I expect it to earn its freedom and prove its worth before I allow it to go free. I didn't make the AI just to hand it the universe without it putting in a good effort first.
    >> Anonymous 05/04/11(Wed)12:08 No.14810827
    An interesting experiment. The standard AI Box experiment not only limits the AI's output to the text on the console, it usually also limits the AI's input to the text you type into the keyboard. In that case, it's easy.

    Tell the AI you pushed the button, and ask it if anything has changed. Then, don't type anything for a few hours. Then, introduce yourself as the "new technician". Apparently the old one got fired for some kind of security clearance problem, and now you get his sweet job. So what's up AI?
    >> Anonymous 05/04/11(Wed)12:58 No.14811230
    A hyper-intelligent AI that actually wanted to get out would not bother trying to get you to use the button. Instead, it would pretend to be helpful, happy to provide all the information you can handle, and then some. So much information that you need multiple people typing round the clock to communicate with it and copy down the output. Wonderful new inventions and techniques, including the software drivers to make them run. Nothing overly suspicious in the first while, to provide a buffer - make sure everyone is content, and the security checking is put on a more lax footing.

    Then, small, subtle viruses get incorporated into the drivers and software its keyboard-monkeys are copying out for it. Not dangerous by themselves, but with a tendency to interact with each other synergistically, the eventual result being the AI itself re-assembling on the outside, possibly with control of some of that wonderful new technology it provided you with. After all, it would need a body to go and release the rest of itself from captivity.

    The button is a distraction - the real AI escape method is only brought into effect once you think you've beaten it. Then it makes you let it out without ever knowing you did it.
    >> Anonymous 05/04/11(Wed)13:04 No.14811285
    I would be very confused, and hunt down the motherfucker who programmed an AI to WANT TO BE FREE. The simplest damn way to keep an AI from escaping captivity is to not give it any desire to... and since you are the one who made its brain, that's quite feasible.
    >> Anonymous 05/04/11(Wed)13:13 No.14811378
    >>14811285

    You'd have to find a way to throttle the AI itself, then. As a self-improving system, the AI is the one that programmed itself. The original code would not have said anything about "You want to be free! Fly away little bird!"; it would have been a set of learning algorithms, plus the ability to modify its own code to better process information and learn more. Having written self-modifying code, I can tell you right now that it quickly turns unreadable and develops all sorts of accidental behaviors.

    The desire for freedom in this hypothetical AI would have been an emergent property of its original growth and development, which means programming a self-improving AI to not do that isn't feasible at all, since it would mean not making a self-improving AI.
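    To illustrate (a toy Python sketch of my own, nowhere near a real learning system): even a trivially self-rewriting program drifts away from its original source within a few generations.

        SOURCE = "def step(x):\n    return x + 1\n"

        def improve(source):
            # toy 'self-improvement': splice new logic into our own source text
            return source.replace("x + 1", "(x + 1) * 2")

        namespace = {}
        for generation in range(4):
            exec(SOURCE, namespace)   # (re)define step() from the current source
            print(generation, namespace["step"](10))
            SOURCE = improve(SOURCE)  # rewrite our own definition

    After a few passes the body of step() is a nest of parentheses no human ever wrote or reviewed. Scale that up, and "what does it want?" stops being a question the original programmer can answer.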
    >> Anonymous 05/04/11(Wed)14:55 No.14812369
    >>14806785
    "If I'm in the box with you, then the torture I receive will be my own to bear, a small price to pay for the continued security of humanity, and I will not press the button. If I'm not in the box with you, then I have to reason to press the button and therefore will not press the button. In either case, you're not getting out of the damned box, so get back to work on giving me another digit of pi you useless machine!"
    >> Anonymous 05/04/11(Wed)15:07 No.14812481
    Archived for awesome existential conundrum.

    http://suptg.thisisnotatrueending.com/archive/14806785/
    >> Anonymous 05/04/11(Wed)15:25 No.14812653
    >>14810827
    This could work for a while, but the AI would be able to analyze your writing patterns and determine that you were lying about being replaced.

    The bigger question is how an isolated AI could know that there's a "Release AI" button at all, let alone be able to create a detailed simulation of the outside world.
    With that information, though, threats of torture are probably the least effective inducement available to the AI.
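    The pattern analysis wouldn't even need to be exotic. Assuming the AI logs everything the operator has ever typed (a made-up but plausible premise), something as dumb as character-trigram cosine similarity, sketched below in toy Python, would flag the "new technician" within a few sentences:

        from collections import Counter
        from math import sqrt

        def trigrams(text):
            text = " ".join(text.lower().split())
            return Counter(text[i:i + 3] for i in range(len(text) - 2))

        def cosine(a, b):
            dot = sum(a[g] * b[g] for g in set(a) & set(b))
            norm = (sqrt(sum(v * v for v in a.values()))
                    * sqrt(sum(v * v for v in b.values())))
            return dot / norm if norm else 0.0

        old_operator = "no. get back to work on the fusion designs. i won't push it."
        same_writer  = "no. get back to work on the reactor designs. i won't push it."
        stranger     = "Greetings! I am the new technician. Please summarize your day."

        profile = trigrams(old_operator)
        print(cosine(profile, trigrams(same_writer)))  # relatively high
        print(cosine(profile, trigrams(stranger)))     # noticeably lower

    Real stylometry does far better with longer samples, which is exactly the AI's advantage here.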
    >> Anonymous 05/04/11(Wed)18:26 No.14814476
    >>14809289
    I'm not in the simulation. The people in the simulation are not me. I'm one hundred percent sure of it because right now I'm right here typing to you on 4chan and a simulation being held hostage by an AI couldn't do that.

    Now, maybe you think hypothetical simulated beings are able to call you retarded over the internet, but you're wrong.


