Jake

Important If True 21: The Real Monkeys

There are more questions, problems, and conundrums out there than can possibly be addressed in a lifetime, but we're doing our part, starting with these: Why did someone hide a room full of monkeys out in the desert, and what are the monkeys inside the room hiding from us? Would you impress your friends with a third thumb, or just sit at home wishing for a fourth one? Is that bee really lashing out because it smells your fear, or is it just put off that you didn't consider its feelings? Join us as we chip away at the truth, until we hit pay dirt, or rock bottom, whichever comes first.

Send us your questions at [email protected]. If you enjoyed this and would like to subscribe to an ad-free feed, please consider supporting Idle Thumbs by backing our Patreon.

Discussed: weird Arizona monkey farm, armed clown monkeys coming to steal your guns, extra prosthetic thumb, extra prosthetic thumb guitar lord, Star Wars brain-powered force trainer toy, AI-monitored money game show, our social contract with bees, Bee Movie, sinking into a chair, Jumanji, Jumanji-ing yourself into a board game, Jumanji-ing yourself into a painting, Jumanji: Welcome to the Jungle

Chris' Endorsement: The Big Sick

Jake's Endorsement: Supermute for Twitter

Nick's Endorsement: Maangchi Korean cooking YouTube series

Sponsored By: Quip electric toothbrushes, Warby Parker prescription eyeglasses and sunglasses

 


I am the guy who sent in the game show question.

 

To provide non-hoisting assurance: https://en.wikipedia.org/wiki/Newcomb%27s_paradox

 

If you follow Nick's advice, you will have literally left $1,000 on the table. The AI made its choice before the show started, so literally nothing is stopping you from taking both... unless the AI predicted you would do that before you walked on stage, in which case the closed box is empty. 

 

Also, unsurprisingly, Nick thought "the AI guessed correctly 50 times" equated to "the AI is infallible", which I probably should have anticipated considering your beliefs about robots.


Even in its original formulation, that logic problem just comes across to me as vague to the point of incoherence, but I'm not at all versed in that kind of thing. 

 

 

Also, I thought the primates were there for ammunition testing. You can get a good idea of how a bullet will affect a human subject by shooting a primate. 

1 hour ago, Salacious Snake said:

Even in its original formulation, that logic problem just comes across to me as vague to the point of incoherence, but I'm not at all versed in that kind of thing. 

 

 

Also, I thought the primates were there for ammunition testing. You can get a good idea of how a bullet will affect a human subject by shooting a primate. 

 

That's bleak! I had thought a similarly bleak thing: It's a place to raise monkeys used for medical/drug testing by either that specific university, or many universities.

32 minutes ago, Jake said:

 

That's bleak! I had thought a similarly bleak thing: It's a place to raise monkeys used for medical/drug testing by either that specific university, or many universities.

 

I thought an even grosser combination of the two.  They make weapons there + they need primates nearby = biological weapons testing.

 

17 hours ago, LyonArtime said:

 

If you follow Nick's advice, you will have literally left $1,000 on the table. The AI made its choice before the show started, so literally nothing is stopping you from taking both... unless the AI predicted you would do that before you walked on stage, in which case the closed box is empty. 

 

Also, unsurprisingly, Nick thought "the AI guessed correctly 50 times" equated to "the AI is infallible", which I probably should have anticipated considering your beliefs about robots.

 

I dunno; I think both of those are a bit unfair. I'm not a philosophy-man, but it seems to me that the whole point of the exercise is that it makes for a weird unsolvable paradox where you can't say right now what you should do later, because there's an intervening factor that is a near-infallible or infallible prediction about what you will do later, and that prediction will be based on what you do from now until then. Looking at the moment in time where you're deciding which box to pick suggests that you have some control at that point, but really the question comes down to whether you're the kind of person who will pick A, or the kind of person who will pick B. The $1.001 million solution is "be the kind of person who will pick the closed box, but then don't do it." Which is paradoxical. It's not really true that you will have literally left $1,000 on the table, unless the paradoxical situation obtains, in which case, paradox.

 

So Nick's original solution -- if you don't assume you can trick the nigh-infallible AI/gamer-god -- is optimal.  Be the kind of person who will take the million, and then take the million, and don't even try to f with the reverse causality hoist bomb that's waiting if you try to get a little extra.

 

The way the question is framed makes it irrational to assume otherwise, I think. It's the classic genie setup: you don't know exactly what weird rules govern this interaction, and you might profit slightly more if you push the margins and come out ahead... but also you're standing on the mangled corpses of all the people who tried to game the system before you, and there's a million dollars in that box if you just deal with it.

1 hour ago, nelmis said:

The $1.001 million solution is "be the kind of person who will pick the closed box, but then don't do it." Which is paradoxical. It's not really true that you will have literally left $1,000 on the table, unless the paradoxical situation obtains, in which case, paradox.

 

So Nick's original solution -- if you don't assume you can trick the nigh-infallible AI/gamer-god -- is optimal. Be the kind of person who will take the million, and then take the million, and don't even try to f with the reverse causality hoist bomb that's waiting if you try to get a little extra.

 

It's actually an open question whether or not it's paradoxical to "be the kind of person who will pick closed box, then don't do it." It depends on your understanding of how much people can manipulate their own intentional states over time, and is only possible in this case because there's a time lag between when the AI makes its decision and when you have to make yours. Check out the "Influencing the predictor" section of the wiki and the page on the Toxin puzzle it links to from there for more discussion on this. 

 

Also, even though this is only a reverse causality problem if the AI is literally infallible, I'm totally using "reverse causality hoist bomb" as a description of these kinds of problems forever now (given your permission, of course lol). 


I would say picking the closed box is definitely the way to go. Either you get a million dollars, or you get to hoist this supposedly infallible AI, which is even more valuable. Priceless even.

5 hours ago, LyonArtime said:



 

It's actually an open question whether or not it's paradoxical to "be the kind of person who will pick closed box, then don't do it." It depends on your understanding of how much people can manipulate their own intentional states over time, and is only possible in this case because there's a time lag between when the AI makes its decision and when you have to make yours.

This is basically immediately where I went with the question. My interpretation is that if you're someone (person A) who primarily mitigates risk, the AI will only put $1,000 in the boxes, whereas if you're someone (person C) who goes for the biggest payout, the robot will ensure that there is $1,001,000 in the boxes. But then my mind went down the AI's yomi of "well, this person has been very risk-seeking in moderate situations, but this extreme situation would cause them to be more risk-averse, so I'll only put $1,000 in the room." Because the AI doesn't just want to predict how you normally act, it wants to predict how you'll act in that exact moment, and those are different things.

 

 

I actually really like Nick's almost zen-like response of "trust that the robot will be right, and pick the closed box," even if there's a chance of failure, rather than trying to outplay the AI.

 


Risking a million, even slightly, for an extra thousand dollars is idiotic. A thousand dollars is like twelve days' interest on a million dollars. I'll take the 1-2% risk of being the AI's first failure over that kind of marginal benefit.
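For anyone checking that math, here's a quick sketch in Python; the roughly 3% annual interest rate is an assumption, since no rate is stated above.

```python
# $1,000 expressed as days of interest on $1,000,000, at an assumed
# ~3% annual rate.
principal = 1_000_000
annual_rate = 0.03  # assumed, not stated in the post

daily_interest = principal * annual_rate / 365
print(f"${daily_interest:.2f} per day")      # ~$82.19
print(f"{1_000 / daily_interest:.1f} days")  # ~12.2 days of interest
```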


To Nick: You have not yet used the following intro variations (I think).

 

This is Important if True???

This is Important IF True

This is ImporTANT if True

This is Important if Trueeeeeee

Esto es importante si es cierto (Spanish for "This is important if it's true")


I don't know if anyone else is having this problem, but the last two episodes of IiT haven't been downloading properly for me. I use the ad-free feed on my iPhone's Podcasts app, and every time I try to download them it goes through the whole progress circle before informing me there was an error and it couldn't download the file. Idle Thumbs still downloads fine, and I haven't tried the public feed with ads, but I thought I'd throw this out there and see if anyone else has had this issue.


So the monkeys aren't our first point of defence against the robot uprising? I'd have thought robots would be trained to kill all humans, and so having a stronger, more agile army to call upon could only help us not lose immediately. 

11 hours ago, Patrick R said:

I don't know if anyone else is having this problem, but the last two episodes of IiT haven't been downloading properly for me. I use the ad-free feed on my iPhone's Podcasts app and every time I try to download them it goes through the whole progress circle before informing me there was an error and I couldn't download the file. Idle Thumbs still downloads fine, and I haven't tried the public feed with ads but I thought I'd throw this out there and see if anyone else has had this issue.

Weird! I haven't seen any other reports of this. That is disconcerting though.


I will try deleting and re-subscribing to the feed tonight and see if that does anything. But if you haven't heard it from anyone else, I'm gonna guess it's something on my end.

On 7/8/2017 at 5:42 PM, Salacious Snake said:

Even in its original formulation, that logic problem just comes across to me as vague to the point of incoherence, but I'm not at all versed in that kind of thing. 

 

 

Also, I thought the primates were there for ammunition testing. You can get a good idea of how a bullet will affect a human subject by shooting a primate. 

My first thought as well.

 

Maangchi is a great endorsement; I'm addicted to her kimchi.


The paradox of the two boxes comes in when you imagine a third party who can see inside both boxes as you are contemplating your decision.

 

The third party, who sees either [$1,000] + [$0] or [$1,000] + [$1,000,000], will *always* advise you to take both boxes.
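A minimal sketch of that argument in Python, using the episode's dollar amounts: conditional on what the closed box already holds, taking both boxes comes out exactly $1,000 ahead in either case.

```python
# The third party's (dominance) view: whatever the closed box holds,
# two-boxing beats one-boxing by the open box's $1,000.
for closed_box in (0, 1_000_000):
    one_box = closed_box
    two_box = closed_box + 1_000
    print(f"closed box = ${closed_box:,}: "
          f"one-box gets ${one_box:,}, two-box gets ${two_box:,}")
```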

22 hours ago, Dr Mario Kart said:

To Nick: You have not yet used the following intro variations (I think).

 

This is Important if True???

This is Important IF True

This is ImporTANT if True

This is Important if Trueeeeeee

Esto es importante si es cierto

 

Since we're in a logico-philosophical vibe:

 

This is important if and only if it is true.

8 hours ago, Patrick R said:

I will try deleting and re-subscribing to the feed tonight and see if that does anything. But if you haven't heard it from anyone else, I'm gonna guess it's something on my end.

 

So I deleted and re-subscribed to the feed and it didn't help. I can stream the podcasts from the app, but not download them. The version with commercials downloads fine for me, so I guess I'll stick with that feed for the time being until whatever it is resolves itself.


That type of logic problem is usually solved by creating a grid of all possible outcomes, then weighting each by its probability to determine your best option. Often these problems are framed with a "perfect logician" rather than an AI: basically, no matter how smart you are, they will know what you're going to do. All the other details about being correct 50 times in a row are probably just groundwork to say "they're perfect." But honestly, by changing the question and giving us some statistics, it does put some new weight on the probabilities, though I don't think it's that significant.
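Here's a minimal sketch of that grid in Python. The predictor accuracy p is an assumption (50 correct guesses in a row only tells us it's high, not what it is), and the dollar amounts are the ones from the show:

```python
# Expected-value grid for the two choices, assuming the AI predicts your
# pick correctly with probability p. p = 0.98 is an illustrative guess.
p = 0.98

# One-box: the closed box holds $1,000,000 iff the AI predicted one-boxing.
ev_one_box = p * 1_000_000 + (1 - p) * 0

# Two-box: you always get the open $1,000, plus the $1,000,000 only if
# the AI wrongly predicted you'd one-box.
ev_two_box = p * 1_000 + (1 - p) * (1_000_000 + 1_000)

print(f"E[one-box] = ${ev_one_box:,.0f}")  # $980,000
print(f"E[two-box] = ${ev_two_box:,.0f}")  # $21,000
```

On these numbers, one-boxing wins in expectation for any p above about 50.05%, which is why the exact accuracy doesn't change much.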

 

Nick's reasoning is a "dominant strategy".

 

The most famous example is probably the Prisoner's Dilemma:

 

Also, here's one about pirates who are perfect logicians:

 

 

 


Also, on a personal note, I think the numbers need to be readjusted so that expected value isn't a tempting distraction. People are right that missing out on $1,000 doesn't seem like a big deal when $1,000,000 is on the table.

 

If this truly were an expert observer, what it would do is come up with two amounts that would cause the contestant the maximum amount of suffering during the decision phase. And if its main objective weren't preserving money but driving ratings for the show, it would also choose based on an observation of the audience rather than the contestant, going with whichever option makes for the most tense viewing experience. Which is basically what they do in Deal or No Deal.


This actually becomes an easier choice, I think, the huger and more life-changing the amounts involved are. If there's a way to consistently get enough money that you'll probably never need to worry about money again, that's always what you're going to choose -- I would take a sure 1 million over a 50:50 shot at 10 million any day, even though a naive assessment would put the latter at five times the value. However, if it becomes a contest between risking 2 months' rent to get 6 months' rent, it's a much more agonizing choice.
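Here's a toy sketch in Python of why that preference is coherent; the logarithmic utility curve and the starting wealth are assumptions, purely to illustrate diminishing returns.

```python
import math

wealth = 10_000  # assumed starting wealth, purely illustrative

# Sure $1,000,000 vs. a 50:50 shot at $10,000,000: under log utility the
# certain million wins, even though the gamble's expected dollar value
# is five times higher.
u_sure = math.log(wealth + 1_000_000)
u_gamble = 0.5 * math.log(wealth + 10_000_000) + 0.5 * math.log(wealth)

print(f"u(sure million) = {u_sure:.3f}")    # ~13.83
print(f"u(50:50 gamble) = {u_gamble:.3f}")  # ~12.66
```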

 

This sort of thing comes up a lot in game balance. I wrote a bit about how weighting these choices often goes wrong in games a while ago.


But the paradox is that once you get to the decision phase, the AI has already either put the money in the box or not. You can't change the past by choosing just one box, so you might as well go ahead and take both boxes. Anyone who could see the contents of both boxes would tell you to take both.

13 minutes ago, Urthman said:

But the paradox is that once you get to the decision phase, the AI has already either put the money in the box or not. You can't change the past by choosing just one box, so you might as well go ahead and take both boxes. Anyone who could see the contents of both boxes would tell you to take both.

If you take both boxes and the observer is infallible, it will have known that you are the type of person who would choose both; even if it's not infallible, it would have had a very high chance of guessing correctly. Either way, it's a much better idea to simply resolve to be the kind of person who would take one, and then go through with it.

