Everything posted by TychoCelchuuu

  1. Haha, Chris's recommendation is awesome. I've only listened to Jessie's Girl and Mr. Tambourine Man so far and they're both great.
  2. Movie/TV recommendations

    The Decameron has lots of rape, if that helps at all.
  3. I don't even know what the Idle Thumbs hive mind thinks. The hive mind thinks Dan is evil? When did we decide? Are there meetings where the hive mind's decisions are made? I never get invited!
  4. I searched for the book on Amazon and it looks like he's also written "Air Force Gator" and "Air Force Gator 2?"
  5. I've only ever heard about Dan (or any other Giant Bomb stuff) second hand from stuff like this thread and he seems like kind of a strange guy. He just doesn't like any vegetables?
  6. Tanks of Caveman

    Clearly the right choice here is tanks vs. cavemen.
  7. The Big FPS Playthrough MISSION COMPLETE

    Are you sure nothing's fucked up in your audio setup, like in Windows or whatever? If Windows thinks you have 5.1 surround sound but you only have 2 speakers and it's piping all the dialogue through the center speaker that doesn't exist or whatever, that might explain that sort of thing. Although if this reaches back even to the old Lucasarts adventures then chances are it's not technical.
  8. I'm a grad student and I do social and political philosophy.
  9. As long as the decision, whatever it is, occurs after the money is in the box, we don't have to get more specific. Something that vague is fine, because once it's too late to change what's in the boxes, we get the division between causal and evidential decision theory.
  10. Only if anything changes in between the prediction and the future, which as far as we know is not going to happen (the laws of physics don't seem to be going anywhere) but then again who knows? If "decision" is this loosey goosey then it's not clear where you get off saying anything at all about the right "decision" to make when presented with this choice, since it's all effectively a load of bullshit that depends on whatever arbitrary description you pick to use for the sake of convenience.
  11. Also, Chris's endorsement is right on the money in terms of making your kitchen look nicer, etc., but if you have less money, I suggest containers from US Plastics, which can be cheaper and more durable. They don't seem to have the ones I bought (like 5 years ago) in stock anymore, but that site is worth a look if you store lots of dry bulk goods.
  12. It's only too much if all you're looking for is profound sports-adjacent goofery! For people who either aren't looking for that, or who are looking for that but also high-concept sci-fi, it's perfect. I've only read like half of it so far but I love it.
  13. It's the difference between being good at math and physics vs. being able to travel through time. One is perfectly understandable and the other contravenes the laws of physics.

    I was merely pointing out that "it's impossible for the predictor to err" is something you're bringing to the table. That's not part of the original setup. The original setup is just that the predictor is very very good and so far has not made any mistakes.

    I think this is an indefensible view of what a "decision" consists of, and if you were forced to work out all the implications of your view you'd quickly find yourself unable to defend it. But showing this is a lot of work, because it requires providing an entire theory of what it is to decide, which is very difficult. Suffice it to say that I think on your view, it turns out people either never make any decisions, or they only ever make one decision.
  14. Nobody has ever claimed you can jump the tracks, though. Your mistake is in thinking that it's possible for the predictor to err. Perhaps the predictor cannot err. Perhaps the predictor is 100% right because it knows physical mechanics well enough to predict your choice with 100% accuracy. That's fine. That doesn't disagree with a single thing I've said. (It's not really part of the setup of the problem because it's irrelevant either way. We can imagine the predictor has 100% accuracy because of physics, or we can imagine the predictor doesn't, and they're right because they're very good at psychology, or whatever. Again, it doesn't matter.)

    My point is merely that, according to a causal decision theorist, the right choice is two boxes. Maybe it's impossible for you to make that choice, and the predictor knows it's impossible, so they put the money in the box, because they know you'll one-box. Still, you're making the wrong decision. You should two-box (if you could). You can't, of course. You're stuck making the wrong decision. As you put it, you'd have to spontaneously change the mechanical composition of your brain, which is impossible. But just because you're forced to make a decision, this is no reason to think it's the right decision. According to the causal decision theorist, it's the wrong decision. (According to the evidential decision theorist, it's the right decision! So that's good news.)

    That's not literally getting information from the future. That's getting information from the present and the past and using it to predict the future. The way to see that this is not literally getting information from the future is to imagine that the laws of physics change ten minutes from now, after you've made your prediction. Your prediction can't account for this: there's no way to know the laws of physics were going to give out! But if you were literally getting information from the future, you'd know the laws of physics were about to change.
  15. The cause of the box being full of money is not making the decision to take the box. That's literally impossible. It would require something now (the decision to take the box) to cause something in the past (the AI predicting you will take the box). Causation does not work backwards in time like that. Causation can only work forwards in time. Things right now can only cause things subsequent to now.

    But that's only because you're a one-boxer. If you were a two-boxer, you'd follow the same line of reasoning except you wouldn't put the money in the box. This has nothing to do with one-boxing vs. two-boxing. It just has to do with making a good prediction. And of course the prediction depends on what kind of person you are making a prediction about.

    This is half correct. There is a clear causal relationship between your decision now and what's going to be in the box. You, the predictor, get to pick what goes in the box. There is no causal relationship between your decision later and what's going to be in the box, unless by "your decision later" you mean "your prediction now of your decision later." Your decision later can't have any causal effect on what's happening now, because your decision later is in the future, and the future cannot causally impact the present. That would require time travel.

    Acknowledging that the machine is a reliable predictor does not require saying that the machine is beyond the realm of causation. The only time we say the machine is beyond the realm of causation is if we say it is not predicting the future on the basis of information it has right now, but rather being influenced by the future by somehow getting information from the future right now. That second thing is impossible. We cannot get information from the future right now. That would require time travel. Why is it difficult to swallow the idea that getting information from the future requires time travel? That's how time travel works: it takes things from the future and brings them backwards. Without time travel, things cannot go from the future to the past. They can only go from the past to the future.
  16. I am not sure where I suggested that this specific case is good at highlighting the relative strengths of these two decision theories. Where exactly did you get the notion that this was the goal of the case? As I've said a few times in this thread already, the point of the case is to show how the two theories differ in their answers. This is one of the reasons why, in my very first post in this thread, I said the presentation of this problem in the podcast was not very good, and that the right way to think about the problem is to think of it as highlighting the difference between the two decision theories.

    I'm not trying to be a jerk here, but it seems like you're not even reading my posts? I've been at pains to explain to you that this is not a paradox or dilemma, it's just a way to divide two decision theories up, etc. Like, you're happy to spend your time accusing philosophers of wasting our time on bullshit and semantics and so on, but when a philosopher is trying to explain to you why we're not all wasting our time, you just ignore everything he says and you substitute your own views about what's going on? Do you see how this might strike me as a little objectionable and frustrating? You're accusing me of wasting my life on worthless bullshit, and when I try to tell you I'm not, you ignore what I say and continue to claim that my entire field is full of clueless bozos or cranks or both.

    No, that's the correct decision according to evidential decision theory. The correct decision according to causal decision theory is to take both boxes. Now, you might think that really the question isn't "how many boxes should I pick" but "which decision theory should I pick." That's fine. You're more than welcome to ask that question. The question we're talking about right now is not that question. The question we're talking about right now is how many boxes to pick, and you can only answer that question once you pick a decision theory. If you pick causal decision theory, the right answer is two boxes. There is no disputing this. Everyone agrees that, given causal decision theory, you should two-box. Everyone also agrees that, given evidential decision theory, you should one-box.

    What's interesting is that the two theories diverge, and also that there's no general agreement about which decision theory to pick. If Newcomb's Box were the only decision people ever faced, it would be clear which decision theory to pick. But we face many decisions in our lives, none of which are anything like Newcomb's Box. On the basis of these other decisions, many philosophers think that we should be causal decision theorists. Because of this, these philosophers think we should pick both boxes if faced with the Newcomb's Box problem. One of the classic cases used to argue for causal as opposed to evidential decision theory is the case described here about smoking lesions.
  17. Yes, causal decision theory does look foolish in this case. That's one interesting feature of the case, because normally causal decision theory looks fine. However, there are cases that make evidential decision theory look foolish, and which make causal decision theory look much better, so overall it's not a huge deal. In effect, everyone agrees causal decision theory is a bad way to approach the problem if by that you mean this is the only decision you will ever have to make in your life, and if this is the only decision you'll ever have to make in your life, obviously you should be an evidential decision theorist. In reality, though, we make lots and lots of decisions in our lives, and in fact we never come up against the Newcomb's Box decision, so causal decision theory is not in much hot water merely because it fails to handle one (very weird, very specific) case very well.

    Mostly the interesting thing about the Newcomb's Box problem is that causal decision theory and evidential decision theory give different answers. Prior to this, one might have thought they always gave the same answers, and in fact one might not have realized that there are two ways of making decisions in the first place. So, Newcomb's Box is helpful because it helps us think about the ways that we make decisions. And yes, one small incidental feature is that it makes causal decision theory look slightly bad, because it can't handle this case well, but then again, nobody will ever face this case in their lives, so it's not a huge deal if causal decision theory can't handle it very well.

    It's like saying nobody should ever go to culinary school at the most prestigious chef university because they don't teach you how to cook pickled yak testicles, and one day someone might threaten to kill you unless you perfectly cook pickled yak testicles. It's like, well, okay, that's true, if that's the criterion then I shouldn't go to the fancy culinary school. However, this is super unlikely, so probably I'm going to make up my mind based on other stuff, like whether the culinary school is likely to get me a job. Similarly, if someday you're going to face Newcomb's Box, and also you won't have to make any other choices, then probably you should just go ahead and be an evidential decision theorist. But that's not what our lives look like, so nobody is ever really convinced to abandon causal decision theory just on the basis of its failing to handle this super weird, implausible case very well.
  18. Yes, but that is a misnomer. Also, it's not always called Newcomb's Paradox. It is more often called Newcomb's Problem.
  19. The solution is to pick the opaque box, if you're an evidential decision theorist, or to pick both boxes, if you're a causal decision theorist. The AI accounts for whatever it accounts for - it's not particularly important how it makes its decision except that however it does it, it's very good at doing it, and also the decision in this particular case (for your boxes) has already been made.

    We are not arguing over what kind of person to be while the AI makes its prediction. That is not an interesting question. Everyone has the same answer to that question. The answer to that question is to be a one-boxer. But that is not the question we are asking. The question we are asking is how many boxes to pick, once the prediction has already been made. That question has two legitimate answers: one box or two boxes. Evidential decision theorists recommend one box. Causal decision theorists recommend two boxes.

    You misunderstood my point. My point is not that effects do not have causes. My point is that causes cannot cause effects in the past. Causes can only cause effects in the future.

    Neither decision theory relies on any such belief. Your mistake is believing that causal decision theory tries to "escape causality." It does not try to escape causality. The causal decision theorist accepts that it's too late to escape causality. The box has already been determined. It either already has the money or already doesn't. It's too late to fix that. Your only choice here is how many boxes to pick: one or two. And two is the better choice. Maybe it's literally impossible for there to be money in the box if you've been a causal decision theorist in the past. That's fine. Maybe causal decision theorists are stuck always getting $1,000. They're okay with that. Causal decision theorists don't say that their view will always net them $1,001,000. Maybe their view will always net them $1,000. That's okay. It's still the right decision to pick two boxes, according to causal decision theory.

    I'm not preemptively eliminating this as a possibility; I'm speaking from experience, having spent many hours of my life studying this question in quite a bit of detail. You, meanwhile, don't even understand it yet, let alone have a particularly interesting take on the problem. I think it is a little presumptuous of you to declare an entire field of people (a field that includes me), people who are not idiots (I don't think I'm an idiot), as having wasted its time on stupid problems, mere semantics, and sloppy thinking. My claim is not that it's impossible that this has ever happened, but merely that it hasn't happened in this case, and you're going to have to do better than presenting mistaken understandings of the problem (and bad conclusions supported by these mistaken understandings) in order to convince me otherwise about this case.
  20. I know you're joking, but the point of this problem is that the causal decision theorist would say that even if the clairvoyant does react, you should still take both boxes. For whatever reason, people sometimes get angry when presented with this sort of reasoning and insist that two boxers are a bunch of idiots, etc. But they aren't, and it can be interesting to think about why it would make sense to pick both boxes even if you're given good evidence that this person is clairvoyant. If you can figure that out, you can see what the causal decision theorists are talking about.
  21. I already demonstrated why it's not a risk. Why is picking both boxes a "jackass" move? It's not a "paradox" at all. It's just an illustration of the different outcomes generated by different decision procedures. There is no "picking apart the timeline in a highly specific way rather than creating an actual strategy." Both decision procedures have an actual strategy - in fact, they have very detailed strategies that you can read up on here if you're interested. If you don't want to read up on it, please just trust me when I say there are definitely "actual strategies" out there that pick two boxes, and for good reason.

    According to evidential decision theory, yes. According to causal decision theory, the best play is to take two boxes. This is the entire point of the thought experiment: it shows us that the two decision theories come apart in certain specific cases, which is interesting.

    It's impossible to "fuck with the prediction." The prediction happened in the past: it is now immune to fuckery. No matter how badly you wish to fuck with it, it's immune to being fucked with, the same way you can't influence anything else that happened in the past. This is how causation works: stuff in the past is immune to being changed by stuff in the future. This is why the 2-boxing strategy is recommended by what's called "causal" decision theory. That decision theory is focused on what effects you could possibly cause, right here, right now. Because you can't possibly change the amount of money in the boxes, causal decision theory recommends picking both boxes, because no matter what's in them, that strategy will give you more money.

    It's not an intractable problem. It's not even a problem! It's simply a thought experiment designed to illustrate the differences between evidential and causal decision theories. Your mistake is thinking about it as a paradox or a problem or whatever. Nobody's up at night sweating whether to pick one or two boxes. The choice is obvious. What's interesting is that the obvious choice differs between these two theories. Of course, there might be a nearby intractable problem: which decision theory should I pick? That's not what we're talking about here, though. This problem is quite tractable. Simply pick your favorite decision theory and it'll tell you what to do.

    Again, this is not a "paradox," and to the extent there is any sloppy thinking going on, it is entirely in your corner. Believe me when I say that I and other philosophers don't waste our time on stupid problems, tricks of semantics, or sloppy thinking. We're not all a bunch of fucking yahoos.
  22. The problem is that causal decision theorists don't screw themselves out of anything with their bad decision-making, if by "bad decision-making" you mean picking both boxes. The only thing that can lose you the $1,000,000 is the predictor predicting that you'll pick two boxes. Your actual choice is irrelevant: it's your actions prior to the prediction that matter. If your point is just that being a causal decision theorist while the AI evaluates you is bad, that's obviously true. Nobody disagrees about that. The causal decision theorist does not say "you should be a causal decision theorist during the time the AI is observing your behavior." That would be goofy! If you think one side of this debate is doing something goofy, you're missing the point. Neither side is goofy.

    So, the non-goofy thing that the causal decision theorist is saying is that, once it's time to make the choice, you might as well pick both boxes. By that point, the prediction has already been made, and you're guaranteed to get more money if you pick both boxes. You get more money if the opaque box is empty: you get $1,000 vs. $0. You get more money if the opaque box is full: you get $1,001,000 vs. $1,000,000. So, either way, you get more money if you pick both boxes. So you should pick both boxes!

    You're not taking any "risk" by picking both boxes, according to the causal decision theorist. It's too late to risk anything by picking both boxes. The AI has already put the money in the box. Nobody can take the money out of the box. It's there for you to take. If there's no money in the box, the only risk is taking just that empty box rather than that empty box plus the $1k box. So of course you shouldn't take that risk.
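    To make that arithmetic concrete, here is a minimal Python sketch of the dominance reasoning the causal decision theorist is appealing to (the dollar amounts are the ones from the problem; the function and variable names are just illustrative):

    # Newcomb payoffs: the clear box always holds $1,000; the opaque box
    # holds $1,000,000 only if the predictor predicted you would one-box.
    CLEAR_BOX = 1_000
    OPAQUE_BOX = 1_000_000

    def payout(choice, opaque_is_full):
        """Money received for a choice, once the prediction (and hence the
        contents of the opaque box) is already fixed."""
        total = OPAQUE_BOX if opaque_is_full else 0
        if choice == "two-box":
            total += CLEAR_BOX
        return total

    # Whatever is already in the opaque box, two-boxing pays exactly $1,000 more.
    for opaque_is_full in (True, False):
        print(opaque_is_full,
              payout("one-box", opaque_is_full),
              payout("two-box", opaque_is_full))
    # True  -> 1000000 vs. 1001000
    # False -> 0 vs. 1000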
  23. The entire point of this box thing is to illustrate the difference between two decision-making procedures which have been proposed by philosophers. According to evidential decision theory, the right decision to make is the one that is based on the evidence available to you. Because the evidence suggests that the predictor is correct, you want to pick the single opaque box, because that way the correct predictor will have put $1,000,000 in the box. According to causal decision theory, the right decision to make is one that is based on what consequences your choice could have. Because it's too late for your choice to influence how much money is in the boxes, you should pick both boxes, because that way you get more money. You get more money if the opaque box is empty, because you get $1,000 vs. $0. You get more money if the opaque box is full, because you get $1,001,000 vs. $1,000,000. So either way, you should pick both boxes.

    It was not presented super well in the email sent to the podcast for various reasons. For instance, it's important to make clear to people that your goal is to make the most money possible - this screens off choices like Chris's "I'll pick one box because $1,000 is not enough to make a big deal in my life," which is a sensible choice in real life but which doesn't get at the point of the thought experiment. On the other hand, the thought experiment is really only useful for dividing evidential from causal decision theory, and that's a question basically nobody cares about, so maybe it's fine to get something like Chris's response. You can find more details here if you're interested.
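    For comparison, here is a rough sketch of the expected-value calculation the evidential decision theorist runs, assuming for illustration that the predictor is right 99% of the time (the problem itself only says the predictor is extremely reliable, so the 0.99 figure is an assumption):

    # Expected money conditional on your choice, treating your choice as
    # evidence about what the predictor already predicted.
    ACCURACY = 0.99  # illustrative assumption, not part of the original problem
    CLEAR_BOX, OPAQUE_BOX = 1_000, 1_000_000

    # If you one-box, the predictor very likely predicted that, so the
    # opaque box is very likely full.
    ev_one_box = ACCURACY * OPAQUE_BOX + (1 - ACCURACY) * 0

    # If you two-box, the predictor very likely predicted that, so the
    # opaque box is very likely empty.
    ev_two_box = ACCURACY * CLEAR_BOX + (1 - ACCURACY) * (OPAQUE_BOX + CLEAR_BOX)

    print(ev_one_box)  # roughly 990,000
    print(ev_two_box)  # roughly 11,000

    The causal decision theorist rejects this way of running the numbers because, as described above, your choice can no longer affect what is already in the box; that disagreement is the whole point of the example.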
  24. The Big FPS Playthrough MISSION COMPLETE

    It turns out the lock has been inside you all along.