mkenyon Posted September 23, 2013 Actually I'm not sure whether to go for streaming only or something that can play more demanding games by itself as well. If I built a real gaming PC I'd go for something high end, and then I definitely couldn't afford an Xbone or PS4 for a while. Anyway, haven't even gotten a TV yet. The size of the box I probably don't care that much about; the main thing is that it should be quiet. http://www.neogaf.com/forum/showthread.php?t=600246 For something that is at home next to a TV: http://www.neogaf.com/forum/showpost.php?p=65541586&postcount=3 My two cents there: if you were already thinking of buying an Xbone or PS4, you should just go for it. They'll never be more exciting than they are at the beginning, and the state of PC gaming will undoubtedly be in flux around their release. Even though they don't promise to be wildly more powerful than PCs, the moving targets of how games are developed, how much video memory they'll use, and whatnot will be decided in the next year or so, and you'll know for sure whether a $300 4GB GPU is actually a worthwhile investment or whether multithreaded CPUs are going to benefit gaming meaningfully (they haven't, despite threatening to this generation). This would probably be really great advice in the past, but this time around things are really different. The CPUs in the Xbone and PS4 are more in line with the new iPhone CPUs (kid you not) than they are with desktop processors. Even assuming 100% scaling across the 8 cores, it puts the total power just over a single core of a Sandy/Ivy/Haswell i5. Graphics, again: you can have all the VRAM in the world, but you need the ability to render said data. The APU that's in the consoles isn't anything to really write home about. This isn't like when the PS3 or 360 came out, when they were challenging the best of the best PCs at the time. That's just on a hardware level. Once you start to look at the possibilities of where SteamOS can go, it's a really exciting future.
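To show where that "single core" claim comes from, here's the back-of-the-envelope math. Every figure below is an assumption for illustration (the clock is the widely reported console number; the per-clock advantage is my guess), not a benchmark:

```python
# Back-of-the-envelope math for the "8 Jaguar cores vs. one desktop core"
# claim. Every figure here is an assumption for illustration, not a benchmark.

jaguar_cores = 8
jaguar_clock_ghz = 1.6        # widely reported console CPU clock
jaguar_ipc = 1.0              # baseline: relative per-clock throughput

haswell_clock_ghz = 3.4       # typical desktop i5 clock
haswell_ipc = 2.5             # assumed per-clock advantage over Jaguar

console_cpu = jaguar_cores * jaguar_clock_ghz * jaguar_ipc    # 12.8
one_desktop_core = haswell_clock_ghz * haswell_ipc            # 8.5

print(f"console CPU with perfect 8-way scaling: {console_cpu:.1f}")
print(f"one desktop core:                       {one_desktop_core:.1f}")
# Under these assumptions the whole console CPU lands at roughly 1.5x a
# single desktop core. The contested input is the IPC ratio, not the math.
```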
Frenetic Pony Posted September 23, 2013 Video RAM is, as of right now, very overblown. Some comparative testing of Kepler cards between the 2GB and 4GB variants is showing no difference in performance, even when the VRAM is loaded up to the max. They've only been able to do this by cranking AA levels and/or downsampling though. Low powered APU in ITX form factor being served up by a proper tower in a closet seems like the ultra-nerd way of the future. Talking about next gen man, not console ports of this generation that stick to an optimized 512MB. VRAM size doesn't do anything for performance by itself, but you need it for bigger games sitting on the PS4/Xbone. It's also much easier (read: really sort of possible, rather than fantastically difficult) to optimize for specific CPU and GPU chips than to fit more into RAM than you have. As devs optimize for the GCN (Graphics Core Next) ISA and even the specific CPUs and GPUs in the Xbone and PS4, the equivalent PC chips you need will go up (BTW, the CPUs in the PS4/Xbone are actually nearly double the equivalent speed of an iPhone 5 SoC already). So, yes, you do need video RAM, and you do need decent enough CPUs and GPUs. You don't need some MASSIVE advantage like you did at the launch of the 360. But you aren't going to get away with some cheap PC components and expect to stay future proof for very long at all.
mkenyon Posted September 23, 2013 Talking about next gen man, not console ports of this generation that stick to an optimized 512MB. VRAM size doesn't do anything for performance by itself, but you need it for bigger games sitting on the PS4/Xbone. Maybe. Any evidence we have right now is still saying no. It's not definite though. Scott Wasson, from TechReport, on the subject: I don't have any data from my latest high-res testing that we did for the 7990 review, but there were some comparisons in there where we had some 2 gig cards, like the 690, and I don't believe we ran into any problems with current games even at really high resolution, high AA, high texture quality, as much as we could crank it, on a single 27" display, and I didn't notice a single improvement from 2 gig to 3 or 4. I would say that right now, you are safe with 2 gig. Now, if you bring in the question of 4K, that is very different from driving a [1440p] monitor. At 4K, I'd say get as much memory as possible, at least 4 gigs. In terms of the question, 'would I be safe in springing for a 4 gig variant?', I think the answer is definitely yes. But I am not sure that you will really realize any benefit from it. You'd be safer, but I don't know if we are going to run into a limit. That's a tough thing to say, because it really comes down to how developers choose to use the PC. That's a guess. We're not at the point yet where games are using 4 gigs, but that doesn't mean we won't be in 2-3 years. The consoles have unified memory, and they have to use that for everything that they do. It's probably unlikely that they are going to use more than 2 gigs just for what would fill the role of display memory in a PC discrete video card, in part because the consoles are targeting 1080 resolutions, so you only need textures so big, etc. etc. But it is possible, with 8 gigs on the PS4, that they could make a very pretty game that is really smart and use more than 2 gigs for assets. It's a guess, but that is a factor.
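If you'd rather not take anyone's word for it, a quick way to check on your own NVIDIA card is to poll nvidia-smi while a game runs and record the peak. A minimal sketch, assuming Python 3 and a single GPU (the query flags are standard nvidia-smi options; the loop, interval, and duration are arbitrary choices of mine):

```python
# Poll nvidia-smi once a second while a game runs and record peak VRAM use.

import subprocess
import time

peak_mb = 0
for _ in range(300):  # sample for roughly five minutes
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"]
    )
    peak_mb = max(peak_mb, int(out.decode().split()[0]))
    time.sleep(1)

print(f"peak VRAM observed: {peak_mb} MiB")
```

If the peak never approaches your card's 2 gigs even with settings cranked, that matches Wasson's results.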
Forbin Posted September 24, 2013 My theory: O + O = Steam streaming between peers. Based on the new family sharing options recently, I think what they're going to do is allow you to use the same API that streams on your LAN between your desktop and the box to possibly stream your friends' games. I have a feeling they're going to try to replicate the way Sony and Microsoft are offering instant play of games while the content is downloaded in the background. I can't imagine how that would be anything but the slowest thing in the world, unless Valve started running an incredibly expensive array of server farms. Never mind, that's crazy.
Uncle Hank Posted September 24, 2013 O + O ...looks suspiciously like a pair of glasses, don't you think?
tegan Posted September 24, 2013 it looks like an owl hoot Gabe Newell arrives on stage swarmed by dozens of owls "Half Life 3 is real. It is turn-based, and the only means of input is writing your next action down and tying it to the leg of your Steam Messenger Owl™"
Gormongous Posted September 24, 2013 Gabe Newell arrives on stage swarmed by dozens of owls I really hope it's something like this:
jeremywc Posted September 24, 2013 I'm personally disappointed by the announcement. I've no interest in Steam OS, as my laptop (and almost undoubtedly any future laptop I get) has HDMI out already. I don't need a dedicated HTPC when I can just plug my laptop in (and when I end up getting a dedicated console at some point or another anyway), and I don't particularly want to stream games to and from it since, hey, input/output lag is still lag, even over a local network. Hey Valve, where's a game, ANY game? You do still make games, right? I don't think Valve has any interest in SteamOS being used as a general purpose OS. There are a lot of companies using the Linux kernel to handle the low level stuff like hardware, networking, file systems, etc. and then layering a customized userland on top to build a niche OS. Their strategy sounds closer to what Google has been doing with ChromeOS: building a focused experience that does a few things really well instead of building another generalized OS and competing with Microsoft / Apple.
Forbin Posted September 24, 2013 O + O ...looks suspiciously like a pair of glasses, don't you think? Hmm, interesting... Based on Carmack's last keynote, Abrash has been working on a pair of glasses other than the Oculus Rift.
Cigol Posted September 24, 2013 Hmm, interesting... Based on Carmack's last keynote, Abrash has been working on a pair of glasses other than the Oculus Rift. Must have missed that part, did he go into any detail?
Badfinger Posted September 24, 2013 Ehhh, I'd move to at least a six core Athlon, next gen games should be heavily multithreaded (8 cores man! 7 usable by games) and the big thing is video RAM, which devs are gonna eat up if it's a high end game. A GTX 760 4 gig would be much better. I'm personally disappointed by the announcement. I've no interest in Steam OS, as my laptop (and almost undoubtedly any future laptop I get) has HDMI out already. I don't need a dedicated HTPC when I can just plug my laptop in (and when I end up getting a dedicated console at some point or another anyway), and I don't particularly want to stream games to and from it since, hey, input/output lag is still lag, even over a local network. Hey Valve, where's a game, ANY game? You do still make games, right? I really cannot recommend following advice to move to more than 4 cores until there is a proven need for more than 4 cores. You are also not going to be building that $500ish sweet spot machine by overspending on processors.
Flynn Posted September 24, 2013 Must have missed that part, did he go into any detail? Mostly that Valve finally decided to back VR instead of AR, as evidenced by them firing everyone involved with the CastAR project. He said AR is great but practical options might be 10 years away, whereas VR is here now.
JonCole Posted September 24, 2013 I think it's pretty clear that specs, including multiple cores and video RAM, will matter in the next generation. That said, not every person wants to play the most cutting-edge games at the most mind-blowing graphics settings. Particularly people on this forum, I'd expect, who play low fidelity games like Gunpoint or Gone Home and appreciate a good concept and good design over brain-blasting graphics. EA is asking for a 6-core or greater AMD CPU / 4-core or greater Intel CPU + 3GB of video RAM for Battlefield 4. I'm not going to have all that when BF4 rolls around, so I'm going to buy it on PS4. Very interested to see what AMD is going to be announcing tomorrow; it will play a big part in what kind of GPU I'll upgrade to from my 2GB 7870 - http://www.anandtech.com/show/7344/amd-announces-2014-gpu-product-showcase-webcast-sept-25-3pm-edt
Frenetic Pony Posted September 24, 2013 I really cannot recommend following advice to move to more than 4 cores until there is a proven need for more than 4 cores. You are also not going to be building that $500ish sweet spot machine by overspending on processors. Err, because we know the PS4 and Xbone have 8 cores? Yeah, that was easy. And that's mostly if you're back on an AM3 socket, as Phenom stuff is pretty old. If you're on Sandy Bridge+ for Intel, a quad core should do fine, as their IPC is a heck of a lot better. I know it's not your intent, but the internet is currently filled with "hardware gurus" giving out advice on how to build computers for next gen games, and their only knowledge is a couple of benchmarks from HardOCP based on cross-generational games built with last gen consoles as the min spec, rather than, let's say, the Xbox One and PS4 in mind. I'm just hoping to give people better advice than that; as Jon Cole pointed out, all you need to do is look at Battlefield 4's recommended specs to see something close to the lines that people SHOULD be thinking along. Also, are there more announcements from Valve? Is that what those pictures are about? I know Abrash has been experimenting and even built a prototype; it's been fun interacting with all those people on his blog, and he seems like a really cool guy. But I also know one of the people that left Valve recently was complaining about how impossible it was to actually get hardware built there. And knowing "Valve time", them having a prototype doesn't mean they're going to release anything soon, or ever. One can hope though.
mkenyon Posted September 24, 2013 I know it's not your intent, but the internet is currently filled with "hardware gurus" giving out advice on how to build computers for next gen games, and their only knowledge is a couple of benchmarks from HardOCP based on cross-generational games built with last gen consoles as the min spec, rather than, let's say, the Xbox One and PS4 in mind. I'm just hoping to give people better advice than that; as Jon Cole pointed out, all you need to do is look at Battlefield 4's recommended specs to see something close to the lines that people SHOULD be thinking along. This is way off topic, but it's rather disingenuous to say that you have the answers, base them on speculation, and then deride people who have empirical evidence to back it up. I get what you're saying, as things may change, but I find the conclusions a bit simplistic at best. "Those CPUs have 8 cores, therefore having more cores is good!" When the total power of said CPUs is more in the realm of a single Intel Sandy/Ivy/Haswell core, it feels like jumping to conclusions. I know it's hard to say, but "I don't know" is the only real recommendation we can give at this point.
Twig Posted September 24, 2013 Abrash started out working on augmented reality (like Google Glass type stuff), I think, before making the move to VR with the Rift. Did he actually build a prototype? They've been using the Rift for all their VR stuff, as far as I know. Or was the prototype for his AR work? It has been a while since I read his blog, but it looks like it's all about the challenges of VR. Nothing new since July. Jeri Ellsworth, who was one of those fired/let-go/whatever during the big hoo-ha... she was working on augmented reality before being fired. She's the one, I think, that Frenetic Pony is referring to, complaining about how hard it was to make hardware. She's since taken what she was working on at Valve (with Valve's permission) and started her own company. (Found the article I remember reading that in: http://www.wired.com/gamelife/2013/07/wireduk-valve-jeri-ellsworth/)
JonCole Posted September 24, 2013 Badfinger's budget point is still sound, however. I'm speccing out a new computer for my girlfriend because the amalgamation of parts from my old computers is proving to be unstable, and I just don't like the idea of her being the hand-me-down recipient. I don't have $1000 to blow on it. She also doesn't play most first-person games or shooters, mostly stuff like Guild Wars 2, XCOM, Civ, etc. So I'm going to make her a computer that does what she needs it to do for the near future, and I'll update it as needed. I am at least getting her a 2GB 7850, which should be adequate far longer than something marginally cheaper with a 128-bit bus and 1GB of video RAM. Not everything needs to be wildly futureproof, it just needs to be future-wary. Otherwise, only rich people would play PC games.
Frenetic Pony Posted September 24, 2013 This is way off topic, but it's rather disingenuous to say that you have the answers, base them on speculation, and then deride people who have empirical evidence to back it up. I get what you're saying, as things may change, but I find the conclusions a bit simplistic at best. "Those CPUs have 8 cores, therefore having more cores is good!" When the total power of said CPUs is more in the realm of a single Intel Sandy/Ivy/Haswell core, it feels like jumping to conclusions. I know it's hard to say, but "I don't know" is the only real recommendation we can give at this point. I'm a graphics programmer as a hobby who talks with industry pros, so I'm pretty sure I know what I'm talking about. I wasn't trying to be disingenuous, but somehow I always end up sounding like it. I just really enjoy learning about... well, everything. Including chip design and game programming. It's just sort of what I do for fun. Digging up "references" to support my arguments, when they're just built on years of knowledge and experience, is always rather hard. If you do want references, I can point to one of Crytek's primary graphics researchers being rather excited about 8 gigs of memory in the new consoles (hell, he was one of the guys that pushed for that much in the first place). Or just look at how games are made to begin with: more memory is not so much a "problem" as an "oh thank gawd, we don't have to spend months and months optimizing, and our artists won't get mad at us for compressing their beautiful work into mud anymore". I.e., I know what the specs of the PS4 and Xbone are. I know what programmers and artists do with consoles. I know what generally changes over time with respect to how games use consoles. So while I'm still guessing, it's a highly educated guess. It will be interesting to see what "minimum" specs are. That's a very different argument from "recommended" specs, as scalability becomes easier the higher up the visual fidelity ladder you go. But for recommended specs, a lot of video RAM is good, and something more than a Radeon 7770 is good (as that's basically what the PS4 already has, except with a 256-bit bus). More cores are good IF you are on an older platform, because games will be able to make use of them: the new consoles have 8 cores, and thus games are going to be able to use up to 7 logical threads at a time (there are cores reserved for the OS now). In the end I just want to give people correct advice so they don't end up with computers not as good as they could have been. I don't even particularly want to "argue" anything here, just give the best advice I know.
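To make the "7 of 8 cores" point concrete, here's a toy sketch of that sizing logic on PC. The chunk function and names are mine as placeholders; real engines use native job systems, this just shows a worker pool sized to leave a core free:

```python
# Toy version of "leave a core for the OS" worker sizing.

import os
from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(chunk_id: int) -> int:
    # placeholder for per-frame jobs (physics islands, culling, decode...)
    return sum(i * i for i in range(100_000))

if __name__ == "__main__":
    # e.g. 8 logical cores -> 7 workers, mirroring the console split
    workers = max(1, (os.cpu_count() or 4) - 1)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(simulate_chunk, range(workers)))
    print(f"ran {len(results)} chunks on {workers} workers")
```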
mkenyon Posted September 24, 2013 I feel ya, and I don't think you are incorrect. I think it's just too early to know for sure. Heck, will any of the initial games even be built on new engines? I, for one, cannot wait for the release of whatever UE4 game comes out, just so it can start to answer the question of what to recommend to people!
Frenetic Pony Posted September 24, 2013 I feel ya, and I don't think you are incorrect. I think it's just too early to know for sure. Heck, will any of the initial games even be built on new engines? I, for one, cannot wait for the release of whatever UE4 game comes out, just so it can start to answer the question of what to recommend to people! UE4 looks great! I want to know what they did for their Infiltrator demo, as the reflections look great and possibly realtime, and I know they dropped their voxel cone tracing for it. Maybe next GDC they'll say. I wish other next gen games had their priorities. I wonder what Fable: Legends looks like. It's using UE4 and is apparently basically "Fable 4: The MMO" despite what they "revealed" at E3, which was like nothing. I guess they were just forced to say "Something Fable!" because of the Xbone launch. But as a huge Fable fan, I think they shouldn't have said anything at all, because I got the impression it was primarily some weird tower defense thing rather than anything actually Fable at all. Wow, that was off topic.
darthbator Posted September 25, 2013 I mean, in 2 more months we'll have super concrete details about how PC games perform versus the new consoles. It should be fairly obvious that now is a BAD time to upgrade. IMO it's usually good practice to wait for the GPU release cycle AFTER the consoles come out, so you can be sure you grab yourself something that is a good deal faster than what is in the consoles without totally blowing the bank.
Forbin Posted September 25, 2013 Carmack also mentioned that Abrash had built a VR prototype using Samsung(?) OLED screens from two phones, and it had shown him that some element of those screens was important. I don't think it was refresh rate, but I think I remember him mentioning something about persistence. Anyway, it sounded like they were getting more serious about working through some of the VR problems. Edit: I think it was judder. http://blogs.valvesoftware.com/abrash/down-the-vr-rabbit-hole-fixing-judder/
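The gist of the math in that post: on a full-persistence display, a pixel your eye is tracking during a head turn smears across the retina for as long as it stays lit. A toy calculation with illustrative numbers (mine, not Abrash's exact figures):

```python
# Toy judder/persistence math: a pixel tracked by the eye during a head turn
# smears for as long as it stays lit. Numbers are illustrative assumptions.

head_speed_deg_s = 120.0    # assumed moderate head turn
persistence_s = 1.0 / 60.0  # full persistence at 60 Hz
pixels_per_degree = 15.0    # rough HMD panel density, assumed

smear_deg = head_speed_deg_s * persistence_s    # 2.0 degrees
smear_px = smear_deg * pixels_per_degree        # ~30 pixels
print(f"smear: {smear_deg:.1f} deg, ~{smear_px:.0f} px")
# Cutting lit time to ~2 ms (low persistence) shrinks the smear roughly 8x,
# which is why the persistence of those OLED panels mattered so much.
```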
Badfinger Posted September 25, 2013 I guess my biggest thrust is that the first rule of computer building is you can't future proof, and you are throwing good money after bad if you attempt it. The second rule of computer building is also that you can't future proof. If games crush it with an 8 core processor then go hog wild, but games released this year have run extremely well on hardware from a year to 18 months ago. I'm still rolling strong on an i5. We're right at the beginning of the transition from 8 year old hardware to shiny new hardware. Also, I am excited for this afternoon! Tell me about the boxes, Gabe.