Ninja Dodo

Technoholism


An interesting article on Gamasutra:

http://www.gamasutra.com/features/20041229/kelly_01.shtml

Technology is great, isn't it?

In our industry, technology has driven massive progress and financial returns, the likes of which were undreamt of 20 years ago. From rumble packs and EyeToys to 3dfx cards and analog pads, we are blessed with the fruits of a golden age of, well, stuff. We have come to rely on technology. We await new formats, new capabilities and new toys with something approaching high passion.

But there is a dark side to techno-love. We have come on leaps and bounds in hardware, but the software lags far behind. The industry, especially in the console and handheld sectors, remains immature and hardware-led. Games are now much more expensive to develop. Geometric cost rises are not matched by equally geometric rises in sales. Creative game content is not evolving, and release schedules increasingly carry the whiff of déjà vu.

The industry is becoming more the equivalent of an electronic comics industry rather than the usurper of film that it aspires to be. Technology, bizarrely, is now part of the problem rather than the solution. We rely on greater and greater hits, trying to achieve better-than-photo realism through ever-more complicated graphics technology, massive physics systems, and ever more complicated consoles. Yet while we seek these new electro-thrills, we seem increasingly oblivious to our own atrophy, to the arguably failing state of the games we produce, and to our own future.

We are technoholics.

What are we addicted to?

1. We are addicted to new toys. Who doesn't want to play with new hardware and software?

2. We're addicted to a business model that values hardware over software.

3. We are addicted to trusting that new technology somehow magically translates into new money.

4. Mostly, we're addicted to the ability of new technology to anaesthetize fear of the future. Technology has become the catch-all solution.

Technoholism is a downward spiral. The rate of return for increasing investment thins from one generation to the next. The faster turnover cycle between console generations leads to market fragmentation. The pay-offs in terms of the games themselves are now lackluster. The media and the customers are increasingly cynical about over-inflated promises.

Costs are spiraling. Most developers are not having much success in bringing costs under control, and they know that the situation is only going to get worse. In the UK, over a hundred development studios, big and small, have gone out of business in the last three years. The next round will cull nearly everyone else.

Yet all is not lost. Addiction can be beaten and sanity restored, if we learn to recognize the problem, admit that there is a problem, and then take steps to re-affirm the decision to mend our ways. So, throwing taste and caution to the wind, how better to solve our problems than with a 12-step program?

(...)


I am going to read this*. But not now. Tomorrow, rather.

(*the full Gamasutra article. This excerpt was quickly perused.)


Hail to the fellow sutrite.

I think the article is a bit off, especially on:

-Keeping people off PS2 projects, and having them do PSOne games while the market passes them by;

-This line: "Let us imagine that Sony, Microsoft, Nintendo, Nvidia, Intel, and everyone else decided tomorrow that they would never develop another iota of hardware again" (If this happened, BetterStuff Inc would release some piece of kit and dazzle the punters, take the money);

-Declaring music and film to be software-reliant rather than hardware focused is totally off.

-"We decide not to buy into the next round until it is proved to be profitable". It can't be proven to be profitable until someone buys into it.

-That Grim Fandango did well (alluded to in the same breath as Half-Life), and didn't need marketing.


I quite liked the article ... many points made LOTS of sense.

As a programmer, the bit about having Sony, Microsoft & Nintendo somewhat "unite" is definitely a good thing. Now I know they're in competition and all, but at least if they all had a single, unified API that came in a single DevKit, life would be easier for your standard programmer. AND games would always be on all 4 systems (PC included), since a "port" would be a recompile away ...

... Now each system will have its own unique commands (like the 'Cube having GBA connectivity), but if 95% of the syntax was the same, it'd make developers' lives far easier and make game creation far cheaper.
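
Something like the sketch below, roughly. It's only an illustration in C++ with made-up names (not calls from any real console SDK): the game only ever talks to the shared interface, and each platform supplies its own implementation behind a compile-time switch.

```cpp
// Hypothetical sketch of a unified API: game code talks to one interface,
// and each console/PC build supplies its own implementation behind it.
// All names here are invented for illustration -- not a real SDK.
#include <cstdio>
#include <memory>

// The "95% that's the same": a common interface the game is written against.
class GfxDevice {
public:
    virtual ~GfxDevice() = default;
    virtual void clear(float r, float g, float b) = 0;
    virtual void drawTriangles(const float* verts, int count) = 0;
    virtual void present() = 0;
};

// The "5% that's unique": one implementation per platform, chosen at compile time.
#if defined(PLATFORM_GC)
class GcDevice : public GfxDevice {
public:
    void clear(float, float, float) override { /* GameCube-specific calls */ }
    void drawTriangles(const float*, int) override { /* ... */ }
    void present() override { /* ... */ }
};
using PlatformDevice = GcDevice;
#else
class PcDevice : public GfxDevice {
public:
    void clear(float, float, float) override { std::puts("clear (PC stub)"); }
    void drawTriangles(const float*, int count) override { std::printf("draw %d tris (PC stub)\n", count); }
    void present() override { std::puts("present (PC stub)"); }
};
using PlatformDevice = PcDevice;
#endif

// Game code never mentions a platform, so a "port" is just a recompile
// with a different PLATFORM_* define.
int main() {
    std::unique_ptr<GfxDevice> gfx = std::make_unique<PlatformDevice>();
    const float tri[] = {0.f, 0.f,  1.f, 0.f,  0.f, 1.f};
    gfx->clear(0.f, 0.f, 0.f);
    gfx->drawTriangles(tri, 1);
    gfx->present();
    return 0;
}
```

In practice you'd probably hide the per-platform choice in the build system rather than in #ifdefs, but the idea is the same.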

SiN

-Declaring music and film to be software-reliant rather than hardware focused is totally off.

Maybe it's not utterly and completely true, but I think it's generally much more so than in games.

-Declaring music and film to be software-reliant rather than hardware focused is totally off.
How so? It's not like I have to get a new CD player every 3 years to continue listening to music. VHS was around a loooooong time before gradually being replaced by DVD, and it still isn't dead. Film hasn't changed in decades in ways that movie-goers would care greatly about. (Though digital cinemas are changing that.)

Maybe the guy put it pretty strongly but I think it's a fair point. The platforms for music and film are a lot more stable.

How so?

In the article, he's talking about the creation of the music or film, not just the media it comes on (He talks about Fender Guitars vs a fictional "SuperObo").

Film pushes hardware just as much as games do; for example, Weta Digital (of Lord of the Rings fame) constructed a massive parallel computing system for its film graphics, one that ranked among the bigger supercomputers.

In music we've always had a plethora of hardware-based tools (ProTools, hardware Cubase and so on) which have only in the last 8 or so years been made fully functional in software form for PC etc.

Even in instruments (continuing with the guitar example) you have a great deal of advancement in pickup technologies, manufacturing processes, microphone design, etc.

The difference might be because other fields of creation don't figure Moore's Law so centrally in their advancement of hardware.

As for a common API, this would be a great thing, especially as hardware power runs so far ahead of the demands games place on it (in the next gen of consoles, and today on PC) that the overhead isn't going to be much of a problem anymore. Microsoft are working their next-generation DirectX 10 (I think it's got a different name now) into the Xbox 2, with an eye towards a common hardware interface layer for a number of consoles. Whether anyone else will take to it is a different matter. How many programmer hours are lost porting games to all the consoles? You'd probably lose 4 guys to porting on a project for 3-console coverage (EA has their own way of dealing with this :tdown:)

The difference might be because other fields of creation don't figure Moore's Law so centrally in their advancement of hardware.
Ah well, if it's creation we're talking about, I think the difference is what you just wrote, but more so the fact that you don't need the latest version of ProTools to make good and/or marketable music, or the largest supercomputer ever to make a successful movie.

To give an extreme example, The Blair Witch Project outsold slews of CGI-heavy films. The music industry is not driven by having the latest hardware either (pop music is mostly driven by marketing, and the 'underground' is driven by quality and/or hip-ness).

I have yet to read the article in question though :grin:


Maybe it's a fundamental difference between games and film that drives wonderful things like Eternal Sunshine, Donnie Darko and Blair Witch to great successes, while our own darlings Grim Fandango, ICO and BG&E don't make it so far.

Hopefully we'll see a return of the coding underground in the next 5 years. I'd love to see an end to games publishers as bandwidth cost converges on zero.


There are two other articles on Gamasutra worth reading too: an interview with Jordan Mechner, and Ernest Adams' latest. The technophile argument is only valid because of the relative youth of the games industry and is something that will work itself out, imo.

"Sometimes science advances faster than it should." - Jenny Holzer, artist, from her Truisms series

This is about markets though, not science.

The science is already there.

I could say a lot against those "truisms" too, but that's a different matter.


I haven't read the article (yet), but I think it would be great if software made more advances instead of relying on constant hardware upgrades.

Maybe the trend will change towards that when common processor speeds are about 10-20GHz, and likewise with the 3D graphics cards.

In any case, standards and reusability in software are becoming more and more important as software (and hardware) gets more complex. I have great hopes that the open source community will develop more and more well-designed reusable components that could be used even in commercial games. For example, S.T.A.L.K.E.R. is already using ODE (the Open Dynamics Engine, I think), an open source physics engine. If I'm not mistaken.
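
To give an idea of how lightweight that kind of reuse can be, here is a minimal sketch against ODE's C API: one rigid body falling under gravity, no collision geometry, assuming ODE is installed and linked.

```cpp
// Minimal ODE (Open Dynamics Engine) sketch: one rigid body falling under
// gravity, stepped for one simulated second at 60Hz. No collision detection,
// just the rigid-body integrator.
#include <ode/ode.h>
#include <cstdio>

int main() {
    dInitODE();
    dWorldID world = dWorldCreate();
    dWorldSetGravity(world, 0, 0, -9.81);

    dBodyID body = dBodyCreate(world);
    dMass mass;
    dMassSetSphere(&mass, 1.0, 0.5);   // density 1.0, radius 0.5
    dBodySetMass(body, &mass);
    dBodySetPosition(body, 0, 0, 10);  // start 10 units up

    for (int i = 0; i < 60; ++i) {
        dWorldStep(world, 1.0 / 60.0); // advance one 60Hz frame
        const dReal* p = dBodyGetPosition(body);
        std::printf("t=%.2fs  z=%.3f\n", (i + 1) / 60.0, (double)p[2]);
    }

    dWorldDestroy(world);
    dCloseODE();
    return 0;
}
```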

When the 3D, physics, sound etc. engines reach a certain maturity, the focus will turn to level editors and the like. Eventually, and I dare say not more than 3-5 years from now, there will be free open source engine(s) with the same ease of editing as the Unreal engine games, and with graphics on the same level as is possible in commercial games of the same time.

Maybe the trend will change towards that when common processor speeds are about 10-20GHz, and likewise with the 3D graphics cards.

No we won't, because I'm pretty sure people were saying the exact same thing 10 years ago, except they must have said "when the common processor speeds are about 1-2GHz" instead ... haven't seen any changes, have we?

Eventually though, you're right, the focus will change, but it'll be independent of how hardware evolves.

SiN


The hardware development process has limits. It can't continue improving at the same rate forever. I don't have the theoretical limits in numbers right now; the 10-20GHz was just a wild guess.

But this quest for photo-realism that is driving games' demands on hardware will have to slow down a lot at some point. And if you compare the graphics of now with what we had at 100-200MHz, you could think that 10-20GHz (10GB of memory, 2.5GB of graphics memory) would allow for "photorealistic enough". Or maybe not.

(not that I personally care about photorealism that much)


Film pushes hardware just as much as games do; for example, Weta Digital (of Lord of the Rings fame) constructed a massive parallel computing system for its film graphics, one that ranked among the bigger supercomputers.

The film industry has evolved to the point where, with straightforward technology that hasn't changed that much over the decades (CGI excepted), it can create grand visions.

CGI is a small part of cinema which in many cases doesn't even factor into it at all. At the end of the day, if you light something and point a camera at it, you've got film. You don't first need to upgrade to Camera 9.0c to ensure it runs on current projectors.

So you can nitpick all you want but the point of the article is that the games industry needs limitations to be creative with. Tool stability is essential to experimentation.


Right. Most movies that aren't CGI-heavy are still certainly going to be using more advanced technology than was used decades ago, but it's fairly transparent from a development and audience perspective. Fine, film stock improves, and there are better computer-assisted camera systems, but that technology is taken care of at a level pretty much removed from the actual director and screenwriter and actors and so forth, by companies like Kodak. In games, the development team has to be constantly creating the technology itself; sure, we have middleware like the Criterion-developed-but-now-EA-owned RenderWare, but teams still have to worry about squeezing as much as they can out of hardware and building better algorithms so that their game doesn't already look dated when it comes out. It's even worse with PC gaming, where developers also have to spend time and effort worrying about the thousands of different possible combinations of hardware players will be using.

It's not a system that's really got creative effort at its center. There are designers who obviously do make creativity the centerpiece of their contribution to the game, but they still have to go through all that other shit first. I think when (if) that situation ever changes, we'll start seeing better games.


I thought this was somewhat relevant and I like digging up old threads, so...

The free lunch is over.

This talks about how writing software is going to get more difficult, because processor manufacturers can't simply keep raising clock speed, transistor count and cache size at the same rate; software will instead have to be made faster through different methods (hyperthreading etc.). There's a rough sketch of what that means for code after the quotes below.

Obstacles, and Why You Don’t Have 10GHz Today

[Figure 1: Intel CPU Introductions (sources: Intel, Wikipedia)]

CPU performance growth as we have known it hit a wall two years ago. Most people have only recently started to notice.

....

What’s the clock speed on the CPU(s) in your current workstation? Are you running at 10GHz? On Intel chips, we reached 2GHz a long time ago (August 2001), and according to CPU trends before 2003, now in early 2005 we should have the first 10GHz Pentium-family chips. A quick look around shows that, well, actually, we don’t. What’s more, such chips are not even on the horizon—we have no good idea at all about when we might see them appear.

Well, then, what about 4GHz? We’re at 3.4GHz already—surely 4GHz can’t be far away? Alas, even 4GHz seems to be remote indeed. In mid-2004, as you probably know, Intel first delayed its planned introduction of a 4GHz chip until 2005, and then in fall 2004 it officially abandoned its 4GHz plans entirely. As of this writing, Intel is planning to ramp up a little further to 3.73GHz in early 2005......

But this doesn't mean Moore's Law is over yet:

Does this mean Moore’s Law is over? Interestingly, the answer in general seems to be no. Of course, like all exponential progressions, Moore’s Law must end someday, but it does not seem to be in danger for a few more years yet. Despite the wall that chip engineers have hit in juicing up raw clock cycles, transistor counts continue to explode and it seems CPUs will continue to follow Moore’s Law-like throughput gains for some years to come.
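
To make the "different methods" concrete: the gains have to come from spreading work across threads and cores yourself, instead of waiting for a faster clock. A toy sketch follows (the exact threading API doesn't matter; this one happens to use the C++ standard thread library) that splits a big summation across however many cores the machine reports:

```cpp
// Toy illustration of the "concurrency instead of clock speed" point:
// summing a large array by splitting it across the available cores.
// Just a sketch with the standard library; real engines use job systems.
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<double> data(10'000'000, 1.0);
    unsigned n = std::max(1u, std::thread::hardware_concurrency());

    std::vector<double> partial(n, 0.0);   // one slot per worker, no sharing
    std::vector<std::thread> workers;
    size_t chunk = data.size() / n;

    for (unsigned t = 0; t < n; ++t) {
        size_t begin = t * chunk;
        size_t end = (t == n - 1) ? data.size() : begin + chunk;
        workers.emplace_back([&, t, begin, end] {
            partial[t] = std::accumulate(data.begin() + begin, data.begin() + end, 0.0);
        });
    }
    for (auto& w : workers) w.join();

    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::printf("%u threads, total = %.0f\n", n, total);
    return 0;
}
```

The work has to be divided up explicitly like this, which is exactly the article's point about the free lunch being over.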


We're already at a point where the power of the hardware is outstripping the complexity of the software we're writing for it; we're just not that good. The "next big change" in how we do stuff is most likely going to be the use of more automated content creation (much more texture generation, organic-style model generation, e.g. "growing" trees; this is being used already, but only at the development stage). As more and more hardware is made free (Moore's Law is exponential, coding complexity is probably nearer to linear), more of it is going to be used for powerful world-generation stuff that was previously prohibitively wasteful. More powerful processors won't be market-viable until the automation tools are at a level where you could make a top-tier "blockbuster" title with them; the killer app to sell the 10GHz processor.
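
To make the "growing trees" part concrete, the classic trick is an L-system: a couple of rewrite rules that expand a short seed string into arbitrarily detailed branching structure, which a tool then turns into geometry. A toy sketch, using the textbook bracketed-plant rule rather than anything production-grade:

```cpp
// Toy example of the "grown, not hand-made" idea: an L-system, the classic
// way to generate branching plant shapes from a couple of rewrite rules.
// We only print the symbol string that a turtle/renderer would turn into geometry.
#include <cstdio>
#include <map>
#include <string>

int main() {
    // Symbols: F = grow a segment, [ ] = push/pop a branch, + - = turn.
    std::map<char, std::string> rules = {
        {'F', "FF-[-F+F+F]+[+F-F-F]"}   // well-known bushy-plant rule
    };
    std::string plant = "F";            // the "seed"

    for (int generation = 0; generation < 3; ++generation) {
        std::string next;
        for (char c : plant) {
            auto it = rules.find(c);
            next += (it != rules.end()) ? it->second : std::string(1, c);
        }
        plant = next;
        std::printf("generation %d: %zu symbols\n", generation + 1, plant.size());
    }
    // A renderer would now walk 'plant' to emit branch geometry.
    return 0;
}
```

A few dozen bytes of rules standing in for what would otherwise be hand-modelled assets is the whole appeal.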


Yeah, I really had to wait a loooong time till something came up that made me want to upgrade my system (HL2 and Doom 3... oh, and Far Cry), so I think that is a real problem and it's getting worse! I know that we just witnessed an awesome Christmas in terms of game releases, but still the question remains: what is going to be the next awesome game to push the envelope? I don't really see it. Any ideas?


I'm not! Where am I discussing upgrades? I am talking about technology racing seemingly blindly ahead, when the only thing on the horizon is the next title from id Software or the next Unreal engine (3). So the question for me is why I would even get all the stuff if it will take another 2 years (ok, probably less) till you get all the performance out of the hardware. My point being: is this really efficient, or is this just consumption-driven market behavior at its worst?
