Lama Himself

Metacritic - Number objectivity?


In the latest Edge (issue 199), there is an interesting article about Metacritic.

I was surprised to learn that Metacritic has an editor, and just one. I always thought the website was based on a simple formula (an average, or something fancier) that would take content (in this case reviews) and generate an overall score through the objectivity of numbers (or in some cases the objectivity of the masses).

Well, the site does work roughly like that, except that the Metacritic editor selects which magazines and websites are taken into account, and also decides how much weight each publication carries based on his own opinion. Basically, Edge will carry more weight than a fanboy website. I'm fine with the selection, and with publications carrying more or less weight, but I'm puzzled that a single editor gets to decide. Doesn't that undermine the whole system?
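The scheme described above amounts to a weighted average, though Metacritic's actual outlets and weights are not public. A minimal sketch with invented outlets and weights:

```python
# Hypothetical sketch only: Metacritic's real formula and weights are secret.
# Outlet names, scores, and weights below are invented for illustration.
reviews = {
    "Edge": (8.0, 1.5),          # (score out of 10, editor-assigned weight)
    "Big Magazine": (9.0, 1.0),
    "Fanboy Site": (10.0, 0.5),
}

def weighted_score(reviews):
    """Weighted mean of the scores, rescaled to a 0-100 range."""
    total_weight = sum(w for _, w in reviews.values())
    weighted_sum = sum(score * w for score, w in reviews.values())
    return round(10 * weighted_sum / total_weight)

print(weighted_score(reviews))  # prints 87; the plain average would give 90
```

The point of the sketch: the editor's weights, not the outlets themselves, decide how much each review moves the final number, which is why a single person's weighting choices matter so much.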

Of course, the article also talks about the Metacritic bonus that some publishers use as a carrot for developers.

Braben argues that the real measure of success is sales, and that royalties already exist to reward developers for successful games. So why rely on another system like Metacritic?

Well, in fact a publisher has already offered us this kind of bonus in the past. And I would have been quite happy to accept it if the project had been greenlit by marketing (damn marketing!!!). Let's face it, most developers don't get royalties. But I think the point they miss is mainly branding: a publisher gives a Metacritic bonus to help the title, but also their brand. Even if the game is a commercial failure, good reviews are always good for a publisher.

It can seem strange to ask developers to make good games, but I see more and more developers who don't care about quality. Their goal is just to finish a production on time and within budget, that's it. So in those cases a small quality bonus is welcome.


> Well, the site does work roughly like that, except that the Metacritic editor selects which magazines and websites are taken into account, and also decides how much weight each publication carries based on his own opinion. Basically, Edge will carry more weight than a fanboy website. I'm fine with the selection, and with publications carrying more or less weight, but I'm puzzled that a single editor gets to decide. Doesn't that undermine the whole system?

They do that for movies and music as well, and the confusing thing is that you, as a reader, have no clue which sources they deem more "worthy" than others. Metacritic scores turn out to be more the editor's opinion than an exact science. This is even more true of other iffy decisions made on the site. For instance, I'll commonly see a review explicitly stated to cover one version of a game, yet the score is counted for all platforms. Even worse, media companies sometimes re-use content (like 1UP/EGM or PC Gamer UK/US), which results in Metacritic counting the exact same review twice.

In addition to the above problems, I don't like the idea of providing bonuses based on Metacritic scores, because sometimes the quality of a game is out of a developer's hands. The publisher usually determines the budget, resources, and development time, and sometimes even major gameplay or creative decisions. If, for instance, the publisher insists that a developer make their game more generic and mainstream, and then forces them to release it early for their Q4 results, I think it's even more unfair to then dock their income because reviews were less than favourable. Not to mention that reviews may miss a certain mark due to reviewer preferences rather than any particular aspect of quality.


OK, so that makes Metacritic even more irrelevant than it already was.


I read the article the other day; it was an enlightening read. I've never trusted Metacritic due to the clandestine way it weights a review's worth, which, as mentioned, is based entirely on the opinion of just one person (which runs counter to what the site is trying to achieve). It's a shame that publishers put so much emphasis on the results of such an unvalidated formula, and it's frightening to think how many times Metacritic pops up in PowerPoint presentations in meetings at various developers and publishers.

Off topic...

Marek, are you the same Marek from Adventuregamer(s).com?


I'll apologise in advance then... I used to work for Telefragged :) Just as a writer, though. I only realised their dealings were dodgy when they locked me out of all the sites I updated and didn't pay me my last two months of wages.


Hah, wow. :) Yeah, that sounds like TeleFragged alright. I got horribly fucked over by them back in the day. Eventually they hijacked AG's original domain name by hacking into our registrar's account, then sold the domain to a games publisher for loads of cash. Fun times. :tup: I guess we're fellow victims then.

I could post my opinion in this thread or I could just link to an old editorial... I think I'll be lazy:

http://www.idlethumbs.net/display.php?id=75

It sums up my feelings about it pretty well.

As a consumer, I'm still curious to check the scores of hyped games, but the system isn't reliable enough to be used within the industry.

> In addition to the above problems, I don't like the idea of providing bonuses based on Metacritic scores because sometimes the quality of a game is out of a developer's hands.

Well, the developer has more influence on the quality of the game than on its sales, which are often driven mainly by advertising and retailer relations. I know some developers who involve themselves in the buzz around their game, but usually that is 100% handled by the publisher.

And in theory, a bonus doesn't hurt anybody.

> ...only realised their dealings were dodgy when they locked me out of all the sites I updated and didn't pay me my last two months of wages.

> Yeah, that sounds like TeleFragged alright. I got horribly fucked over by them back in the day. Eventually they hijacked AG's original domain name by hacking into our registrar's account, then sold the domain to a games publisher for loads of cash.

This sounds interesting. Is there any general write up about why they came to be known as dodgy? Or is it just industry canon?


Actually, in the long run, Metacritic is only 1.2 points more conservative on average than GameRankings. In 2008, 19% of their ratings were affected by their weighting system, compared to about 70% in 2001, so things are much tighter now.

The thing is, we may not agree with the industry's choice of scoring index, but they do need one. If they just used sales numbers as a marker of quality instead of a scoring index, I think we'd be even more disappointed.

If you want some more insight here, check GameQuarry.com and look for the video report on their weighting system. You'll also find a video report comparing Metacritic to GameRankings, and an article about how the video game industry needs a new scoring index and what that might look like.

I'm an industry scoring analyst, and I really don't mind which index is used, as my job is the same either way. And yes, I'm one of those guys creating PowerPoint presentations for the industry. I have issues with it too, so I'm not trying to change anyone's mind here, just providing some context.

Here are some links if you are interested:

Comparing Metacritic to GameRankings

Uncovering Metacritic's Weighting System

An Alternative Scoring Index for the Gaming Industry

Cheers,

Tim


> This sounds interesting. Is there any general write up about why they came to be known as dodgy? Or is it just industry canon?

Not that I know of, just the odd forum post by people they screwed over. It was basically run by four guys who had no business sense, one of whom was a bit of a megalomaniac... and a Linux fan. The unfortunate thing is they're still going, under the name Atomic Gamer, and continue to host official sites for id and Raven.

Anyway, not to go off topic. Thanks for posting those articles, tsweez; they were an interesting read/watch... and very thorough! :)


> The thing is, we may not agree with the industry's choice of scoring index, but they do need one. If they just used sales numbers as a marker of quality instead of a scoring index, I think we'd be even more disappointed.

OR, they could accept that it is impossible to judge games quantifiably, and use the qualitative judgments of good critics instead.

But business people are even worse than scientists when it comes to demanding a number for everything :tdown:

> Actually, in the long run, Metacritic is only 1.2 points more conservative on average than GameRankings. In 2008, 19% of their ratings were affected by their weighting system, compared to about 70% in 2001, so things are much tighter now.

Wait—this isn't a joke? Even if things were tighter, should Metacritic scores remain as important as they are now?

I understand that, in business, "what gets measured, gets done," but there must be better ways for publishers and developers to discover and compare the critical success of their games, ways that do not misinterpret or oversimplify. High Metacritic scores should not be confused for, nor accepted as, critical success (link to further ranting on measures and "key performance indicators").


I completely agree: the tighter the better. The point I'm trying to make is that if the industry wants to use an existing index, Metacritic is just fine, as the point spread is negligible overall.

The challenge for those of us who try to track trends over time is that MC's weighting system has changed dramatically (for the better). Still, it's probably not appropriate to use the weighted numbers for a historical analysis, as you'd be comparing apples and oranges. The highest deviation between the MC score and the actual average of the media outlets is fourteen points, but that was years ago. You can imagine how plotting genre, platform, developer, ESRB, or publisher performance could get wildly skewed.
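To see where that deviation comes from, here is a sketch with invented numbers comparing the plain average of a set of outlet scores against a weighted version of the same scores; the gap between the two figures is the kind of skew a historical analysis would pick up:

```python
# Invented example: the same four review scores aggregated two ways.
# Mixing these two figures in one historical analysis compares apples and oranges.
scores  = [90, 85, 60, 55]       # hypothetical outlet scores
weights = [2.0, 1.5, 1.0, 0.5]   # hypothetical editor-assigned weights

plain_avg    = sum(scores) / len(scores)
weighted_avg = sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# prints "plain=72.5 weighted=79.0 deviation=6.5"
print(f"plain={plain_avg:.1f} weighted={weighted_avg:.1f} "
      f"deviation={abs(plain_avg - weighted_avg):.1f}")
```

With heavier weights on the high-scoring outlets, the weighted figure sits well above the plain average; as the weighting system tightened over the years, this gap shrank, which is why older weighted scores aren't comparable to newer ones.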

The point I try to make in my article is that the industry needs an index. We all do. Granted, for the seasoned gamer, that aggregation often happens in their head rather than via MC or GameRankings: they know which outlets to trust and which carry more weight. Game companies need trusted sources too, but within the industry there are different applications for those numbers.

For example, I've read opinions that using MC is a joke because the rating doesn't apply until after the game is released. Not true. As a game developer, you start running a series of mock reviews once the game hits alpha or so. These mock reviewers need to communicate what's good and bad about the game. While the document may be tens of pages long, in the end it's their Metacritic assumption that carries a good deal of influence. If the publisher thought they had a game worthy of an 80 and the mock reviewer says no, it's really a 70, that affects the projected revenue for that title. Publishers are rarely surprised by an MC score when the game is released.

What the industry often doesn't recognize are the nuances in the data. That's really what I try to do: give them context and advocate a more appropriate solution.

Later Gators,

Tim



I mostly agree. A good sample of reliable critiques is the best judge of quality, but that's relative. Take casual games, for example, or the "Potter"-type releases that game trade outlets often pan for their simplicity. More consumer-oriented outlets like the NY Times, Family Circle, or some syndicated columns (e.g. the Associated Press) may recognise quality differently.

For them, it might be less about character archetypes or storyline, and more about ease of use. Remember the old game "Black" by Criterion? No storyline really, no multiplayer, so the game trades overall waffled on its quality. On the consumer front, they dug the graphics and just liked shooting stuff. Two very different takes on quality.

The best you can really do is take the largest sample possible and let all this work itself out. In a perfect world, you'd have a gamer index and a consumer index, and you'd apply whichever fits the audience you're targeting.

And yes, I agree with your observation that business people often want numbers for everything. I have to pause at some of the requests I get for that very reason.

Cheers,

Tim

OR, they could accept that it is impossible to judge games quantifiably, and use the qualitative judgments of good critiques instead.

But business people are even worse than scientists when it comes to demanding a number for everything :tdown:

