
The 7-10 Review Scale

Penny Arcade hints at it, but the "7-10" scale is a problem with IGN and a lot of other reviews online. It's an even larger problem when you realize that virtually every game is going to get a 9.5 or less, because the "9.5 to 10" range is reserved for those truly remarkable, once-a-year-or-less games, and held back on the promise that future games will be better ((Same reason it sucks to be the first skater in figure skating: you can perform a flawless, wonderful routine, but the judges feel compelled to "leave room" in case someone does even better.)). There's no problem reserving the top 5% of a true 0-10 scale - there's a big problem when you're reserving 17% of your working space.
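
That arithmetic, as a trivial Python check (the 7-10 working space is the framing above, not anything IGN publishes):

```python
# A reserved 9.5-10 band is 5% of a true 0-10 scale,
# but roughly 17% of a 7-10 working space.
reserved = 10 - 9.5
print(f"{reserved / (10 - 0):.0%}")  # 5%
print(f"{reserved / (10 - 7):.0%}")  # 17%
```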

So really, you've got a 7.0 to 9.5 scale. That puts the "average" game at about 8.2, and any game under 7.0 effectively carries a negative score. Compressing the scale like that obscures real differences by cramming them into a few tenths of a point, and it pushes sales because "hey, the game got a 7.4 on a 10-point scale!" It doesn't matter that the 7.4 is roughly equivalent to a 2; it just sounds better.
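
Here's that equivalence as a minimal Python sketch, assuming a simple linear rescale of the 7.0-9.5 working range onto a true 0-10 scale (the endpoints are the ones argued above, not anything official):

```python
# Linearly rescale a score from the compressed 7.0-9.5 working range
# onto a true 0-10 scale. The endpoints are the ones argued above,
# not anything IGN publishes.
def rescale(score, low=7.0, high=9.5):
    return (score - low) / (high - low) * 10

print(rescale(7.4))   # 1.6 -- the "roughly a 2" above
print(rescale(8.25))  # 5.0 -- the "average" ~8.2 game lands mid-scale
```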

I've reviewed products for MacAddict using their five-point scale, which is serviceable and easy to understand. The middle rating was always "average," and you had two spots above it and two below it. The bottom and top ones were, IIRC, "Blech" and "Freakin' Awesome." This meant virtually everything got a 3 or a 4, which was fine ((I always wished for Macworld's mice, because they offered a true 10-point scale, what with their half mice and all. It could really separate the men from the boys.)).

I was Editor in Chief of SegaWeb for a year, and we too used a 10-point scale, but I was adamant that we apply a bell curve to the thing: 5.0 was the middle ground and the standard deviation was about 2.0. An above-average game was going to get a 6.x or a 7.x, a very good game an 8.x, and the once-a-year-or-less games a 9.x. Truly sucky games would get a 1.x or a 2.x.
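
To put numbers on that curve, here's a small standard-library Python sketch of what scores mean under a normal distribution with mean 5.0 and standard deviation 2.0 (the percentile readings are illustrative, not something we actually computed at SegaWeb):

```python
from statistics import NormalDist

# The SegaWeb-style curve described above: mean 5.0, standard
# deviation 2.0. cdf(x) gives the fraction of games a score of x
# would beat under that curve.
curve = NormalDist(mu=5.0, sigma=2.0)

for score in (3.0, 5.0, 7.0, 9.0):
    print(f"{score}: better than {curve.cdf(score):.0%} of games")
# 3.0: better than 16% of games
# 5.0: better than 50% of games
# 7.0: better than 84% of games
# 9.0: better than 98% of games
```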

A bell curve could never really apply at MacAddict like it could at SegaWeb, though, because at SegaWeb we reviewed everything that came out. At MacAddict, the scores naturally trend higher because there's no point in publishing a bunch of reviews of crappy software. The only time MacAddict serves its readers by reviewing poorly rated software is when that software is highly publicized - then they're doing their readers a favor by letting them know to avoid the highly publicized garbage.

At The Sand Trap we don't use a scale at all. Golf equipment is a more nuanced area with a lot more variety in taste and functionality, so we hope to make the reader actually read the review. If we slap a "5.7" on something, people might just look at the score and skip the review, and that's not in their interest. Additionally, given how many outlets only rate things in the upper reaches of their scale - like IGN's effective 7.0 to 9.5 - readers are conditioned these days to see a score of "6.4" and think "blech!" when the reviewer may really be trying to convey that it's an above-average item.

I'm sure the upward-tilting review scale has a lot to do with advertising. It always comes back to the almighty dollar. A score of 7.4 might actually be a horrible score when you know the scale is really 7.0 to 9.5, but the guys making the product can always play up the fact that it was given a 7.4 out of 10, and customers will perceive that as "hey, that's a pretty good score."

It's bad for the consumer because it simply tightens things up too much to be useful. $60 is a lot to spend on a game, and it's tough to decide where to draw the line. 8.6? 8.7? 9.0? Those numbers are all far too close to each other to really account for individual differences in how reviewers rate games. Instead of being spread out from 6.0 to 10, better-than-average games have only roughly one point to play around in: the 8.3 to 9.3 range.

Pffffft. This stream-of-consciousness rant brought to you by boredom and a severe disgust for the lack of a true 10-point (effectively 100-point) scale. ((And don't tell me it's fine because 90-100 = A, 80-89 = B, etc., because in that system C should sit at the middle of the bell curve, with a whole helluva lot of room left at either end. Heck, I'd be in favor of making tests and grading center on 50 = C with a standard deviation of 20 or something.))

3 Responses to "The 7-10 Review Scale"

  1. I buy a lot of games - way too many, honestly. Several years ago I stopped reading game reviews from the major sites to see if a game was "good." I'd read their reviews only to find out about certain features in the game or whatnot. I don't need someone's opinion on a game; I've played so many games over my lifetime that I know what I'll like and what I won't. I specifically stay away from any licensed TV or movie games unless it's for the kids and I know it's from a good developer, like the LEGO Star Wars series.

    Penny Arcade has actually become my source for unique game purchase decisions. I picked up Eye of Judgment because of their candid discussions about the game. Plus I like CCGs and board games so it's a perfect fit.

    What happened recently at GameSpot over Kane & Lynch just goes to show how badly reviews of any product can be skewed if there's enough money involved.

  2. [quote comment="44732"]I'd read their reviews only to find out about certain features in the game or what not.[/quote]

    That's what I still do, mixed in with a heavy dose of seeing what "real people" (friends, blog pals, etc.) have to say about games. When demos are available, I try those too (though the 3-minute timer in Bank Shot Billiards is just about worthless).

    [quote comment="44732"]What happened recently at GameSpot over Kane & Lynch just goes to show how messed up reviews of any product can be skewed if there is enough money involved.[/quote]

    That "episode" is what prompted this post. I was reading about that and followed some links around until I ended up at the Penny Arcade link above.

  3. TeamXbox.com has a table with all their game reviews (a tad more than 600 for the 360, graded from 0 to 10 in 0.1 increments), so I made a histogram of it and fitted a Gaussian. It's centred on 8.1 with a half width at 1/e of 1.6. It helps a bit 🙂
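
A rough sketch of that kind of fit, for anyone who wants to try it - NumPy/SciPy here, with simulated placeholder scores standing in for the actual TeamXbox table:

```python
import numpy as np
from scipy.optimize import curve_fit

# Placeholder data: ~600 simulated scores. Swap in the real
# TeamXbox numbers to reproduce the commenter's fit.
scores = np.random.default_rng(0).normal(8.1, 1.6 / np.sqrt(2), 600)

# Histogram at the site's 0.1-point granularity.
counts, edges = np.histogram(scores, bins=np.arange(0, 10.1, 0.1))
centers = (edges[:-1] + edges[1:]) / 2

# Gaussian parameterized by its half width at 1/e, as in the
# comment: f(x) = a * exp(-((x - mu) / w)**2), so sigma = w / sqrt(2).
def gaussian(x, a, mu, w):
    return a * np.exp(-((x - mu) / w) ** 2)

(a, mu, w), _ = curve_fit(gaussian, centers, counts, p0=(30, 8, 1.5))
print(f"centred on {mu:.1f}, half width at 1/e of {w:.1f}")
```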