About a year ago at work, I happened to overhear a pair of gamer co-workers excitedly discussing then-upcoming releases. They were young guys about my age, and, from their dialogue, I was able to determine that they were fairly informed hobbyists who spent a lot of time playing video games but maybe didn't play a large number or variety of titles. One of them was telling the other of his intention to pick up EA's Army of Two during lunch break. He had been anticipating the game's release for some time, and it had just arrived in stores that very day. As it turned out, I was not the only one listening in; at that point, a female co-worker, mid-thirties, apparently felt compelled to toss her hat into the ring.
"Army of Two?" she inquired. "My boyfriend said he's not getting that one because it only got a 7. He only gets games that are rated 9 or above."
From previous conversations, I had gathered that the woman's boyfriend was a younger man, about ten years her junior, and a fairly avid gamer. She herself was less so, but she was trying, and, on more than one occasion, she had mentioned enjoying co-op Lego Star Wars on the Wii. During the tense moment of silence that followed her interjection, I pondered the circumstances under which she would have been discussing with her boyfriend what random games he wasn't going to buy and the reasons why not. I concluded that theirs must have been indeed a strong and healthy relationship.
Her comment having drained the conversation of its energy, the man being addressed (accused?) seemed a little taken aback before, at last, he calmly responded, "I've been waiting for this one for a while."
"Oh, no, I didn't mean--" she stammered out apologetically. "I'm sorry. I hope you have fun with it."
But the discussion was not over.
"A lot of games don't get 9s," the man pointed out, his normally collected tone just barely lapsing toward audible defensiveness.
"Yeah, please, let me know if it's good," she said. "My boyfriend's getting Super Smash Bros. That one got a 9. Are you getting it?"
Then the discussion was over.
From my perspective, it was a comically awkward moment that I didn't think too deeply about until, many months later, I myself became engaged in a similarly stupid discussion about video game review scores.
It was around the time that Tecmo released Rygar: The Battle of Argus for the Wii. Reading the reviews, I was disappointed to find that it was merely a port of the six-year-old PS2 title Rygar: The Legendary Adventure, with the only obvious changes being, in my opinion, changes for the worse.
Discussing the sad news with someone who had never played the PS2 Rygar, I pointed out that the original was "actually a really good game." For some reason, I then felt compelled to clarify by calling it a "solid 6/10," which, as it turned out, was a lower score than Nintendo Power's for the Wii release.
Perhaps my statements appeared to contradict one another, and one possible conclusion was that I simply enjoyed crap games. I preferred to think that I had a finer appreciation for titles that had good qualities but that, for various reasons, fell outside the "9 or above" must-play range. It was also likely that I used a different scale than my accuser did. I hadn't yet drawn my baseline, so it was presumptuous to assume that a 6/10 from me meant the same as a 6 from Nintendo Power or anyone else. Maybe another player could have liked the game to the same degree but given it an 8 instead.
The issue is larger, however, than my two silly anecdotes. On the day of Killzone 2's release on the PS3, there's little escaping the tumult over the game's scores, which, thus far, have been very high. The notable exception is Edge's 7/10, which has incurred the wrath of legions of PS3 fanboys who seemingly need the game to be canonized as the best shooter ever. It may be the most contentious case yet, but this is not the first time the press and the mob have butted heads over game review scores. Concerns over the meaning and value of the standard graduated scoring system have been brewing for a while now, and one would certainly be justified in thinking that a reckoning is in order.
As a mere consumer, albeit a hardcore one, who owns all of the current-gen platforms, I thought I'd give my perspective on the matter. I won't be discussing Killzone 2, because I frankly don't care about the game, nor do I intend to address the Edge review specifically, since I, like most people, haven't actually read it. Rather, I'm just going to record a few thoughts on scores in general.
I personally don't think attaching a number score to a review is a problem in itself. The enthusiast press may resent having to do so, since it potentially encourages busy readers to skip the actual content of the reviews, but, then again, I generally don't think video game reviews should be mistaken for award-winning journalism. People like my female co-worker and her boyfriend, informed but not quite hardcore consumers, just want to know whether they should spend their hard-earned cash on a new game. If a game gets a 9 or above, then that's a strong recommendation. If it gets a 7 or an 8, then more research may be in order, but, for the aforementioned couple, the sensible course is just to go for the 9s first, then worry about the 8s if there is still money left to burn. Of course, other factors, such as genre and age-appropriateness, also play a part, but, generally, for people who don't spend a lot of money or time on games, scores help direct them to the must-have titles which will probably provide as much gaming as they need.
That's what the numbers mean to "informed but not quite hardcore consumers." What do they mean to a "hardcore consumer" like me? Well, scores can affect my buying habits as well, but only under certain conditions. When a game scores significantly higher than I expect, which doesn't happen often, then I become intrigued. Otherwise, I usually know whether or not I want to play a game long before the scores come out. And, yes, sometimes I do knowingly go for the mediocre ones.
Yet it is precisely the extreme hardcore gamers, the ones least likely to actually base their purchases on reviews, who are typically most invested in the scores. Usually, these fans just want some sort of "official" grade to validate their own opinions about games they may or may not have even played but have, at any rate, committed to paying good money for.
While reviews are inherently subjective, enthusiast channels such as IGN and GameSpot rarely make that clear, often presenting their reviews instead as objective and authoritative. Most readers don't even seem to recognize that these websites employ a number of different staff and freelance reviewers, and the sites themselves must take some of the blame for that, as, beyond the tiny bylines, they make little effort to connect the reviews to any individual personalities. Print publications like Nintendo Power are actually a little better in this regard, but that medium grows increasingly irrelevant.
Another issue, upon which reviewers and fans seem to agree, is that, while the number-based system is helpful for consumers, it can devalue gaming as an art form. Considering how expensive a hobby gaming is, it's fair enough for many consumers to limit themselves to the "9 or above" titles. But adherence to such a harsh standard can rob a player of many of gaming's finer experiences in those 6-to-8 titles that are less polished but sometimes more inspired. Some may feel that it is the reviewers' responsibility to cultivate a higher class of gamer by promoting these flawed but noteworthy titles, but simple numbers cannot convey their subtler qualities to those many consumers that don't have time to read the full reviews.
I've at times thought that it might make sense to reform the system into something more akin to the current scoring system for figure skating, which assigns a specific base value to a routine before judging its execution. So, then, Rygar's 6 might be out of a total of 8 possible points, while, hypothetically, a similar but more ambitious title like Devil May Cry might be an 8/10 from the same reviewer. After all, does it really make sense to judge an annual sports title like MLB: The Show according to the same scale as Metal Gear Solid 4, which aims so much higher? Of course, my own bias is showing, and, ultimately, there's no objective way to determine a game's base value.
No, changing the scoring system would only confuse people, while doing away with scores altogether, as some have suggested, would defeat the purpose of the reviews for most consumers. It would just realign the audience to an ever more hardcore slant.
The commonly used ten-point scale is fine by me, though I think reviewers should make more active efforts to clarify what the numbers mean, maybe by attaching some sort of stock explanation and disclaimer to every score. While most fans ought to understand that the final score is subjective, it's less often recognized that the scale itself is subjective too, as my Rygar anecdote made evident to me.
By my scale, for example, a 7 would not be an average game. An average game would be a 5, right in the middle of the scale. The thing is, however, that a savvy gamer, though better-informed, is actually less likely to be exposed to bad games than a casual gamer, because the really bad games are things most discriminating players have never even heard of, let alone played. In my personal collection of hundreds of titles, I can't think of more than two or three that I would score below 5, but that's because, even though I may buy a lot, I try not to buy garbage.
Among games that actually get press, 7 might seem like an average score, but it's important not to let that skewed perspective make us less appreciative of what are actually very fine titles whose fair scores reflect as much. There are other games that I would rate higher than Rygar by as many as four grades, but I still think it's a good game.
For reference, the worst game in my collection, Stunt Race FX, would probably be about a 2/10 on my scale, and here's how it might break down:
Can the game be played? No.
Does it have graphics? Not really.
Anything else? Some good tunes.
Thus, I grant Stunt Race FX a 2/10.
Why isn't it a zero if it's really the worst game I've ever played? Well, for me, a zero would be something that crashes on the title screen, while a 1 would be a game that makes it past the title screen but then crashes with the next button press. I don't have specific examples of such games, but I leave those slots open just in case.
Also, each number in the scale itself should be understood to represent a range, with a 10 being, not a perfect game, but a game of the highest caliber.
Of course, I acknowledge that Rygar and Stunt Race FX are really old games and, therefore, terrible examples. As I've said, reviews should be meant to help guide consumers, and that usually only matters while the games are new. Once history grants a title "classic" status, it doesn't really matter where specifically it ranks.
Finally, back to the matter of scores directing sales, things become more problematic when you factor in the aggregate review sites like Metacritic and GameRankings, which have become enormously influential within the industry. In collecting all the press reviews and distilling them down to a single combined score, these sites do great harm to the craft of criticism. The aggregate scores are generally accurate, which makes them a useful and convenient resource for consumers, but, in making things so easy, they are also the biggest culprits when it comes to number scores cheapening the artistic depth of gaming.
Some publications have experimented with non-number-based scales, perhaps to undermine Metacritic's score conversion system, but that only makes the site less reliable, not less influential. Rather than trying vainly to combat the system, reviewers should just acknowledge that their job is essentially to tell people which games to buy. So, if a title is worth getting despite any flaws, it should get a high score that elevates its profile against competing titles.