Metacritic, part II

Why does Metacritic get so much attention in the games industry, and only in the games industry? Sometimes I forget that there are also four sections on the site covering movies/DVD, TV, music and books; the last section proved so unpopular that they removed it during their site revamp. Is the movie industry too rich to care about such piddling internet services? Is the music industry too caught up in piracy to care about the quality of its products? Or does the games industry care too much, in an attempt to ensure that every game is a quality product?

The answer is somewhat different. Metacritic does not merely capture consensus; it also rates it to a precise degree. Such precision is only relevant to those who expect it, and such expectations mean that a consensus has formed before Metacritic even starts recording. When it tries to rate a non-consensus on a movie or an album, it either produces a meaningless number or has to start making figures up so that the Metascore can function at all.

  • In order to see the difference, let’s compare Metacritic’s movie section with Rotten Tomatoes, a competing movie review aggregator. A quick look shows that Rotten Tomatoes aggregates far more reviewers, has more user reviews, and uses a completely different method of score aggregation. Rotten Tomatoes’s “Tomatometer” does not measure the level of consensus; it shows whether there is a consensus in the first place. A look at the “consensus” section on a movie’s page suggests that isn’t exactly how the site wants to present itself, but it’s the defining characteristic that makes the Tomatometer succeed where the Metascore does not.

    Reviewers at established outlets do not rely on scores as much as game reviewers do. Take the New York Times reviewers: they have two levels of scoring, “critic’s pick” and everything else. This isn’t a big problem for Rotten Tomatoes, since human intuition can tell whether a review is positive or negative. But what about Metacritic, which has to put a number on one of the most established review sources in the industry? The site literally makes the scores up. So much for priding itself on knowing the difference between a 70 and a 71!

    But this is a minor problem compared to the essential dilemma of consensus. Rotten Tomatoes does not assume that a consensus will emerge. Instead, it aggregates reviews, sees whether a consensus exists, and then expresses it in broad categories: if 60% of reviewers think a movie is good, it’s “fresh”, and it’s “rotten” otherwise. The question of whether a consensus emerges at all is crucial, because movies, like all good art, are always under artistic debate. Take a movie like Vampires Suck, a genre parody that was panned by critics, except for one who said the directors “tapped the vein more effectively than their norm.” Of course there are those who allege he was bribed, but going against the stream like that takes a lot of courage.

    Would Metacritic allow a reviewer who praises Stalin vs. Martians as a parodic masterpiece, or would it exclude him for his “unreliability” and distance from the mainstream? The Tomatometer allows for some anonymity and some opinion, but Metacritic makes sure that the consensus does happen and throws out all dissent. Such is the tyranny of the Metascore, where a 75 doesn’t really differ from a 76 but we’re told that it does.
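The contrast between the two aggregation schemes is simple enough to sketch in code. This is a hypothetical illustration, not either site’s actual implementation: the scores and the per-review positivity cutoff are invented, and the only detail borrowed from the real Tomatometer is the 60% “fresh” threshold.

```python
def metascore(scores):
    """Metacritic-style: average everything into one precise number."""
    return sum(scores) / len(scores)

def tomatometer(scores, positive_cutoff=60):
    """Rotten Tomatoes-style: reduce each review to positive/negative,
    then report whether a consensus exists at all (>= 60% positive)."""
    positive = sum(1 for s in scores if s >= positive_cutoff)
    return "fresh" if positive / len(scores) >= 0.6 else "rotten"

# An invented set of reviews for a divisive game or film
divisive = [100, 90, 85, 30, 25, 20]

print(metascore(divisive))    # 58.33... -- reads as bland mediocrity
print(tomatometer(divisive))  # "rotten" -- only half positive, no consensus
```

The same data yields a single middling number under one scheme, and an explicit “no consensus here” signal under the other.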

  • At its core, the Metascore is tyrannical, a tool that forces a convergence of opinion. The makers of the site know this and are extending the Metascore’s influence even further. The new site format removes the list of reviews from the main page and features the Metascore more prominently. Someone visiting the site will see the score and walk away, further diminishing the influence of the individual reviewer. Step by step, the Metascore is asserting its legitimacy over the very reviewers who make it up, even though the score is already heavily weighted toward the opinions of a few. Not wanting to anger potential readers, the start-up reviewer looks at the Metascore and copies the consensus, trapped in a catch-22: defy the consensus and he angers his readers; follow it and he has no opinion that stands out. A vicious cycle sets in where reviewers lose identity and relevance by conforming to the System.

    And even when a game is artistic enough to stir real division among reviewers, the Metascore abstracts that division into a single average. Look once again at a game like Space Giraffe, but don’t stop at its Metascore of 68. Look closer and you’ll see that the reviews range from 100 all the way down to 20. How refreshing is that in a world where reviewers all sound the same? Some audiences will love it and some will hate it, just like any good piece of art.

    But if you don’t look closely enough, you might assume that every reviewer gave it a score in the sixties. Such is the false reality of the Metascore.
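That last point is just the statistics of averaging: an average throws away the spread. A quick sketch with made-up numbers (chosen to land near Space Giraffe’s reported 68, not real review data) shows how a unanimous “it’s okay” and a love-it-or-hate-it split can produce the identical Metascore.

```python
from statistics import mean, stdev

uniform  = [70, 68, 66, 68, 70, 66]    # everyone agrees: a middling game
divisive = [100, 95, 90, 45, 38, 40]   # love it or hate it

print(mean(uniform), mean(divisive))    # both exactly 68 -- identical Metascores
print(stdev(uniform), stdev(divisive))  # ~1.8 vs ~29.8 -- very different stories
```

Only the second number, which Metacritic never shows, distinguishes consensus from division.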
