Wednesday, July 30, 2014

Metacritic's imperfect math


It's finally happened.

A movie on Metacritic has reached triple digits.

I'm sure Boyhood is not the only movie ever to hit a perfect score of 100 on Metacritic, and it's not likely to hold that score forever. But at least for long enough for me to take this screenshot, it had reached those ludicrous heights of critical adoration, the likes of which I have never seen while surfing the site.

Except, it's totally bogus.

I talked to a friend, who has already seen the movie (jealous) and loves it, about the improbable 100 scored by Richard Linklater's universally acclaimed masterpiece. And he was the first one who got me doubting its authenticity.

In an email to me earlier this week, he wrote: "How could it be a 99 a week ago? Doesn't that mean that there was 1 mixed review at least? Did they discard the mixed review? Did the reviewer clarify the review, reclassifying it as a positive?"

Good questions.

So I looked at the numbers offered up on Boyhood's main landing page, and indeed, they were all 100s except for one glaring exception: a 75 (from Slant Magazine's Ed Gonzalez) gumming up the whole works.

But with only a single dissenter out of 40, I wasn't sure that 75 would be enough to drag the film's rounded average below 100. So I did some quick math. Assuming all the other reviews were 100s and Ed Gonzalez's review was a 75, that would be 3975 points divided by 40 reviews, which equals a 99.375 Metascore. Rounding down, as you are compelled to do in this situation, you get a 99. Ed Gonzalez all by himself should be able to keep this movie from having a perfect score.
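
If you want to double-check that back-of-the-envelope figure, here it is as a few lines of Python, assuming a plain unweighted average (which, as Metacritic's own FAQ will shortly make clear, is not how they actually do it):

```python
# Hypothetical scenario: 39 perfect scores plus Ed Gonzalez's 75,
# averaged with no weighting at all.
scores = [100] * 39 + [75]
average = sum(scores) / len(scores)
print(average)       # 99.375
print(int(average))  # 99, once you round down
```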

But tonight I delved a little deeper. On Metacritic, you can go past the landing page and see all the reviews and all the scores currently being tabulated for this movie. And it was here that I found that the movie did not have 39 reviews of 100, but rather "only" 32. That is still jaw-droppingly astonishing, but it also meant that even a Metascore of 99 was likely too high for Boyhood.

It turns out Boyhood also received scores of 95, 90, 90, 90, 88, 83 and 80, in addition to good old Ed Gonzalez's decidedly contrarian 75. Some more quick math: 3891 divided by 40 = 97.275. So not only does Boyhood not deserve a 100 or a 99, it doesn't deserve a 98 either. It deserves a 97.
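
Here's that second calculation spelled out the same way, again assuming every review counts equally, which Metacritic's real (weighted, secret) formula apparently does not:

```python
# The scores actually listed on Boyhood's review page at the time:
# 32 hundreds plus the eight lower marks noted above.
scores = [100] * 32 + [95, 90, 90, 90, 88, 83, 80, 75]
total = sum(scores)            # 3891
average = total / len(scores)  # 97.275
print(len(scores), total, average)  # 40 3891 97.275
```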

Why the fuzzy math, Metacritic?

Scanning Metacritic's FAQ, I think I may have my answer in this dismissively pithy response:

Can you tell me how each of the different critics are weighted in your formula?

Absolutely not.

Simply put, not all critics are equal on Metacritic. And some -- many, in fact -- don't get any love at all. Take Armond White. He's a pretty prominent critic, albeit one whose contrarian views almost destroy his ability to be taken seriously. True to form, he has a negative take on Boyhood, an excerpt of which was recently posted in a group I frequent on Facebook:

"“Hipster Patriarchy” might be a better title for Richard Linklater’s Boyhood. Depicting a white American male from childhood to adolescence, it celebrates the emblematic figure of American social power."

Needless to say, White was not one of the 40 critics who passed muster for Metacritic.

I visit Metacritic almost daily and prefer it to its most similar rival, Rotten Tomatoes. But some of the opaqueness I'm seeing in the FAQ, particularly the flippant way it is expressed, makes me see Metacritic as something a bit more elitist and capricious than open and honest. Why can't we know your formula? What's so great about your formula, anyway?

There's some kind of snobbery or something going on here, a form of preferential treatment for some critics over others that requires the whole thing to be shrouded in mystery. Metacritic's role in taste-making, therefore, seems sneaky and underhanded, something for them to be ashamed of. The implicit message: only if you are a certain kind of critic espousing a certain kind of view can your opinion carry its full value in our system. Seems more than a little bit fishy.

Consider the tone of this part of the FAQ:

Why don't you have 97 reviews for every movie like those other websites do?


Several other websites that provide links to movie reviews have weighed the quantity vs. quality issue and come out in favor of quantity. These sites typically include links to as many reviews as there are available on the net. And lately, with every Joe Schmo posting a movie review both before and after movie releases, there are quite a few reviews for each movie (we're talking 100's of reviews for the more popular titles). True, some of these Joe Schmos--or at least the Harry Knowles--do have quality sites with useful reviews and information. But the quality of many is inconsistent at best. In addition, there is such a thing as too much information, and statistically, once we include a certain number of reviews in our calculations, adding additional reviews will not change the overall METASCORE much in one direction or another.

It's funny that I can essentially agree with what they're saying here and still be appalled that they're saying it. The Joe Schmos they refer to -- and if they are so elitist, you'd think they'd opt for the spelling Joe Schmoe rather than Joe Schmo, in part because it pluralizes better -- are actually one of the groups most responsible for the fact that I can no longer earn a living as a critic. But I don't begrudge them their right to do what they're doing -- not out loud, anyway. It's the out-loud snobbery of statements like the one above that I find kind of shocking.

The above bit seems to be a direct dig at Rotten Tomatoes, but I'm starting to wonder if Rotten Tomatoes isn't more likable than its snooty rival. Rotten Tomatoes does indeed have 156 critic reviews that "count" for Boyhood, only two of which are rotten. In a system based on straightforward math, using a simple thumbs-up/thumbs-down metric, those two negative reviews still leave Boyhood shy of that perfect 100: it carries a 99 freshness rating. (Interestingly, neither of the negative reviews belongs to Armond White. The two lonely contrarians in this case are a woman named Rebecca Cusey and a guy named Matt Pais.)
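
For comparison, the Rotten Tomatoes number requires no secret sauce at all; it's just the share of positive reviews, expressed as a percentage (I'm assuming they round to the nearest whole percent, which matches the 99 shown here):

```python
# Tomatometer-style math: fresh reviews divided by total reviews.
fresh, total = 154, 156
tomatometer = round(100 * fresh / total)  # 98.7..., which rounds to 99
print(tomatometer)  # 99
```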

One assumes that Metacritic's 40 hallowed critics must be chuffed (to use an Australian term) that Metacritic considers them useful members of the critical community. But at the same time, eight of them must feel like their opinions don't mean a squirt of piss. Because there's certainly one thing that a perfect score draws attention to: the insignificance of the opinions of the naysayers. If a movie has a 93 or a 96 or something close to perfection, people aren't running any numbers to determine what Metacritic is actually saying. But if Metacritic says something has a perfect score and there are eight people -- that we know of -- that don't think it's perfect, one has to wonder why they even bother to post the reviews of those people. I mean, this is all about displaying only the content that conforms to Metacritic's preconceived notion of itself in the first place, right?

I'm starting to wonder if I prefer Metacritic to Rotten Tomatoes for the dumbest of reasons, the same reason I like going to Mobil more than other gas stations: simply put, I like Metacritic's fonts. I find its design aesthetically pleasing, and I do not find the Rotten Tomatoes design pleasing. In fact, I'd go so far as to say that Rotten Tomatoes looks cheap. Then again, maybe that makes sense -- a crisp, clean appearance for a snobby website, and something less fussy for an egalitarian democracy like RT.

As I write this post and find more and more things about Metacritic I don't like, I'm starting to view it as the kind of secretive organization that Kirby Dick railed against in This Film Is Not Yet Rated when he excoriated the MPAA. The only difference is that Metacritic is flippant about its secretive decisions, almost rubbing our noses in them, while the MPAA is just plain secretive. Metacritic, while being superficially more open by admitting its own secrecy, actually comes off worse.

Of course, none of this has any impact on my anticipation for seeing Boyhood, which, regardless of the source, figures to land somewhere between a simply great film and The Best Film I Have Ever Seen. All I know for sure is that it's probably not perfect, and I don't know why Metacritic is so hell-bent on telling us it is -- especially when they refuse to even tell us what their logic is in reaching that conclusion.

You'd think after writing all this I'd be ready to delete Metacritic from my favorites, but you'd be wrong. Fact is, there's still something about its snooty judgments that I crave. It speaks to my own inner snob, I guess.

Or, maybe I just like the fonts.

2 comments:

Don Handsome said...

Well if it comes down to the black box of Metacritic or the tacky graphics and fonts of Rotten Tomatoes, I will take Metacritic every time. Metacritic has a more authoritative name as well, whereas Rotten Tomatoes doesn't even sound legitimate. This is why I visit Metacritic often and don't visit Rotten Tomatoes at all.

But both do the same thing for me...they both serve as review aggregation devices, putting a plethora of reviews in one place for me to read. I don't care what the number is or how "fresh" a flick is, I just want to soak up certain movies and experience them as a cultural piece.

But not everyone uses the sites this way, and those scores DO mean something. Both get cited on numerous podcasts that I listen to as indicators of critical response, and therefore these scores are entering the lexicon as final indications of something real about a film. The math SHOULD be really simple and straightforward, and therefore these reports should indicate a real and comparable stat that works as a talking point. But when you hide your formula and you hide whatever math you're using to make the numbers work, how do we know you aren't also hiding the real opinions?

Thanks for this explanation and for your investigation. This is hard hitting investigative film journalism at its best.

(PS. I've given Boyhood my worst rating yet...seven thumbs up)

Derek Armstrong said...

I don't know about "hard hitting." Part of the reason I write a blog, instead of being a journalist anymore, is that it relieves me from having to check all my facts, and allows me to make unsupported suppositions. But I still do like investigating things a little bit, as long as I am completely free from having to account for my research practices.

Yeah, I'll stick with Metacritic. I actually feel kind of dirty being on Rotten Tomatoes -- it's just so unpolished. However, I do think there are certain things it does right.

You raise an excellent point about how these scores have far-reaching effects on which films get seen, and ultimately, made. If anything cries out more for an open process, I don't know what it is.

When oh when oh WHEN does Boyhood open here?