Thursday, January 21, 2016
The abuse of the near-perfect score
Time for another rumination on the ways I'm struggling with correctly applying star ratings to movies.
In 2015, I successfully clamped down on throwing a perfect score to very many films. Out of 304 new-to-me movies I saw in 2015, I gave only four of them a five-star rating on Letterboxd. That's only about 1.3% of the movies I saw last year.
Four-point-five stars? That's another story. I did that 34 times. That's more than 11% of all the movies I saw.
And I did see some good movies in 2015, no question about that. But were they just good, or were they really, truly great? And what is the cutoff between good and great in a star rating?
It's that same old struggle, the one that always makes me question again the validity of star ratings -- or certainly their objectivity. Their transferability.
And even though I've been trying to be less wanton with my 4.5-star ratings, a problem I've been conscious of especially during the second half of the year, I'm not off to a good start in 2016. I've seen 33 movies so far in 2016 (I know that's a lot, but January is always a busy month) and a full seven of them have received 4.5 stars. Or 21%.
It's hard to figure this out when I have people in my life who tend toward either extreme.
For one there's my editor at ReelGood, who is notably stingy with his five-star ratings (which must be doubled for ReelGood, where the ratings scale is 1 to 10). He didn't give the maximum 10/10 to any movie he reviewed in 2015, though he did give the rating twice in 2014. Not only that, but he'd tease me about each new 9 I submitted. It just so happens that I reviewed many of my favorite films of last year, resulting in seven scores of 9 out of 37 reviews I wrote, or nearly 19%. But then when you include the two perfect scores I gave, that's nine reviews of 9 or higher, or nearly a quarter of all my reviews. When you include the five reviews where I gave an 8, that's 14 of 37 with an 8 or higher, or nearly 38%. And with two of those, I wanted to give a 9 but shied away from it to maintain the sense that I wasn't just a movie-loving loon.
So I've been overpraising movies, right? Erring too much on the side of film optimism, right?
Hold on there. I just read the top 15 of 2015 of a guy in my Flickcharter discussion group, and he gave five stars on Letterboxd to ALL FIFTEEN films on the list. His rationale was also convincing: "I don't use half stars in rating; I give five stars to anything I'd be happy to say could be/is in the top 10% of movies I've seen, and each honestly makes that cut."
Wow, just imagine how tied up in knots I'd be if I cut out the half stars. Making me choose between three stars and four? I just couldn't do it.
So that leaves me in the same spot I always find myself whenever I force myself into one of these exercises of self-examination: no closer to a perfect answer. I clearly want to recognize films for going above and beyond without being perfect. Some people might give those movies four stars. I guess I've been giving them 4.5. But it does create less overall margin for error. It leaves fewer spaces in which to express the subtle gradations in greatness between two separate movies.
I guess I'll just continue going with my gut, and trying not to think about it too much.
2 comments:
That's precisely why (as we discussed over on Letterboxd) I've ended up going with Flickchart-recommended star ratings over the past year or so. Ranking a movie lets me more adequately put it into its "rightful" place and determine right away if it IS in the top 10% of movies I've seen. Very occasionally I will suspect it's been ranked too high or too low thanks to a misplaced movie and adjust the star rating, but more and more I've said, "Ya know what? Flickchart says it's a 2-star movie, it's a 2-star movie." Decision. Made.
Letting Flickchart choose my star ratings has actually helped me solidify a little bit more what those stars mean to me. I'm learning, for example, that it's really common for me to give out 2s to technically good movies I was bored by. It's been an interesting process, and once that gels a bit in my mind I might return to assigning stars myself, but for now I'm pretty happy to let the chart decide.
It's a great idea -- except for the fact that I invert that direction of influence. I use star ratings on Letterboxd for my gut instinct impression, and Flickchart for my long-term, considered impression (as exemplified by my rule of not adding a film to Flickchart until it's been at least 30 days since I've seen it). I guess that has to do with the star ratings of some movies not impacting the star ratings of others, whereas placement on Flickchart clearly does influence which movies go where.
However, my star ratings on Letterboxd *do* influence how I hand out ratings to other films -- I have noted times in the past when I didn't give a certain high rating to a certain movie simply because I'd given out too many high star ratings during that period of time. That's believing in the fallacy that ratings should be sprinkled more or less evenly across the spectrum of available ratings during any given period, when in fact there are times when we are simply exposing ourselves to higher-quality stuff.
Interesting discussion as always!