Rotten Tomatoes is eating your brain

Man, do I love film criticism. Not because critics actually do much by way of recommendation; most of the time I’ve made up my mind on whether or not I want to see a movie by the end of the first trailer. I love film criticism because I love film. I love to watch, discuss, dissect and also criticize, sometimes harshly. Criticism helps me with all that.

Film criticism helps us explore the themes and messages propagated by what is quite possibly the most popular and financially successful art form in history. It helps us understand what we are watching and what lessons we are absorbing. “Transformers: Age of Extinction” has made $751,450,231 worldwide since its release last month. I don’t know how many ticket sales that actually translates to, but it seems like a lot. Something consumed by that many people deserves to be taken seriously, even if it’s stupid.

I do not, however, love all film criticism equally. Rob Gonsalves, for example, preferred “Gattaca” to “The Truman Show.” In fact, he thought Truman was “amorphous” in a bad way. That’s fine, because he is certainly entitled to his opinion. His review is reasonable enough; he wanted something edgier and more concrete. The main reason Gonsalves’ review doesn’t bother me is that, when it all comes down to it, I really don’t care what Rob Gonsalves — who blogs at EFilmCritic.com — thinks about this, or any other movie. And why would I? Why would he care what I have to say about “The Truman Show”? I doubt he does.

But there is someone who cares deeply about Mr. Gonsalves’ opinion of “The Truman Show,” despite my indifference. It’s a little website called Rotten Tomatoes.

You see, Rotten Tomatoes is billed as a “review aggregator,” meaning its purpose is to compile reviews for movies (and now, increasingly, television shows) to make it easier for moviegoers to get a sense of what’s being well received. That way, one no longer has to sift through all those pesky reviews to decide if a movie sounds worth it. Now all that can be done with a simple glance at the Tomatometer. The trouble is that, in general, RT doesn’t do a very good job of this. And worse, it’s actually doing film criticism a disservice.

Each movie is deemed either “Rotten” or “Fresh” based on the proportion of positive reviews it receives: to earn a “Fresh” rating, at least 60 percent of the reviewers must give the film their stamp of approval. It really doesn’t seem all that bad of an approach. But it is. Here’s why (according to me).
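Before getting to those reasons, here is a minimal sketch of that binary logic in Python, since the whole argument hinges on the cutoff. This is my own illustration, not anything Rotten Tomatoes publishes, and the review counts fed into it are hypothetical:

```python
# A toy version of the Tomatometer's binary logic (an illustration, not RT's actual code).
# Every review, however nuanced, is flattened into a single yes/no verdict first.

def tomatometer(positive_reviews, total_reviews):
    """Return the percent of positive reviews and the Fresh/Rotten label (60 percent cutoff)."""
    score = round(100 * positive_reviews / total_reviews)
    label = "Fresh" if score >= 60 else "Rotten"
    return score, label

print(tomatometer(6, 10))    # (60, 'Fresh')  -- just scrapes over the line
print(tomatometer(59, 100))  # (59, 'Rotten') -- one positive review short
```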

1. Who they choose to aggregate seems silly and almost arbitrary.  

Now, this is a tricky criticism, because Rotten Tomatoes does indeed have guidelines as to who is included within their aggregation. In fact, here they are:

[Screenshot: Rotten Tomatoes’ eligibility criteria for online publications]

There isn’t much to complain about here (I’ve only included their “online publications” section; go here for the full deal on other publications), but I must say that, in my experience, these guidelines must not be strict enough. Take my friend from before, Rob Gonsalves at EFilmCritic: I see many names, websites and even publications that strike me as odd to include next to a review by Bilge Ebiri, who just might be the best film critic around since Roger Ebert died (if you haven’t read his essay on Adam Sandler from last May, please do). Now, that may sound elitist, because it kind of is. It takes me a while to warm up to a critic. I have to learn to trust them. That’s what disturbs me so much about RT’s aggregation approach: it implies that I, personally, should be giving equal weight to critics I don’t know or care anything about. When the Tomatometer says a film has an RT score of 96 percent, or 43 percent, you must wonder, “according to whom?”

That’s why I’ve always thought RT should develop some sort of customization system. They already have the option to isolate the score to include only “Top Critics,” but that doesn’t help much, since I also don’t care what Peter Travers has to say about anything, ever. They do have a “My Critics” option, but that works by taking some silly little quiz where you rate certain movies and then they tell you which critics you are likely to agree with. That doesn’t work either, because I don’t typically agree with Wesley Morris, but I’m still always interested in his opinion, and I value it. If I could simply select which critics I wanted to include in my RT score, this whole complaint wouldn’t be necessary. So get on it, Rotten Tomatoes.

2. If you must aggregate, do so in a manner that makes sense. 

I will admit that I see the value in aggregating reviews. Sometimes you might be indifferent to a movie, but once you notice that critics like it more than you anticipated, you change your mind and fork over $9.50 to see it on the big screen. I’ve done it. You’ve done it. We’ve all done it.

But the problem with Rotten Tomatoes is that their aggregation system is so silly that it can send false messages to the questioning moviegoer. Not only do they include strange reviewers in their count, but they also give each reviewer only two options to declare: either they liked it or they didn’t. As anyone who frequents the Life & Style or Entertainment section of their local newspaper knows, reviews typically don’t work that way. Many films are simply OK. Others are great, but the ending doesn’t quite cut it, or the film just wasn’t as good as the reviewer had hoped, and so the review takes a negative tone while still being peppered with phrases like “moments of brilliance.”

For that reason, a superior aggregation site called Metacritic doesn’t limit reviews to just Hot or Cold. They’ve created space for Lukewarm as well. And it makes a huge difference.

Take, for example, my favorite unnecessary punching bag, “The Avengers.” According to Rotten Tomatoes, “The Avengers” has an approval rating (if we want to call it that) of 91 percent. It’s pretty amazing that 91 percent of those who reviewed a franchise superhero film thought it was a great movie. The problem is that’s not entirely true. Metacritic assigns “The Avengers” a score of 69 (they don’t label it with a percent sign). So why the difference? Well, first of all, Metacritic only included 43 reviews, while Rotten Tomatoes counted 301 reviews as worthy of your attention. That might make the RT score sound more impressive, but what it really means is that literally hundreds of fanboys (no hate for fanboys here, just making a point) got their personal review of “MARVEL’S THE AVENGERS IN 3D” included in the aggregate. When we knock it down to just “Top Critics,” the score drops to a still respectable 84 percent, with a review count (49) much closer to Metacritic’s.

So then what explains the remaining 15-point gap? Well, Metacritic astutely points out that 10 of the 43 reviews for “The Avengers” were lukewarm at best. It also accepts the fact that even the positive reviews differ in enthusiasm. They’ve adjusted their aggregate to account for such reviews, and that drags down the score. I think the best example of this disparity is how each aggregation site ranked Roger Ebert’s review of the film. While Rotten Tomatoes simply lumped him in with those who liked the film, Metacritic assigned his review a score of 75, meaning he basically gave the film a C+. A passing grade, sure, but nothing terribly special to behold. Though Ebert lauds the film for being entertaining and technically impressive, he ends his review by saying, “‘The Avengers’ is done well by Joss Whedon, with style and energy. It provides its fans with exactly what they desire. Whether it is exactly what they deserve is arguable.” He also tempers his positivity by saying things like “These films are all more or less similar, and ‘The Avengers’ gives us much, much more of the same.” Suffice it to say, claiming that The Great RE simply gave it a “Fresh” rating is shortsighted.
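To see how far apart the two systems can land on the same set of reviews, here is a toy comparison in Python. The per-review scores below are invented for illustration (they are not the actual review data for “The Avengers”), and the 60-point “liked it” line is my simplification of how RT classifies reviews:

```python
# Hypothetical critic scores on a 0-100 scale -- invented for illustration.
reviews = [95, 90, 88, 85, 80, 75, 75, 70, 65, 40]

# Rotten Tomatoes-style: count any review at or above an assumed "liked it" line as positive.
positive = sum(1 for score in reviews if score >= 60)
rt_style = round(100 * positive / len(reviews))

# Metacritic-style: a plain average of the scores (Metacritic actually weights critics,
# but the effect is similar) -- lukewarm reviews drag the number down.
mc_style = round(sum(reviews) / len(reviews))

print(rt_style)  # 90 -- nine out of ten critics "liked it"
print(mc_style)  # 76 -- but the average level of enthusiasm is much lower
```

Nine of the ten hypothetical critics technically “liked it,” so the binary count looks glowing, while the graded average lands in three-stars-out-of-four territory.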

Of course, this also works in reverse. Certain movies that Rotten Tomatoes deems “Rotten” may have actually been more warmly received. Which brings me to my next strange point.

3. People actually seem to care about Rotten Tomatoes scores, and that isn’t good.

It would be one thing if it were all fun and games at the Tomatometer, but I think there is (minor) proof that this thing is actually affecting how we perceive the quality of the movies we watch. For some strange reason, I took the time to chart the Rotten Tomatoes scores of all Academy Award nominees for Best Picture since the year 2000, and I think there is a (minor) case to be made that the Academy actually seems to care about Rotten Tomatoes scores. So hear me out.

 

[Chart: Rotten Tomatoes scores of Academy Award Best Picture nominees, 2000–2013]

 

You’ll notice a few interesting things from the chart above, one of which being that it doesn’t actually look like the popularization of Rotten Tomatoes has affected much, since there are plenty of RT scores below 70 percent after 2006 (Rotten Tomatoes was actually founded in 1998, but it really started gaining steam around 2004 and basically became a force to be reckoned with by 2006). Still, there are a couple of things I want to point out.

First, prior to 2009, only 5 movies were nominated for best picture. If you look at all the films prior to 2009, the nominees ranged anywhere on the Tomatometer from the 60s and 70s up into the low 90s. The films were few but the range was wide. Take a look at 2009, though, the first year The Academy nominated 10 (10!) films for best picture. Of those 10 films (it looks like there are only 9 because when two films share the same RT score there is only one dot to indicate both), only one dipped below 90 percent — and that honor goes to “The Blind Side,” which pulled in a slick 66 percent. One may assume that had The Academy nominated only 5 films in 2009, all 5 would have been above the 90 percent mark.

The same goes for the rest of the years’ nominations up to 2013. There are films that break below the 90 percent mark, but it is usually 1 or 2, maybe 3 films at most. If the Academy were still nominating only 5 films like they did in years prior, it’s likely that few if any below 90 percent would even make the cut. In fact, in 2010, every film nominated — and again, twice as many films were nominated that year as in 2000 — was ranked at 90 percent or higher on Rotten Tomatoes (at least as of this writing; RT allows reviewers to keep submitting reviews, so the ratings can fluctuate, which is also bad for history).

The trend is even starker when you only consider winners. “A Beautiful Mind,” for example, won the award for best picture in 2001 with an RT score of 76 percent. That also happens to be the score of “Gladiator,” which won the year prior. From 2006 on, no film scoring below 92 percent has won the Academy Award for best picture.

So there you have it. There may be a box-office correlation as well; I’m not sure. And I think it’s also important to point out that this isn’t necessarily a bad thing. There have been many films in the past (*cough* “Crash”) that moviegoers seem to think the Academy royally flubbed on. Maybe Rotten Tomatoes has made it easier for members to vote for films they know they won’t regret. On the other hand, it gives film awards and criticism a strange air of objectivity. When “The Artist” won in 2011, it did so with a 98 percent “Fresh” rating (or at least it has one now), which is a striking number. No other nominee that year has a 98 percent rating (the closest are “Hugo” and “Moneyball” at 94 percent each), and so it can be easy to see “The Artist” as some sort of clear winner that was objectively better than the others. That’s absurd.

4. Critics are wrong, a lot.

My final beef with the reliance on Rotten Tomatoes is that, with its strange air of objectivity, it convinces people that one or two critics may be wrong, but there is truth in consensus. When it comes to movies, that is false. Many movies were not critically well received when they came out, only to be recognized as overlooked masterpieces later on. Rotten Tomatoes makes it easier for critics and viewers to double down on their opinions. Reassessment is likely going to get harder.

One good example of this is “Fight Club.” “Fight Club” was known at the time of its release as a critical mixed bag and a box-office misfire. Critics and audiences just didn’t seem to know what to do with it (there are even stories of Paul Thomas Anderson walking out of a screening because he wasn’t taken by the levity with which the film addressed cancer). Now, in 2014, the film sits at an impressive 80 percent on the Tomatometer. However, when you view only “Top Critics,” “Fight Club” has a score of only 64 percent. The inclusion of movie bloggers who have flooded Rotten Tomatoes with positive retrospective reviews has allowed the film to be reevaluated. Its cult status has been legitimized through the Tomatometer. But can that continue to happen? If a film scores a paltry 64 percent in this day and age, is a cult following even likely to form? I’ve read more than one essay claiming that films like “John Carter” and “The Lone Ranger” got a bad rap and, though imperfect, deserve a reassessment. But can a movie with a 51 percent on Rotten Tomatoes really expect one? Can a film like “Labor Day,” which I thought was near perfect until its disastrously corny conclusion, ever get a second look? Not likely, because as soon as I recommend it to someone, the first thing they will do is look it up on Rotten Tomatoes, and there’s something scary about a 33 percent.

There are plenty of other movies that I think are victims of The Meter. One of my favorite comedies, “Oscar,” starring Sylvester Stallone and directed by John Landis (yes, that John Landis), has a 13 percent on RT. I laugh so hard every time I watch that movie. Every time. So hard. But convincing someone else to watch it is now that much harder.

Thanks, Rotten Tomatoes.

So I hope we can all stop caring so much about that ominous score. Film reviews are a wonderful thing, and they are meant to be read and enjoyed. Films are complicated pieces of art and the individual voice can mean so much more than the collective, if it’s a voice you trust.



2 thoughts on “Rotten Tomatoes is eating your brain”

  1. This is a really good analysis. I shamefully admit that I sometimes rely on an RT score when it comes to deciding whether or not to see a movie. At least I actually click and read a lot of the reviews, though. It’s sad that some people have become too lazy to read full reviews and instead rely heavily on a subjective aggregated score. Movies deserve more than that.

    1. Yeah, I think it’s the temptation that makes RT so devilish to me. It’s actually quite handy to have all those reviews compiled into one place, but the allure of ignoring them in favor of some overall score makes it feel nasty.
