Hello and welcome to the movie blog of author John DeFrank - FilmZ and Guy Sobriquet Malone - Researcher

Truly Rotten: The Case for Ignoring Rotten Tomatoes


Aggregate Site is an Aggravating Sight, Part 1 by Guy S. Malone, Researcher

Remember Roger Ebert and Gene Siskel?  Man, those were the days.  They are remembered for the thumbs up/thumbs down verdicts on their movie reviews, but in reality they offered so much more: they explained their conclusions, adding context that gave us insight not only into the movies under discussion but into film itself--the art and the craft.  Their disagreements, though sometimes bitter, led us to understand that, in the end, the appreciation of a film is a subjective, individual choice; it was up to us, the viewers, to make that judgment, and whatever that judgment turned out to be, it was all right.  Above all, they understood that the prime purpose of any film critic is to serve the moviegoing public.  Yes, those were the days.

Since then we have moved on from clicking the TV remote to clicking on Internet sites for quick stats and sound bites, and too many of us allow sites like Rotten Tomatoes to make the decision for us.  You know Rotten Tomatoes--RT to many--with its famous "Tomatometer," the percentage score given to a movie based on an aggregate of contributing critics.  RT began as a cute, fun novelty site with a catchy gimmick: a movie is rated "Fresh" (symbolized by a ripe red tomato) or "Rotten" (an ugly green splat).  In recent years, though, it has grown into a monster whose Tomatometer can make or break movies, earning or costing studios and production companies tens of millions of dollars.

It's no secret that Americans love movies, and RT was in the right place at the right time to capture the zeitgeist.  It captured the reviews of top critics, but that wasn't enough; RT wanted masses of critics.  The thought was: if we can get a consensus from hundreds of reviews, we can make the decision for moviegoers, telling them which films to spend their money on and which to skip.  But does RT truly inform viewers?

Who makes up the collective voice of the Tomatometer?  Some are established, recognized film critics.  But the masses that contribute to RT, especially for big-budget blockbusters, are enterprising Internet bloggers.  Some are excellent writers (some have a mere modicum of talent), some have formal education in film (many don't), and some love film (some are prejudiced for or against certain movies, actors, directors, etc.).  Some even have that purity of purpose, but a disheartening number do not; either that or they don't understand the cardinal rule, worth repeating here: the prime purpose of a film critic is to serve the moviegoing public.  So it is that a fearsome number of new-wave "critics" has sprung out of the ground like Ray Harryhausen's skeleton army in Jason and the Argonauts to lay low any movie, actor, or director that doesn't fit their worldview or, conversely, to extol the virtues of those that do.  Those goals intertwine with the personal need of some (many?) critics to pander to kindred spirits and to raise the almighty click count (visitors to their blogs), prioritizing egocentric, self-aggrandizing, and prejudicial needs over the integrity of their mission.

These negative traits may not be readily evident to the Rotten Tomatoes staff as it vets potential critics.  Still, if the Internet has shown us anything, it is that charlatans abound.  It also stands to reason that there are only so many qualified film experts available to judge a film with integrity.  And finally, given the endangered state of journalism these days, even a film critic of the highest integrity has to be cognizant of popularity and revenue as embodied in the precious click count, and so he or she had better think twice before lambasting the latest Star Wars film (see below)--a franchise boasting arguably the largest, most rabid fanbase in all of filmdom.  In its effort to amass a critical mass of critics (sorry, I had to do it), RT has thus ironically had to lower the bar to reach its desired numbers, while at the same time the integrity of established critics is increasingly challenged by the need to maintain their following.

More damning is Rotten Tomatoes' deeply flawed statistical model, which leads to misleading conclusions.  RT has become the go-to film site based on two beliefs: that sheer numbers of critic input provide reliability and validity, and that a simple percentage provides ready understanding.  Conclusions drawn from a Rotten Tomatoes aggregate of opinions, however, are questionable at best.  Opinions, by definition, lack objectivity.  Add to that the RT number crunching, which seems to go out of its way to undermine validity and reliability.  First, RT converts each critical review to a percentage, then applies an arbitrary cutoff: everything 60 or above is "Fresh," 59 and below is "Rotten."  Second, it takes percentages of percentages (a statistical no-no), combining ratings into a misleading aggregate that gives the illusion of consensus and skews toward the groupthink described above.  Third, unless readers take the time to, you know, read individual reviews, they glean no substance or context from a given score.
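To make the math concrete, here is a minimal sketch of that two-step calculation.  The 60-point cutoff comes from RT's stated convention; everything else (the function names, the structure) is our own illustration, not RT's actual code:

```python
# A minimal sketch of the Tomatometer math as described above.
# The 60-point Fresh cutoff matches RT's convention; the rest
# is illustrative, not RT's implementation.

def tomatometer(review_scores):
    """Binarize each 0-100 review score at 60, then report the percent Fresh."""
    fresh = sum(1 for s in review_scores if s >= 60)
    return 100 * fresh / len(review_scores)

def average_score(review_scores):
    """The plain average of the same reviews, kept for contrast."""
    return sum(review_scores) / len(review_scores)
```

Notice what the binarizing step throws away: once a review crosses 60, its actual score no longer matters to the Tomatometer at all.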

A scientific way to examine RT scores would expose the fallacious metric: plot the distribution of the scores of every film RT has assessed.  If it were a valid and reliable system, the plotted points should reveal a bell curve, with the vast majority of movies falling in the 40-to-60 range and very few landing in the 0-to-20 and 80-to-100 ranges.  A cynic might argue that the RT spread would instead look like a cup curve--an upside-down bell, with films piling up at both extremes.
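For the curious, that check is easy to sketch.  The scores below are invented for illustration; a real test would require collecting the Tomatometer score of every film:

```python
# Bucket a list of Tomatometer scores into deciles and eyeball the shape:
# a hump in the middle suggests a bell; piles at both ends, the cynic's "cup."
from collections import Counter

scores = [92, 95, 88, 97, 91, 45, 13, 8, 4, 11, 61, 23]  # invented sample
deciles = Counter(min(s // 10, 9) for s in scores)        # a 100 joins the top bucket
for d in range(10):
    print(f"{d*10:3d}-{d*10+9:3d} | " + "#" * deciles[d])
```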

Better still, let's bring it down to layman's terms.  Say a given movie has a total of ten (10) reviews: eight (8) "Fresh" and two (2) "Rotten."  By RT calculations, that adds up to a Tomatometer score of 80% (so each Fresh review might as well be a 100 and each Rotten one a 0).  Suppose, though, that we look at the individual reviewers' ratings for the same movie, and each of the 8 Fresh reviews is a 60 (8 x 60 = 480) while each of the 2 Rotten reviews is a zero (2 x 0 = 0).  Add up the individual ratings of all ten critics and divide by the number of critics: (480 + 0) / 10 = 48.  So, for this hypothetical film, the Tomatometer reads 80% Fresh when, in fact, the average of the reviewers' scores is 48%.
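The same arithmetic in a few lines of Python, using the hypothetical scores from the example above:

```python
scores = [60] * 8 + [0] * 2                      # eight tepid Fresh reviews, two pans
tomatometer = 100 * sum(s >= 60 for s in scores) / len(scores)
average = sum(scores) / len(scores)
print(tomatometer, average)                      # 80.0 vs. 48.0
```

A lukewarm 60 counts exactly as much toward the Tomatometer as a rave 100; only the direction of the thumb survives.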

Remember, these statistical flaws are exacerbated by the pandering of many RT reviewers to the popular opinions of fanboys, hipsters, haters, and trolls who anonymously express their prejudices in comments sections, film forums, and social media.  Add to that the misogyny evident in the heavily male-dominated critical field.  What we end up with is groupthink among wide swaths of critic-bloggers whose relationships with the special-interest groups that feed their sites are symbiotic--one feeding the other and giving both strength.  This groupthink subjects them to the basest urges--the drive to make a name for themselves, the fear of being an outlier, the need to fit in with the "cool kids"--and it leads to overpraising some actors, films, and directors and dealing vicious hit jobs on others.  Anyone who observes RT and popular social media over time can see that some movies, stars, and directors get a pass while others (particularly women) are excoriated.

Finally, as implied above, there is the matter of the ethical interpretation of individual critical reviews into "Fresh" or "Rotten" categories.  Two years ago, John (the guy whose name is on this website) read a review by Eric Kohn of IndieWire in which Kohn gave a film a "B-" grade.  Shortly thereafter, by sheer chance, John was checking Rotten Tomatoes for the same film, and Kohn's review was interpreted as "Rotten."  John wrote to RT questioning the interpretation and received this email from Flixster Support on December 08, 2015:

Hi John,
Most critics from the Online Film Critics Society (OFCS) enter their own quotes and ratings.  We contacted Eric Kohn regarding the review and were told that the Rotten rating was intentional.  Looking through his review history, a B- can be either Rotten or Fresh.
Regards,
Jose

If you are shaking your head in incredulity, flipping your lips, and saying WTH?--so was John.  What kind of standard has no standards?  As it turns out, we don't believe Eric Kohn (a respected critic) was ever contacted, because several days later his review of the film was changed from Rotten to Fresh.  Making matters worse, the film in question was one that many hipsters and haters had united against in a trolling campaign during the early period of its wide release.  Worse still, this is not an isolated incident; we have spoken to a number of others who have witnessed similar (let's call them) "inconsistent" interpretations of critical reviews.

We end part one of this essay by looking at one of the biggest blockbusters of recent years, Star Wars: The Force Awakens, a smashing success by any measure except, arguably, artistic.  Rotten Tomatoes has recognized 367 critical reviews that make up its aggregate score of 92%.  That bears repeating: three hundred sixty-seven reviews!  Are we to believe there are that many qualified reviewers out there?  Of that total, only 28 gave SW:TFA a negative review--and if that many critics agree, and this is their collective opinion, it must be one of the greatest films in cinematic history.  And, indeed, Rotten Tomatoes' list of the 100 Top Movies of All Time has The Force Awakens, with an adjusted score (don't ask), at number 68, ahead of Gone With the Wind (73), The Good, the Bad and the Ugly (74), On the Waterfront (90), The Godfather, Part II (92), Jaws (96), and Lawrence of Arabia (100).  And The Force Awakens is a piker compared to Mad Max: Fury Road, whose lofty perch at number 5 all time places it above all of the aforementioned films, plus The Godfather (9), E.T.: The Extra-Terrestrial (11), Singin' in the Rain (14), Casablanca (15), ... well, you get the idea.

The common denominator: Mad Max: FR had 362 reviews; SW:TFA had 367; the ten classics we mentioned totaled 754 reviews (an average of roughly 75 per film).  One would assume the reviews for the classics came from established, respected print journalists, and one could extrapolate that the hundreds of extra reviews enjoyed by The Force Awakens and Mad Max: Fury Road came from the teeming masses of RT's bargain-basement reviewers--and that these are the culpable parties who bloated the ratings those two films enjoyed.

After all, there are positive reviews and there are positive reviews, and there are cumulative scores, and ... well, you get the idea.  First, discount the significant percentage of Rotten Tomatoes' contributing critics who have no more insight, experience, or education than the guy sitting next to you at the megaplex.  Then look at comments from some of the positive reviews by recognized top critics that made up that 92%.  Michael Phillips of the Chicago Tribune wrote: "[I]t is good.  Not great.  But far better than 'not bad.'"  Amy Nicholson of the Village Voice wrote: "It's not a Bible; it's a bantamweight blast.  And that's as it should be: a good movie, nothing more."

We realize there is a huge swath of humanity that holds the entire Star Wars oeuvre in such high esteem that it is unassailable.  We personally believe that Star Wars: The Force Awakens is a rehash of Episode IV: A New Hope, and a bad one at that.  So we probably don't see eye to eye on SW:TFA, and we'll leave it at that.

Until later,

Guy S. Malone, Researcher

Stay tuned for Part Two of this essay: 
Where is Pauline Kael now that we need her, and if they can resurrect Star Wars why can't we resurrect her?

   
