Summary

  • Rotten Tomatoes has always had flaws, and recent news only highlights its broken review system. The platform's data is unreliable and the formula is fundamentally flawed.
  • Rotten Tomatoes' scoring is skewed by genre bias, selection bias, time bias, and recency bias. The reviews lack parity and context, making them untrustworthy.
  • Rotten Tomatoes' audience scoring system was completely broken in 2010, with a flood of reviews drastically changing scores. The system is prone to manipulation and review bombing, making it unreliable as a metric.

Recent news about Rotten Tomatoes has cast doubt on the validity of the review platform, although a closer look at Rotten Tomatoes' history and data makes it clear the Tomatometer was always broken. While Rotten Tomatoes' review system has become nearly synonymous with film criticism, the system has always been seriously flawed, and recent revelations make it even clearer why it shouldn't have been relied on so heavily as a metric in film analysis or discourse in the first place.

Rotten Tomatoes scores are used in movie marketing and even prominently displayed alongside Google searches for movie titles, listed in the info boxes for digital purchases, and printed on packaging for home media. Despite the ubiquity of Rotten Tomatoes as a quality metric, there have always been numerous issues with the platform's data: how it's gathered, how it's applied, and how it's interpreted. However, even if we accept the validity of the data itself, the entire Rotten Tomatoes formula is still broken anyway.

Related: Citizen Kane Shows Just How Broken Rotten Tomatoes Is (Can It Be Fixed?)

5 Rotten Tomatoes Data is Heavily Biased (But Not in the Way You Think)


One of the biggest issues with Rotten Tomatoes is bias, but not bias on the part of the reviewers. Genre bias, selection bias, and time bias all play a major role in how movies are scored and destroy any sense of parity in the scoring data over time. Different genres, such as horror, show notably different review score behavior than something like a kids' movie. Recency bias is also a major factor: newer movies tend to have higher scores, with many dropping at least a few points after their debut on Rotten Tomatoes, and newer movies in general tend to be reviewed more favorably than older ones.

Additionally, when it comes to historical review data, Rotten Tomatoes aggregates old reviews for movies released before the internet and before Rotten Tomatoes itself existed. The kinds of obscure reviews and smaller publications that populate modern scores have no presence in those older scores, and the older reviews reflect something entirely different given the time they were written and critics' expectations of movies in that era. Old movies also regularly get new reviews added thanks to a remaster or re-release, which further erodes any sense of parity or context between the reviews of movies released in different decades and different cinematic climates.

4 Rotten Tomatoes Broke Its Entire Audience Scoring System in 2010


When it comes to historical data, Rotten Tomatoes' audience scores are even worse, as the data suggests the audience scoring system broke completely in 2010. Using the Wayback Machine, Screen Rant identified numerous popular movies from the 2000s that saw an influx of tens of millions of reviews (possibly due to the acquisition of a Rotten Tomatoes competitor), totally changing the scores of some movies, such as Sam Raimi's Spider-Man and Star Wars: Revenge of the Sith, and confounding the already suspect audience data to the point of uselessness.

This discrepancy is only identifiable by checking cached versions of individual movies' Rotten Tomatoes pages, because the total number of audience reviews is no longer visible in newer versions of the Rotten Tomatoes interface. Additional updates, such as the "Verified Audience" score, aim to fix the audience metric; however, since there are often significant differences between the Verified Audience score and the traditional audience score, and the Verified score only applies to movies released in the past few years, it totally destroys parity across time, making the audience score virtually meaningless as a comparison point between different movies.

3 Rotten Tomatoes Review Scores Are Prone to Review Bombing and Studio Gaming


Of course, even if the audience data were "clean," the very nature of the Rotten Tomatoes system is prone to gaming, for both the audience score and the critic score. In the case of the audience score, the potential for manipulation is clear in review bombing, where a particular audience segment floods a movie with negative reviews, and brigading, where a segment floods a movie with positive reviews. Even if the reviews in those situations are "legitimate" (and not bot-generated), the strong positive or negative sentiment doesn't represent an actual assessment of the movie, just a particular audience agenda.

Traditionally, the gatekeeping of the critic score, where Rotten Tomatoes only includes specially approved critics, has been pointed to as evidence of its superiority, but the critic Tomatometer is just as vulnerable to gaming in its own ways, as seen in the frequent big drops between pre-release and post-release review scores. While this gaming isn't necessarily due to outright false reviews, it's fairly easy for a studio to know which segments of reviewers will see a movie more favorably and to steer early screenings toward that more favorable audience, making Rotten Tomatoes strategy a component of a movie's marketing plan.

While this process can certainly influence results, it also regularly backfires, such as when Wonder Woman 1984 notoriously opened at 88 percent and even earned the special "Certified Fresh" seal before dropping 30 points to 58 percent after more reviews came in. More recently, Disney decided to unveil Indiana Jones and the Dial of Destiny to an audience at Cannes, which turned out to be a major tactical failure after the initial reviews gave it a Rotten 52 percent score. A broader audience proved far more favorable than the Cannes critics, and several weeks later, after more reviews arrived, it rose all the way to a Fresh 69 percent.

2 Some Rotten Tomatoes Critics Were Paid For Positive Reviews


According to a recent report, a PR firm was working to do more than simply game the Rotten Tomatoes system and actually went so far as to pay some reviewers directly for positive reviews. According to the report, a company named Bunker 15 sought out smaller, more obscure critics and paid $50 or more per review, in violation of Rotten Tomatoes' policy. According to Variety, some of Bunker 15's communications with writers were fairly explicit about their intent, including emails such as “I would like to know if you don’t post negative reviews on Rotten Tomatoes.”

While the report indicates this strategy was applied fairly narrowly, to a handful of movies that Rotten Tomatoes says have since been de-listed from the service entirely, the mere existence of paid reviews fundamentally erodes the credibility of the entire site. Bunker 15's actions reportedly extended only to smaller outlets and mostly involved smaller VOD movies, but the news that any scores on the site were subject to paid manipulation calls the integrity of every other score into question. Considering the subjective nature of movie reviews and the frequent division over Rotten Tomatoes scores, many people are already primed to embrace any evidence undermining Rotten Tomatoes' credibility.

1 Rotten Tomatoes' Scoring System is Fundamentally Broken


Setting aside the numerous issues with the integrity of the data populating Rotten Tomatoes' scoring system, even if the review data were entirely legitimate, the formula at play would still create an incredibly skewed score favoring mediocre movies. While each individual review has its own associated score, Rotten Tomatoes flattens every review to a simple thumbs up ("Fresh") or thumbs down ("Rotten"), with the ultimate Tomatometer percentage simply representing the percentage of reviews that are Fresh. While that provides a general overview of critic sentiment, it's a slanted formula whose math penalizes a lack of consensus.

Related: Joker Proves Rotten Tomatoes is Biased Toward Mediocre Movies

The way the math works, if ten critics gave a movie a 6/10, it would receive the exact same 100 percent on the Tomatometer as if all ten critics gave it a 10/10; however, if nine critics gave the movie a 10/10 and one gave it a 5/10, it would get a lower 90 percent score. As a result, a movie like Joker, which scored a 7.3/10 average, has only a 69 percent Tomatometer score, while Shazam!, also at 7.3/10, earned a 90 percent Tomatometer score. While it doesn't make much sense to compare these two particular movies, the gap between their actual review scores and their Tomatometer scores points to a warped formula that penalized Joker for being more divisive.
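The flattening described above can be sketched in a few lines of code. This is a simplified illustration, not Rotten Tomatoes' actual implementation; the 6/10 "Fresh" cutoff is an assumption drawn from the example above, and real reviews arrive on many different scales.

```python
def tomatometer(scores, fresh_threshold=6.0):
    """Percentage of reviews at or above the 'Fresh' cutoff.

    A simplified sketch of the Tomatometer's flattening step;
    the threshold is an illustrative assumption.
    """
    fresh = sum(1 for s in scores if s >= fresh_threshold)
    return round(100 * fresh / len(scores))

def average(scores):
    """Plain average score, rounded to one decimal place."""
    return round(sum(scores) / len(scores), 1)

unanimous_six = [6] * 10        # every critic gives 6/10
unanimous_ten = [10] * 10       # every critic gives 10/10
divisive = [10] * 9 + [5]       # nine 10s, one 5

print(tomatometer(unanimous_six), average(unanimous_six))  # 100 6.0
print(tomatometer(unanimous_ten), average(unanimous_ten))  # 100 10.0
print(tomatometer(divisive), average(divisive))            # 90 9.5
```

Note how the unanimous 6/10 movie and the divisive near-masterpiece end up with Tomatometer scores of 100 and 90 respectively, even though their averages are 6.0 and 9.5: the flattening step discards everything about a review except which side of the cutoff it falls on.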

If audiences simply wanted to know whether critical reviews were generally favorable before going to the theater, Rotten Tomatoes would be a fine metric. Unfortunately, its ubiquitous use as a scoring system and in marketing drastically misrepresents the meaning of the Tomatometer and shifts collective film discourse toward "good vs. bad" debates instead of more thoughtful examinations of the art or entertainment value of a given film. Granted, some films are intended as simple popcorn thrills, but Rotten Tomatoes doesn't make that distinction, putting all movies on a simple, flattened good vs. bad scale that presents subjective opinion as objective fact.

It's truly unfortunate that such a prominent metric has been manipulated and abused in the ways Rotten Tomatoes has, although it's maybe more unfortunate that it became such an important measurement of quality in the first place. Film is a subjective medium, and reducing the visuals, story, performances, and technical achievements of any given movie to a single flat number should have never been embraced as an acceptable method of evaluation. If there's a bright side to the whole situation, it's that the loss of Rotten Tomatoes' credibility could encourage people to participate in a deeper exploration of the nuances of film instead of simply asking if it's good or bad.