While I realize this might be a bit far afield from our normal topics, bear with me for a moment. If you’re a consumer of podcasts on the web or if you purchase products and services from online vendors, you’re probably familiar with the nearly ubiquitous five-star rating system employed by so many sites. Web users can also assign similar ratings in a variety of other venues, such as submitting reviews on Google, Yelp, or other review services. While this started out as a very positive development, and one that I’ve happily participated in for years, the system has now devolved into something of a freak show, offering little value to the consumer and providing endless frustration to the providers who live and die by these ratings.

Allow me to offer a recent example before explaining myself. Some weeks back I was listening to one of my favorite podcasts while preparing my lunch. (I won’t call them out by name here.) At the beginning of the show, the host took the time to thank listeners who had recently rated and reviewed his podcast. Then his tone took a bit of a turn when he complained about some listeners who had paid him compliments but assigned the show a three- or four-star review. He said (paraphrasing from memory), “Look. You need to understand that anything less than a five-star review is a bad review.”

I happened to be listening to the show on speaker rather than my earbuds and my wife was in the room when that remark was made. She immediately responded by saying, “That’s total nonsense.” (Okay, she actually said something a bit less refined, having to do with the excrement of male bovines, but we try to keep this place at a PG-13 level wherever possible.) And she’s totally correct.

What was beautiful about the five-star rating system in the beginning, at least for users like me, was that it allowed people to write thoughtful reviews and make distinctions along a scale of quality. Specifically, when it comes to podcasts, I used to reserve my five-star reviews for the absolute best shows I found. They were the ones in my regular, weekly rotation that I considered “can’t miss” in terms of their quality and how closely they matched my personal interests. (I’ve never bothered reviewing content covering areas I don’t care about and wouldn’t investigate.) But there are other podcasts that I find enjoyable on most occasions, with occasional issues that might put me off, though not enough to make me drop them entirely. I would assign those shows three or four stars accordingly.

I rarely leave bad reviews. If a show is so bad that it’s a total turnoff, I just don’t listen. But if the sound quality is so horrible that it’s unlistenable, or the content is so riddled with easily debunked errors as to be little more than nonsense or propaganda, sure… I might toss it a one-star review. The whole point here is, why bother having a five-star rating system if assigning anything but a five or a one makes you a “bad person”? Why not just have a “thumbs-up, thumbs-down” rating system? It’s a disservice to the really good podcasts that the listener honestly feels are head and shoulders above the rest, and an equal disservice to other listeners who might base their choices on a preponderance of such reviews.

Don’t get me wrong. I understand where the content providers are coming from as well. Most consumers looking for new podcasts tend to find them in one of two ways: either a show is recommended by someone they already listen to, so they search for it by name, or they browse the content by category. And browsers are notoriously averse to doing a lot of scrolling. If the shows are ranked by ratings and your show’s average isn’t at or near the top, many prospective listeners will likely never see it, so your potentially great show withers in anonymity. I get it. But that still doesn’t do anything about the relative quality question.

Also, the graded rating system is always open to abuse. Yelp is notorious for having restaurant personnel and their friends launch “attacks” of one-star reviews on their competitors, no matter how good the quality might be. The same can be done to sellers on eBay or any other online vending system.

Back in the day, Rotten Tomatoes came up with something promising. It was a way for the public to provide their own ratings on entertainment as a herd, rather than relying on the opinions of elite film reviewers whose evaluations could be based on criteria that didn’t apply to you. But in the end, even their pool wound up being poisoned by “popularity campaigns” launched by people with broad social media influence and reach. Perhaps, in the end, we should all consider just adopting YouTube’s “thumbs-up, thumbs-down” rating model. It’s far less nuanced, but if there are enough reviews, browsers can at least get a sense of how many people liked something versus finding it useless. It’s not a beautiful solution, but it’s probably still more useful than the current crap festival that the five-star system has become.