The Problems with Gear Reviews (And How We Try to Avoid Them)
Here at SonicScoop, we run gear reviews every single Tuesday. That’s about 52 of them a year, practically without fail.
More recently, due to great popular demand, we’ve also started considerably ramping up the number of gear roundups and shootouts we run.
(Incidentally, if you’re a working professional in the field and want to contribute yourself, reach out anytime!)
As long as we’ve been around, we’ve strived to do reviews with as much integrity as possible, and to remain one of those rare and trustworthy publications where the reviews are always genuine, impartial, thorough, and most importantly, useful to you as a reader.
We’ve never issued any kind of official statement that I know of on just how we handle reviews, and how we avoid the possible conflicts of interest that can arise in a field like ours. Today, I want to take a moment to do just that.
So read on, and we’ll address how our gear reviews work, why we approach them the way we do, and why we think they’re—surprisingly—still relevant in an age of online customer reviews, social media, and forums.
To do that, let’s take a look at some of the alternatives to the kinds of long-form independent gear reviews we do, and weigh their pros and cons. Then, we’ll go over our approach and what makes it different.
We’d also love to hear your feedback on our practices. Please don’t be shy about letting us know what you expect from us, and what kinds of content and reviews are the most relevant and useful to you!
So here goes. First, a look at some of the strengths and weaknesses of the other approaches we’ve chosen to avoid:
The Problem with Forums
When I first started writing for SonicScoop in 2010, I was fed up with pro audio forums. The conflicting advice, the conflicting facts, the controversies that shouldn’t be controversies, and the questionable credibility of many of the posters. (Just who is this “neveguy69”, anyway? And can I really trust his views on API consoles?)
Don’t get me wrong, I loved audio forums for years, and spent far more time than is healthy or productive reading them, and even, *gasp*, commenting on them. (Yes, I am a nerd.) But at a certain point, I was just done.
Perhaps it was always something of a “love-hate” relationship. And maybe you can relate.
When online audio forums first emerged, they seemed like a refreshing alternative to the old legacy print media that had become increasingly ossified and uninteresting over the years. (With a couple of notable exceptions.)
By 2010, however, online forums, which once felt like a scrappy and welcome alternative to the norm, had come to dominate online conversations in the pro audio world, and a darker side started to become apparent.
One of the wonderful benefits of online forums is just how quickly you can get information and opinions from any number of sources and perspectives. The biggest drawback to online forums, of course… is just how quickly you can get information and opinions from any number of sources and perspectives!
When it comes to gear reviews, one of the most obvious problems with online forums stems from the fact that in many cases, posting is effectively anonymous.
That means that the glowing opinion you just read about a piece of gear may have actually been written by a public relations person, or other operative for the company that makes the product.
While the affiliations of the writer don’t make much of a difference when it comes to statements of fact or of logical reasoning, those affiliations do make a big difference when it comes to statements of opinion.
I don’t know about you, but in general, when I’m looking for statements of opinion, I’m not looking to get them from the manufacturer or PR agency behind a product. I already have a pretty good idea of what their opinion is likely to be.
Even when there is no direct conflict of interest at play, and you’re reasonably certain that the anonymous online review or opinion wasn’t written by an undercover PR agent for the product in question, forums can often exhibit their own weird and sometimes even cultish kind of behavior.
For instance, you’d think that there’s no good reason to praise or bash a product you’ve never actually tried. But you and I both know there are some forum participants who have done both those things. Heck, you or I might have even done it ourselves in a prior life!
(OK, I’ll admit it: Young 20-something me almost certainly did on at least one or two occasions. Let’s hope that’s all that I did wrong in my twenties. OK, it wasn’t.)
So let’s face it: Sometimes people on forums just want to have something to say, and may even comment authoritatively on things they’ve never really tried themselves to any meaningful degree.
It’s an interesting dynamic, really: Sometimes, forum participants don’t have a lot to add on their own and end up parroting second-hand opinions—preferably those that have already been approved by the group.
The actual, or even imagined, social reinforcement of this behavior can create a little dopamine kick that leads commenters to engage in what would otherwise seem like some pretty silly behavior.
Sometimes, the more independently-minded members of the crowd may comment on things they have limited experience with as well, and just end up making guesses about what their opinion of a product would be if they had actually used it at any length. Sometimes, people just like to hear themselves type.
And of course, sometimes, users will talk up the gear they already own, simply because they own it and they have a vested interest in keeping its resale value or prestige value high. They may also target what they see as competing products with negative comments.
While all these little quirks of human psychology don’t magically disappear when a publication hires an independent reviewer to check out a product, there is at least the possibility of creating a system of checks to minimize or even eliminate many of them.
So, are gear reviews and opinions on online forums useful?
Perhaps, and to some degree. They’re probably just about as useful as tips and advice on your career or your craft from people whose credibility and tastes and experience are mostly unknown to you.
Buyer beware. Separating the wheat from the chaff in this environment is possible, but it’s good to be aware of the potential weaknesses.
The Problem with Customer Reviews
Whether it’s on Amazon or your favorite music retailer’s website, the user reviews on retail sites can be a great place to supplement your own product research. But they can also exhibit most of the problems mentioned above—plus some additional ones, too.
These mediums usually exhibit the problems of anonymity as well. Does that 5-star reviewer have a vested interest in seeing the product succeed? Does that 1-star reviewer have a vested interest in seeing it fail? And why do I always have to scroll to the 3-star reviews (if there even are any) to find a decently balanced look at the item?
On top of this, there’s yet another layer of integrity risk: Does the retailer have an interest in removing overly negative reviews, or even middling ones?
Is that even against their terms of service? Or is there enough wiggle room to remove reviews that might reduce the likelihood of a purchase occurring on the site? Even if the retailer has a good deal of integrity here, could they be pressured to remove bad or middling reviews by a brand that feels its product was misrepresented?
Different retailers may have better or worse policies about this kind of thing. (To be fair, similar criticisms can be levied at journalistic enterprises as well, especially if they accept advertising, as most of us do.) But it’s good to be aware of this additional potential area for concern.
In the case of both forums and online user reviews, even when the stars align and all goes well, and you do find an honest review or opinion of a product you are interested in, and there are no vested interests at play, how can you identify it as such and separate it from all the others?
And, even if you could identify the truly impartial reviews in some cases, is it well written? Is it thorough enough? Does it go over everything you were hoping it would? Does it speak to your potential use case or your concerns?
Don’t get me wrong, I actually love online user reviews. I read them. I use them. I even rely on them to a degree. When a retailer’s policies are good, and there is a large enough quantity of reviews of a product to get a fairly well-randomized sample, they can be very helpful. (That latter criterion is a hard one to meet in an industry as niche as ours on all but the most popular products.)
But in general, the best reviews I’ve encountered in my own life, both inside the audio world and out, tend to come from dedicated publications and blogs written by very nerdy people who are thoughtful, articulate, and the exact target market for the product in question. I find them even more useful when they give me an overview of the competing options and help contextualize and compare them for me.
Our Approach
I’m not going to lie to you and tell you that just because a journalistic enterprise is responsible for a review, it is necessarily going to be a good one, or that it’s going to be free from undue conflicts of interest.
As the world is increasingly realizing once again, journalism and media in general aren’t sacrosanct and “fake news” is a thing. It’s a good idea to find sources you can trust most of the time, and then keep them on their toes.
These days, when so many publications are overwhelmingly ad-supported rather than subscription-supported, how can a publication keep conflicts of interest and pressure from sponsors from seeping into its reviews?
It’s worth bearing in mind that even in the subscription-like model of Patreon sponsorship and the like, retailers and product-makers can be among the publication’s patrons. Even the few publications in our field that do charge subscription fees usually get a meaningful enough amount of revenue from sponsors that they still need to put systems in place to prevent conflicts of interest.
I’m aware that some of our competitors essentially sell brands a predetermined amount of positive review coverage as part of a larger ad package. We do not do this, ever.
Rather, we have two distinct types of content:
1. “Reviews”, which are impartial, done by a third-party writer with no direct relationship with the brand, and which cannot be purchased.
2. “Sponsored Content”, which is done in partnership with a brand, and does not pretend to be an impartial review. This may come in the form of a tutorial or product showcase, in which we explicitly thank the brand and otherwise make clear that we are creating the content in partnership with them.
Here are the safeguards we put in place to protect against conflicts of interest for reviews in our own publication. Tell us what you think of them:
1. No writer may write a “review” of a product that they officially endorse or get any kind of payment to promote.
This sounds like it should be a no-brainer, but you’d be amazed at how often I get requests to the contrary!
There are apparently publications out there that allow writers to “review” products, and also make an income off each product that is sold by means of that “review”. Or, they may get a lump payment from the product-maker for helping to promote the product.
In my book, those aren’t “reviews”. They’re endorsements, advertisements, or sponsorships, which is fine, but I want any publication I read to disclose them as such.
Don’t get me wrong: Advertisements are great, and I’m actually glad they exist. The good ones help me find out about useful things I might otherwise not know about. Likewise, I have no problem with people endorsing products that they believe in and getting paid for helping to drive sales. I just don’t like to hear them described as “reviews”.
On our site, we want the reviewer to work for you, not for the brands they cover, so we pay them a flat fee to write any reviews for the site. They never get paid directly by a brand they are reviewing.
There is one bit of potential grey area here that we are actively considering: If a publication is in the habit of using Amazon links or a retailer’s affiliate links whenever products are mentioned, it could be OK in my view, particularly in the context of a collection of options.
As long as the writer of the review isn’t getting paid based on how much product they are moving, then it is possible to avoid a conflict of interest on the reviewer’s part.
Some major and reputable publications do this. I’m on the fence on this one, and would love to hear your view as a reader. We’ve experimented with the idea on and off, and although I don’t love it, we have not declared it off limits as of yet.
2. The only perk a reviewer can get from a brand is the opportunity to purchase the review unit or obtain an NFR license for software.
Here’s the one significant perk reviewers can get from a brand: Sometimes, manufacturers will allow a reviewer to buy a used review unit at cost if they like it.
Similarly, software brands may issue a permanent license to a reviewer so they feel comfortable using it on real professional projects, because they know they’ll still be able to recall the session later.
To us, that is not a conflict of interest. Why? Think about it for yourself:
Why would you lie and say you loved a plugin you hated so that you could get it for free? Where’s the benefit?
Since it’s an NFR (which means “not for resale”) you can’t make any money by reselling it to someone else. All you get is a plugin you can’t stand and will probably never use. (Yay, I guess?)
The same goes for buying the used demo unit you evaluated for review: Why would you lie and pretend to like a product you hate just so you can buy it for yourself? At best, you could maybe immediately sell it for what you bought it for, so there’s no material gain. I just don’t see the conflict.
3. We do not “assign” reviews, and no sponsor is guaranteed a review.
Reviews at SonicScoop are writer-driven. Every quarter, we email our contributors with a list of potential products for review that we think seem particularly interesting. They can also bring us any product that they might be interested in checking out.
This turns out to be a great model, because writers are selecting, for themselves, the kinds of products they are the most curious about. They effectively are the target market for the product and want to see if it solves a problem that they themselves have. They’re not just hired journalists who sit at a desk all day and write about gear. They are working pros in the field trying to decide if the products out there actually meet their needs.
Our major criteria for green-lighting a one-off review are:
1) Is it new, within the past year?
2) Are enough of our readers likely to be interested in the product that it makes sense for us to review it?
3) Have we reviewed it already?
Whether or not the brand is a sponsor never comes into play. In fact, many or most of the brands we review are not sponsors and very well may never be sponsors. The best advantage a sponsor gets is:
1) The editors are more likely to know that they have a new product out and are more likely to make sure it gets onto the list for possible selection by a writer, and
2) If they’re advertising in the publication, our writers are more likely to know about and be interested in checking out the product, because they usually read the magazine as well as write for it.
4. No brand is guaranteed a positive review.
No brand or sponsor is guaranteed a review, much less a positive one. That said, you may notice that many of the reviews tend to skew positive, or at least diplomatic, more often than they skew harshly negative.
Larry Crane of Tape Op once ran a piece explaining why this tends to happen even in a publication that is relatively free from conflicts of interest. I have a few thoughts about why that’s the case here as well:
1) Since writers select review products for themselves, they are likely to pick products that they want to like, and that they reasonably think might solve their problems.
2) We encourage writers to think not only of their own use cases, but of the use cases of other potential users, who may have different needs. Even when writers end up not liking a product at all for their own purposes, it is part of their job to put themselves in the shoes of readers and think of all possible use cases. If they can think of a user the product would be good for, they wouldn’t be doing their job if they didn’t mention it!
3) Considering how difficult and expensive it is to launch a product—particularly one that seems interesting to working professionals in the field—most products that make it this far don’t totally suck. Even if they’re not right for you (or for the reviewer) chances are they are good for someone somewhere. In the end, a big part of a good reviewer’s job is not to say “it sucks” or “it’s great”. Rather, the biggest part of their job is to speak to a product’s strengths and weaknesses, and evaluate how it fits into the greater context of all the other competing products out there in the world.
4) Over the years, there are only a couple of cases I can think of where a product was so bad that we didn’t run a review. In each of these cases, the review didn’t happen because the writer was so unimpressed with the product that they decided they didn’t want to bother reviewing it at all! They would have just rather spent their energies doing something else instead. We have never cut a review simply because the reviewer didn’t like a product—or because a brand didn’t like the review. And we never will.
5. All reviews get fact-checked.
We’ve just looked at a few of the simple protections we put in place to help make sure no conflict of interest makes for a disingenuous or falsely positive review. But another potential problem with reviews that’s equally bad is that they can be inaccurate, incomplete, shortsighted, or disingenuously negative.
In the end, reviews that are falsely negative are no better an outcome than reviews that are inappropriately or overly positive. To help protect both the product-maker and the end reader from that outcome, we let product-makers fact check our reviews.
What? We let the product-makers read our reviews before we publish them!? Gasp!? My pearls!
Yes. Absolutely. Why? Because no one knows better than they do about whether we’ve gotten an important fact about the product wrong. If we didn’t do this, we wouldn’t be living up to the level of integrity or added value that our readers expect from us.
We always want to make sure our reviews are factually correct down to the last decimal, and we want to make sure that the reviewer does not misrepresent the feature-set of the product or the design philosophy that went into making it. Plus, if the reviewer has completely neglected to think of a potential use case where the product’s features would make sense, we want to know about it.
Of course, we do sometimes have to be quite firm in letting product makers know that the opportunity to fact check a review is not an opportunity to rewrite the review, or to ask the writer to change his or her opinion of it.
Most sponsors and brands get this, and it’s usually not a problem at all. Every once in a while a brand will push a bit harder than they should, and we have to push back a little. So it’s a good thing we stand between the writer and the brand being covered to help with this.
But every once in a while, the reviewer really did miss something important or make a mistake, and the product maker will bring up a good point. We’d be doing a disservice to readers if we didn’t bring this feedback back to the writer for consideration.
In all the time we have been doing this, I have never heard of a writer who thinks we forced them to change their opinion based on a brand’s feedback. (Honestly, we don’t pay enough for one-off gear reviews that they’d ever let us get away with it!)
An Even Better Alternative?
To be honest, individual gear reviews are not a core part of our business here at SonicScoop, even though we run one each week. Because so many of these products are so niche in their focus, and because there are so many quick-and-easy alternatives to the kind of exhaustive individual gear reviews we do, this content drives less traffic than almost any other type we run.
But just because one-off gear reviews aren’t a huge driver of traffic for us, doesn’t mean we can skimp on how much effort we put into making sure we get them right and do them with integrity.
They’re also a great place to test out new writers. They can be a nice perk for readers who want this kind of in-depth focus, as well as for reviewers who want to be able to evaluate a product at length, and for brands who get a little extra coverage without having to blow up their marketing budget.
Done well, individual reviews are an everyone-wins kind of scenario, even though they get less traffic than some other types of content. As long as we have other types of content that are popular enough to help subsidize them, we will likely be able to continue doing them.
That said, from time to time, we have considered the idea of completely eliminating individual gear reviews and replacing them exclusively with gear roundups and shootouts, which are often orders of magnitude more popular with our readers.
In an era of instant user reviews, forums, social media chatter, and easily accessible manufacturers’ websites, do individual gear reviews really still make sense at all? Or is the greater service for us to focus our energy on creating roundups and buyers’ guides that help readers navigate a whole variety of products? It’s an open question. You tell us.
Regardless of their low traffic compared to the amount of work that goes into them, we think there’s a certain unquantifiable value to running individual gear reviews for us, for our readers, for product makers, and for the pro audio community at large.
That said, I’m sure there are a lot of recording studios out there that thought they had some great unquantifiable advantage that are currently out of business! So it’s always worth it to question our own assumptions.
What do you think? Do you feel our standards for reviews are up to snuff?
Do you think we should keep doing individual gear reviews or focus our energies on providing you with different types of content?
Do you just want to brag that you made it to the end of a 3,900-word article about gear reviews? Wow, you really are an audio nerd.
Whatever your view, thanks for reading, and please tell us all about it in the comments below!
Justin Colletti is a mastering engineer, writer and educator. He edits SonicScoop.
Please note: When you buy products through links on this page, we may earn an affiliate commission.