You can probably trust any reviews written by this guy!
Recently, our own Carly Z pointed us GD folks to a very interesting and, for me, thought-provoking article about online reviews and review blogs published to the web–like this here one that you’re reading–by Al Gauthier of “Living Barefoot”. (N.B.–I don’t follow the “living barefoot” guidelines, but if you saw me paddling around my workplace in my stocking feet, you would know that I’m closer to it than a lot of high-tech types.) Gauthier’s (fairly open) question was: in an online world where there’s no barrier to creating blogs and publishing your opinion, what is the value, if any, of an online-published review?
And it got some of us here at GD thinking. There’s no question that centrally located, unfiltered reviews are a pretty mixed bag at best, and are often flooded with fake “reviews” planted by the software makers themselves to make their products look good. It’s what statistics nerds call a “self-selected universe”, which generates notoriously unreliable data. There are also, as Gauthier pointed out, web sites popping up all over to put out “reviews” that are really nothing more than a front for the companies themselves. Or conversely, trolls (or competitors!) could do the same thing, only trying to make the products look bad. How do you know?
So the question is, how do we, ordinary schlub consumers, find reliable reviews of the products we’re interested in?
In my case, when I read reviews of anything that are posted in a central location–Amazon, iTunes, or someplace like that–I sort by “Most Negative” and only read those. The ratings themselves are not that useful–what’s a feature for one person may be a bug for another–but negative reviews at least list the things that bugged the reviewer, whereas the “THIS IS AN AWESOME APP–BUY IT NOW!!!!!” reviews are, essentially, useless. But if I know what people thought were the worst things about a product, I can at least then make an informed decision.
Another important point is the distribution of the ratings. If something gets only 5-star and 1-star ratings, it’s best to be suspicious–a lot of the five-star ratings may be marketing fluff planted by the company and its proxies, and the 1-star ratings may be there just because someone was irritated that the app crashed back in version 1.1, even though it’s on version 2.3 now.
Finally, I have built up a list of sites and reviewers whose opinions I trust. If Dan of Gear Diary, who has many times expressed his admiration for various Apple products, reviews an Android device and gives it a rave, that’s significant. If Carly Z tells you an eReading app is excellent, that’s significant. And so on.
The key is to gather the data. What bugs have been reported against the app? Were there problems in one version that got cleared up in a subsequent one? What does your “trusted list” of reviewers say about it? Did Julie of The Gadgeteer love it, but that idiot Doug of Gear Diary hate it? Maybe that’s just because Doug hasn’t liked a new product since the Palm Pilot 1000 blew his hair back–you have to take those things into account. And then, when you have all the data, make your purchase.
At least, that’s how I do it. How about you?