Posted to rec.food.drink.beer,alt.beer
From: Dave Witzel
Subject: Ratebeer v. Beer Advocate

Ernest wrote on 24 Jul 2007:
> Dave Witzel wrote:
>
>> ...while Ratebeer has the ticker mentality run rampant; cogent,
>> thoughtful reviews of beers are pretty much frowned upon

>
> Your last comment is a bit backwards; if anything, the reviewers
> that write longer (either more poetic or technically
> descriptive) reviews tend to get a lot more credit and "props"
> from fellow members in the forums.
>
> But RateBeer has a ton of stats available (unlike BA), which can
> lead to some users pursuing number-goals beyond just the total
> number of beers they've tried. In some cases that can lead to
> ticking, but certainly not as a rule.


There are two sets of people on both sites: tickers, and people who
have the taste buds and grey matter to write cogently on the
thousands of beers they've had over the years. Ratebeer is set up
to encourage "ticking", and both the threads in its forums and the
packs of tickers you see at beer events tend to foster that
mindset.

Of course not all prolific reviewers are tickers, but once you've
seen the ticking ways firsthand, it's harder to take the site
seriously. And I kind of *like* Ratebeer.

(next section deleted as it, on further review, reinforces what I
wrote anyway)

>> wealth of reviews that get posted from a single bottle of the
>> rare beer of the moment (think 20+ reviews from one bomber),

>
> You're exaggerating a bit there (even in shared-bottle
> situations, it's rare for people to take less than 3 or 4
> ounces, so even in extreme cases you'd not get more than 6 or 7
> people on a single bomber), but this is certainly an issue that
> gets argued. Even more controversial is rating from single 1-oz
> samples at GABF. The question comes down to "how large of a
> sample size, and how many of them, does it take to get a 'good'
> rating?" The answers range between both extremes, and is
> different for every person you ask.


When I see dozens of reviews based on the same growler of a beer --
easy to track when it's tied to an event, such as Dark Lord Day --
or seven or eight people huddled around shared pints at a Pizza
Port event or at a bar hosting a day of beers not normally
available in the area... you tend to draw conclusions.

Again, naturally, there are those, undoubtedly like yourself, who
take thoughtful notes when the situation presents itself and don't
whip out the notepad every time they go out.

>> gauge the "best beer" ratings; either could be used to gauge
>> what the tickers are lining up for, however.

>
> Not really. Again, it's not about the tickers. What you mainly
> end up with on the "top" lists are a bunch of full-throttle
> beers that are bold, hoppy, etc. (Impy stouts, barleywines,
> DIPAs, etc.). It's not a tickers' top list, it's a list of the
> boldest beers with the sexiest, most aggressive aromas/flavors.


It's a herd mentality, in many cases. Scores appear to be defined
by what you said, plus the difficulty of obtaining the beer.
Someone, or some brewery, develops a reputation, and suddenly every
beer put out in limited release automatically becomes the best beer
since Jim Cibak created the universe. Some awful messes get ranked
really, really highly due to the difficulty and/or expense of
obtaining them. It's distressingly obvious, and a big reason why
any set of rankings will always, always, always need to be taken
with a big ol' salt lick.

But I doubt I'm bursting anyone's bubble here.

Witzel