Barley Mowat 

A Word on the BC Beer Awards

with 9 comments

This Saturday past saw my shadow grace the insides of Chapel Arts for the third annual BC Beer Awards and CAMRA’s Harvest Cask Festival. Being a two-for-one show in a new venue, I figured I should complain about the various bits of all this individually, so they can each get the attention deserved.

Venue: Chapel Arts is a great location with lots of character. It’s a former Chapel (duh) with lots of spacious rooms, and the kind of nooks and crannies you just don’t see in modern housing. They even opened up the garage to push in a food cart for dispensing non-barley based nutrition. Of course, this was somewhat disappointing as the invite rather explicitly promised us two food carts. Oh well, my Re-Up sammich was tasty.

How’d it stack up for a cask fest? I’d give it a solid pass. The space was attractive and interesting, and it contributed to a cosy, intimate feel, but conversely it sometimes made movement between the three main cask rooms and the food area difficult. Overall, though, I liked it.

CaskFest Organization: CAMRA Vancouver did a good job organizing this one. Tickets were available from a non-crashy website at a decidedly non-midnight time (cough, cough, VCBW), and entry into the event was not hampered by long lineups (cough). As well, given the size of the space involved, I didn’t feel it was oversold. There’s always a risk that the promoter will get a bit greedy and just keep on selling tickets, but even though this was sold out there was rarely a lineup at cask stations, and you never felt rushed while chatting with whoever was manning the brew, something I in particular look for at these events.

In fact, about the only complaint I have on the organizing side of things is the missing food cart. I love Re-Up, but the other garage door wept gently with the lost food possibilities the promised second cart would have provided. Sure, there was a bar upstairs with a quality cheese platter on offer, but I didn’t see any wheels on that sucker.

Brewery Participation: 24 beers from 21 brewers. I should be happy, right? Nope. I’m disappointed. The reason is that there was a surprising lack of casks for a CASK festival. Call me crazy, but I think putting the word CASK right in the title sets a certain expectation. Of there being CASKS. CASKS!

I was expecting more casks, is what I’m saying. CAAAAASKS!

Of those 24 beers, only 15 were casks. And of those 15 casks, only 9 were not simply cask conditioned versions of the brewery’s normal beers, and that makes a sad Chuck. Try harder, people.

Cask Highlights:
1/ Red Truck Kellerbier — While technically not a cask, this was definitely a unique one-off, and nearly perfectly executed. My vote for best in show. Smooth yet full of flavour. Chuck likey.
2/ Spinnakers Fresh-hopped Saison — I wouldn’t have guessed that fresh hops and a saison would work together, but the result was like summer in a glass: fresh, fruity, and a joy to drink.
3/ Lighthouse Belgian Quince IPA — The beauty of a cask is the ability to fiddle around and try new things. This beer is exactly that. The quince and NZ hops created a massive fruity body which I was not a huge fan of, but it did garner People’s Choice for Best in Show. While I didn’t love the beer, I loved the idea behind the beer.
4/ Storm Imperial Sour Cherry Stout — A well balanced sour from Storm. Wha? I had no idea they could do something subtle.

Cask Lowlights:
1/ Coal Harbour Sour Roggenweizen — I cannot stress this enough: finish fermenting your beer before you serve it. I tried the on-tap version at the Alibi immediately afterwards and, while still not a great (or even good) beer by any stretch, I didn’t immediately pour it out, like I witnessed many, many other people do with the cask version.

CMON! (breweries without casks):
1/ Big Ridge (Tariq’s ESB)
2/ Hoyne (Wolf Vine)
3/ Old Yale (Sasquatch Stout)
4/ Steamworks (Pilsner and Espresso Stout)
5/ Townsite (Porter)
6/ Tree (Jumpin Jack Pumpkin)
7/ Yaletown Brewing (Oud Bruin)

Try Harder (breweries that only cask conditioned a regular beer):
1/ Coal Harbour (Sour Roggenweizen — although I guess making it god-awful counts as a one-off?)
2/ Driftwood (Sartori — Although this gets a pass for being rare)
3/ Granville Island (Pumpkin AND Fresh Hopped ESB)
4/ Parallel 49 (Lost Souls Choco Pumpkin Porter)
5/ Phillips (Accusation)
6/ Vancouver Island (Iron Plow Marzen)

I dunno, guys, pee in it or something. Maybe stop off at Dan’s Homebrewing on the way to Chapel Arts and buy some coriander? How about ANYTHING!

The Awards:

I know what you’re thinking: we’re about to get ourselves some good old-fashioned Chuck beer nerd ranting. I mean, Townsite and Coal Harbour win first place in their categories? Steamworks Pilsner gets Best in Show?

Sadly, though, I know enough about the awards process and how the judging was done to know there’s no foul play afoot here at all. That doesn’t mean that Coal Harbour is suddenly brewing amazing beer, just that the process favoured them. How so?

First, let’s do our background homework and go look at the winners, courtesy of Urban Diner.

Now, let’s learn a bit about how beer judging works. Beer judges (especially BJCP-certified judges) aren’t judging beers based upon how much they like them. They’re judging them based on how closely they’re brewed to the ideal beer in that particular style. It’s kind of like judging art based on how much it looks like the Mona Lisa. It makes sense in a certain way, but I’m not sure it’s the best way to reward innovation.

Let’s take IPAs, for instance, which is BJCP Style 14. These are broken into three subcategories: English IPA, American IPA and Imperial IPA. Go read those descriptions. You know what does not fit that description? Most of the great BC IPAs, like Driftwood Fat Tug, Tofino Hop Cretin, and Lighthouse Switchback. Those guys differ in at least a few key ways, usually in terms of body or hop style.

For instance, no matter how much we want it to be otherwise, this is not going to win Most Practical Transport.

You know what fits that style? The beers that won. Central City Red Racer is about as fine an American IPA as I can imagine. CC Imperial IPA likewise for imperials. And while it caused some commotion on the floor, Derrick Franche up at the Whistler Brewpub puts together a mean American IPA. I think CC’s IPA is better, but each batch is different and I have no problem imagining Derrick’s was better than Gary’s on the day of judging (although it could be said that the CC IPA is too aromatic for the style).

Then there’s the problem of the blinding. The tastings were double blinded, so that both the tasters and the person serving the beers had no idea which beer was in which glass. The idea here is to prevent brand bias. If you took a CC IPA and poured 1/2 into a glass marked “Central City” and 1/2 into a glass marked “Bowen Island” you can guess what would happen. Blinding prevents that… in theory.

The problem comes when we look at the number of beers in those categories. Some have 20, 30 or even 40 entrants, but some have only a handful. When this happens, the awards organizers group similar styles together for judging, but in order to judge the beers fairly you have to tell the judges what the style is (remember our style guidelines from before). Now let’s look at the Sour/Brett category.

First, there were only four entrants, which means any beer had a 75% chance of medalling right off the bat, but there’s another issue: each of these beers is a different style. Picture this: you’re judging a sour beer in BC, and I put four glasses in front of you, all unlabelled, but I tell you what style each is. One is an “Oud Bruin” (yes, that’s actually a style), one is a “Flanders Red”, one is an “Imperial Flanders Red” and the last is “Some awful crap made over on Triumph Street.” See where I’m going with this? There’s no way a judge from BC wouldn’t immediately know who made which beer. Sure, not all judges were from BC, but many of them were, and as a result the accuracy of the rankings is heavily diluted.
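For the number nerds, that 75% figure is just medals divided by entrants, assuming the usual three medals per category and placings handed out at random. A quick back-of-the-envelope sketch (the helper name is mine, not anything official):

```python
from fractions import Fraction

def medal_chance(entrants: int, medals: int = 3) -> Fraction:
    """Chance a given beer medals if placings were assigned purely at random."""
    return Fraction(min(medals, entrants), entrants)

# Four entrants, three medals: 3/4, i.e. a 75% shot at the podium by luck alone.
print(medal_chance(4))
# A crowded category of 40 entrants: 3/40, so a medal actually means something.
print(medal_chance(40))
```

The point being: the fewer entrants in a category, the less a medal tells you about the beer.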

Combine all those things together and the awards are pretty much what I’d expect: random. More narrowly defined categories with lots of entrants are going to be more accurate while everything else is a coin toss. The takeaway? Steamworks makes a pilsner which is pretty much a picture-perfect pilsner, and perhaps Coal Harbour’s Smoked Ale is worth another look…. nah….

Written by chuck

October 15th, 2012 at 6:36 pm

9 Responses to 'A Word on the BC Beer Awards'

Subscribe to comments with RSS or TrackBack to 'A Word on the BC Beer Awards'.

  1. Speaking as one of the judges of the awards (I have a BJCP Provisional ranking right now, still working on my full qualifications), I agree with most of the criticism in terms of outcomes, and some in terms of process. Here are my thoughts on the whole thing:

    1) How do you empirically say “these judges are qualified”? BJCP qualification is an attempt to do that. Otherwise you run the risk of putting together a room of people who may or may not even be able to tell you whether a beer is malt- or hop-forward or not, let alone whether it’s flawed and how.
    2) If you use BJCP as your guideline, you do tend to initially measure a beer up to how closely it matches the style. Some are obviously flawed and worth rejecting outright, others may not be a perfect match though they have something intangible about them that keep them on your list. The goal of that part of judging is to, as objectively as possible, find the standouts from a flight.
    3) The standouts then advance to a mini-best of show round within their category. There could be 3, there could be 10, it depends on how many beers were entered. Within that best of show you talk through the merits and weaknesses of each one, and use a combination of objective and subjective criteria to determine which should be eliminated until you’re down to three. Then you choose which order the three should be placed. Different judges will have different criteria at this part of the process, some will vote up most-to-style beers while others will vote up beers that they personally enjoy more. Neither of those are really wrong, and if you get a diverse enough set of talented judges this should balance out.
    4) Throughout all this, you do your best to not let recognizing a beer bias your judgement. You would be surprised — no, shocked — to find out some of the beers I recognized but consciously eliminated/advanced due to weaknesses/merit. You’re judging the beer that was poured, not previous times when you know it has been far better/worse. If it doesn’t stand out in the flight, it should rightfully be eliminated. (And the reverse if it does.) Sometimes this can explain why a beer that has won in past years didn’t win this year.
    5) Yep, in BC we just don’t find a lot of interpretations of some styles. In some cases we don’t have enough of a certain style to even round out a larger parent category. (See last year’s grouping of sour beers with fruit beers. At least we’re past that this year.) Your options: automatically give a first place medal to someone just for showing up, don’t give any medals in those categories and ignore the beers that were entered, or use some discretion to put related beers together and try to judge each on its own merits. None are ideal, but I think the last option is the best you can do in that circumstance. The BJCP specialty beer category is the established model we’re following here; you can’t really compare a winter warmer to a Belgian IPA to a Sticke altbier, but they provide guidance for how to do it anyway, and that’s the way it works in competitions.
    6) The results depend on the luck of the pour as well as skill of the judges. Judging early in the day when your palate is fresh is easier, judging later in the day when you’ve been through multiple flights gets a bit harder. Organizers balance that by starting with the more subtle beers and working up to the more flavourful ones, but it’s still a factor. Breaking up the day into multi-day judging sprints would be silly, the only alternative here is having more qualified judges there. We’re getting more all the time, should be easier in a few years.
    7) “how did Y win when X is clearly a superior take on the style?” — fair question, except if X didn’t enter, it can’t possibly win. Not all breweries enter. Also, if a great beer is entered in the wrong category, it’s probably not going to win. This happens more often than you’d think. It’s not always the brewers themselves who are the ones handling entries, it could be a marketing person. Think a marketing person is likely to nail beer categories?
    8) I didn’t agree with all the outcomes either. Many were baffling. I remember advocating for some beers that I thought should have been placed higher up, and being outvoted by other very qualified judges. Hey, that’s taste for you, everyone has different preferences. Who’s right, them or me? Some people would have agreed with me, some people will agree with them. Even with BJCP as the yardstick, it gets subjective. As well it should. This is beer, not math.

    Maybe the biggest problem is that some of these behind-the-scenes factors aren’t exactly public knowledge, so people are free to look at the outcomes, speculate about how the results were arrived at, and assume the process was flawed. Maybe this comment will help a bit.

    Personally, I’d like to see a full list of all breweries who entered each year so that at least the “why didn’t X win?” question is cleared up. This seems like low-hanging fruit that the organizers should be able to easily address.

    For another judge’s take, see Ben Coli’s post:

    Dave S.

    15 Oct 12 at 20:02

  2. Judging professional beer is a different proposition to judging homebrew, in my opinion, and my feeling about how the BJCP has been set up is that it is still a homebrew judging set-up.

    The style guidelines vary in their accuracy and relevance to actual beers, particularly European styles. Some of these guidelines have been completely plucked from thin air and seem to have been arbitrarily made up. (I cite “Robust and Brown Porter” as a classic nonsense; it is a much-repeated so-called style that has no basis in history.) See Martyn Cornell’s book “Amber, Gold and Black” for a British beer and brewing historian’s description of this style and its evolution. (Don’t get me started on ‘English’ IPAs, with beers winning those categories that have no English hops in them.)

    The other problem with “brewing to style” is that it does not take into consideration the evolution of beer and the changes to it. A so-called “classic” porter is what? A porter with the grain bill from the early 1700s, when this beer was first developed and mentioned, or from the 1920s, when it was beginning its death throes in the UK? Different grain bills and alcohol, and therefore different flavours. Both are historic, but which is “classic”?

    My information from a judge who will remain nameless is that in the crazy discussions after the beers were tasted, whoever had the loudest voice tended to get their way (not my informant, BTW!).

    Why weren’t the marks just tallied, so that no personalities came into play? Each judge is as important as the other. This tends to be how wine and spirits are judged: there is some discussion only if two entries have very close marks for the top spot, in which case they are usually re-evaluated blind to double-check the results.

    Also sponsors should not have anything to do with the judging process. This is a pretty basic rule to ensure fair play. Just look at the Diageo / Brewdog debacle in the UK a few months ago:

    Anyway, the BJCP way is not the only or best way, and this homebrew judging program cannot be the basis for a qualitative assessment of beer in professional awards. There are long-established beverage industry standards; let’s not try to reinvent the wheel.

    The Beer Wrangler

    16 Oct 12 at 12:32

  3. I really enjoyed reading this post instead of working this morning.

    Dave, your insights into the judging process are worthy of their own post. Next time I suggest employing fisticuffs to resolve any disputes, aprons vs gloves meets beer tasting.

    I would really like to see a list of all the beers entered into each category. I’d also like to see a list of all the judges. If Ben Coli was a last minute replacement, I should be able to get in by “delaying” a few of the judges next year.

    I also think Brewery Creek should just enter every BC beer. I’d like to be sure Dead Frog’s collection of lagers didn’t win because they’re terrible, not because they didn’t enter.


    17 Oct 12 at 11:57

  4. Huge thanks to Dave and the Beer Wrangler for writing such massively informative comments. If this keeps up I won’t need to say anything in the initial post!


    17 Oct 12 at 12:25

  5. Dead Frog did enter the contest. Notably absent was Russell. It would be nice if every BC brewery entered, but you have to respect a brewery’s right to abstain from entering if they don’t like the contest.

    Whatever flaws the process might have, however, I absolutely reject any insinuations about judge bias, dirty tricks or corruption. The judges were blinded and while we could recognize some of the beers we judged, many of us were surprised by the beers we ended up choosing. It’s not as though any single brewery made a surprise sweep of the awards.

    At the end of the day, this contest is too unimportant for anyone to bother trying to corrupt it. It really is just a bunch of beer geeks gathered around arguing about which beer is best.

    I ended up judging because someone had a stroke. I’m pretty sure the stroke wasn’t caused by agents of a sinister craft brewery trying to influence the results.

    Ben Coli

    17 Oct 12 at 12:38

  6. @Ben I agree 100% with pretty much everything you said there.


    17 Oct 12 at 12:42

  7. Did someone make insinuations about judge bias, dirty tricks or corruption? I wasn’t trying to insinuate any such thing. I’m interested in who did the judging purely because I’m interested. Gerry sent me a list of the judges and I’m going to post it this evening. I admire his transparency.


    18 Oct 12 at 15:12

  8. […] to ask me to be a taster next year. For a few first hand accounts of the tasting process, check out this and […]

Leave a Reply