The rankings were first published in Brass Band World (now BBW) magazine in October 1991.

As Colin Archibald said in his article, the inspiration for the rankings was the method used at that time to rank professional golfers.  In fact, several of the 'tweaks' introduced to the golf rankings over the following years were present in the band rankings from day one, so the BBW rankings were truly innovative and ahead of their time!

In essence, bands are awarded points for competing in so-called 'ranking events'.  The points are accumulated over a 3-year period, weighted in favour of the most recent events, and averaged over the number of contests the band has played during that 3-year 'rankings period' (see the worked example below).

The technical bits

In the early days of the BBW Rankings, a 'ranking event' was a contest featuring at least one top-20 band; for a long time afterwards, the definition was a contest with at least one top-50 band in the field.  Now, with the expansion of the rankings to include 'international' bands, a ranking event is one that includes at least one band ranked in the top 100.

The number of top-100 bands taking part in a particular contest determines the 'strength of field' for that contest, and hence the number of ranking points available for each placing in the final result.  The rankings of the bands present also have an effect: a contest featuring the top 10 will be far 'stronger' than one whose line-up contains 10 more lowly ranked bands.

The ranking points earned by each band are then 'weighted' to give the most prominence to the most recent month, with the weighting eroding slowly over a 3-year period.  On average, the weighting in each 12-month period is 3 times that of the preceding 12-month period.
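The exact weighting schedule is not spelled out above, but it can be inferred from the worked-example table further down.  The sketch below assumes the unadjusted weights rise linearly within each year exactly as printed in that table; this is a reconstruction, not a published formula.

```python
def unadjusted_weight(month: int) -> float:
    """Unadjusted weight for a month in the 36-month window
    (month 1 = oldest, month 36 = most recent), inferred from
    the worked-example table rather than a published formula."""
    if month <= 12:                        # year 1: 1.00 .. 12.00
        return float(month)
    if month <= 24:                        # year 2: 13.15 .. 25.80
        return 12.00 + 1.15 * (month - 12)
    return 25.80 + 5.03 * (month - 24)     # year 3: 30.83 .. 86.16

def adjusted_weight(month: int) -> float:
    # Dividing by 13 reproduces the printed adjusted weights: the
    # year-1 weights then average exactly 0.5, year 2 about 1.5 and
    # year 3 about 4.5 - each year roughly 3 times the one before.
    return unadjusted_weight(month) / 13.0
```

For example, `adjusted_weight(36)` gives 6.63 (the most recent month) and `adjusted_weight(1)` gives 0.08, matching the table.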

The total weighted ranking points earned are then divided by the number of contests played, subject to a minimum divisor of 9.
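That final division can be sketched in a couple of lines (a minimal illustration of the rule above, not official code):

```python
def points_average(total_weighted_points: float, contests_played: int) -> float:
    """Average as described above: the minimum divisor of 9 stops a band
    that has entered only a handful of contests from shooting up the
    table on one or two good results."""
    return total_weighted_points / max(contests_played, 9)
```

So a band with 90 weighted points from only 4 contests averages 10.0 (divided by the floor of 9), the same as a band with 160 weighted points from 16 contests.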

Fundamental to the success of the rankings is that bands compete against each other often enough for an accurate picture of their relative merits to be established.  If we think of three bands - A, B and C - then it doesn't matter if Band A and Band C never meet during the 3-year rankings period, provided there are enough contests featuring bands A and B, and also contests featuring bands B and C.  Considering all the various results gives a reasonable idea of how bands A and C compare, despite their never having met!

There is a limit to how well the rankings work though and the relative rankings of bands will be most 'accurate' for those bands that compete often and, especially, across a variety of geographical areas.

A worked example

As a means of illustrating the 'mumbo jumbo' of the description above, let's look at the points average for the Fairey band as at 1 October 2009.  Fairey's points average was 18.64, placing them 11th in the rankings at that time.

Clicking on Fairey's name in the October ranking reveals their 3-year contesting record to that date and the ranking points awarded.  How do we arrive at the 18.64 average though?

Here is a table showing the month-by-month breakdown of all 36 months in the 3-year rankings period.  You can see the unadjusted and adjusted weights applied to the ranking points awarded (note how the adjusted weights average 0.5, 1.5 and 4.5 for years 1, 2 and 3 respectively, year 1 being the oldest), so that the 4.70 points earned for Fairey's 12th place in the 2009 British Open become 31.15 in the rankings formula.  Fairey played in 16 contests over the 3 years to 1 October, and dividing the 31.15 by 16 gives the 1.95 contribution to their monthly ranking.

The averaged (and weighted) ranking points earned for a particular contest recede in importance month by month until they disappear from the rankings completely after 36 months.  In the table below, you can see how much less significant (in rankings terms) the 2006 National Championship result has become relative to the more recent results.  One month later, the 2006 National will contribute nothing at all to Fairey's ranking.

Month 36 is the most recent month in the window; month 1 is the oldest.

Month | Unadj wt | Adj wt | Contest                           | Result | Raw pts | Weighted | Contribution
------+----------+--------+-----------------------------------+--------+---------+----------+-------------
Year 3 (weighting = 4.5)
   36 |    86.16 |   6.63 | British Open                      |     12 |    4.70 |    31.15 |    1.95
   35 |    81.13 |   6.24 |
   34 |    76.10 |   5.85 |
   33 |    71.07 |   5.47 | English National                  |      7 |    9.00 |    49.20 |    3.08
   32 |    66.04 |   5.08 |
   31 |    61.01 |   4.69 |
   30 |    55.98 |   4.31 | North West Area                   |      2 |   12.70 |    54.69 |    3.42
   29 |    50.95 |   3.92 |
   28 |    45.92 |   3.53 |
   27 |    40.89 |   3.15 |
   26 |    35.86 |   2.76 | Brass in Concert                  |      5 |   13.50 |    37.24 |    2.33
   25 |    30.83 |   2.37 | National Championship             |      5 |   18.40 |    43.64 |    2.73
Year 2 (weighting = 1.5)
   24 |    25.80 |   1.98 | British Open                      |     15 |    2.60 |     5.16 |    0.32
   23 |    24.65 |   1.90 |
   22 |    23.50 |   1.81 |
   21 |    22.35 |   1.72 | English National                  |      9 |    7.10 |    12.21 |    0.76
   20 |    21.20 |   1.63 | All England International Masters |      9 |    6.10 |     9.95 |    0.62
   19 |    20.05 |   1.54 |
   18 |    18.90 |   1.45 | North West Area                   |      3 |   10.10 |    14.68 |    0.92
   17 |    17.75 |   1.37 |
   16 |    16.60 |   1.28 |
   15 |    15.45 |   1.19 |
   14 |    14.30 |   1.10 | Brass in Concert                  |      4 |   14.10 |    15.51 |    0.97
   13 |    13.15 |   1.01 |
Year 1 (weighting = 0.5)
   12 |    12.00 |   0.92 | British Open                      |      7 |   12.50 |    11.54 |    0.72
   11 |    11.00 |   0.85 |
   10 |    10.00 |   0.77 |
    9 |     9.00 |   0.69 | English National                  |      9 |    6.10 |     4.22 |    0.26
    8 |     8.00 |   0.62 | All England International Masters |      8 |    6.30 |     3.88 |    0.24
    7 |     7.00 |   0.54 |
    6 |     6.00 |   0.46 | North West Area                   |      5 |    5.40 |     2.49 |    0.16
    5 |     5.00 |   0.38 |
    4 |     4.00 |   0.31 |
    3 |     3.00 |   0.23 |
    2 |     2.00 |   0.15 | Brass in Concert                  |      5 |   11.50 |     1.77 |    0.11
    1 |     1.00 |   0.08 | National Championship             |      7 |   12.50 |     0.96 |    0.06

Points average: 18.64
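The whole calculation can be reproduced in a few lines.  This sketch uses the rounded adjusted weights as printed in the table, so individual weighted figures differ from the published ones by a penny or two, but the final average still comes out at 18.64.

```python
# Fairey's 16 contest results over the 3-year window, listed as
# (raw ranking points, adjusted weight) pairs taken from the table above.
results = [
    (4.70, 6.63),   # British Open 2009, 12th
    (9.00, 5.47),   # English National, 7th
    (12.70, 4.31),  # North West Area, 2nd
    (13.50, 2.76),  # Brass in Concert, 5th
    (18.40, 2.37),  # National Championship, 5th
    (2.60, 1.98),   # British Open, 15th
    (7.10, 1.72),   # English National, 9th
    (6.10, 1.63),   # All England International Masters, 9th
    (10.10, 1.45),  # North West Area, 3rd
    (14.10, 1.10),  # Brass in Concert, 4th
    (12.50, 0.92),  # British Open, 7th
    (6.10, 0.69),   # English National, 9th
    (6.30, 0.62),   # All England International Masters, 8th
    (5.40, 0.46),   # North West Area, 5th
    (11.50, 0.15),  # Brass in Concert, 5th
    (12.50, 0.08),  # National Championship 2006, 7th
]

weighted_total = sum(raw * weight for raw, weight in results)
# 16 contests played, comfortably above the minimum divisor of 9.
average = weighted_total / max(len(results), 9)
print(round(average, 2))  # 18.64
```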

Don't take it too seriously!

For over 20 years now, the rankings have attempted to portray the relative merits of the leading bands in mainland Britain and (from January 2012) overseas.  Remember, though, that no rankings system is perfect.  Any system can only ever be based on contest results, which are themselves the product of a largely subjective 'art' - the adjudicators' opinions!  Brass bands are not like golfers or tennis players, whose scores can produce an objective ranking, at least on that particular day.

The rankings will always (I hope) lead to some healthy debate in band rooms across the country and if you don't agree, just blame the adjudicators as we've always done!