Monday, Aug 30, 2010

A Competition of Two Halves

In the previous blog I suggested that, based on winning percentages when facing finalists, the top 8 teams (well, actually the top 7) were of a different class to the other teams in the competition.

Current MARS Ratings provide further evidence for this schism. To put the size of the difference in an historical perspective, I thought it might be instructive to review the MARS Ratings of teams at a similar point in the season for each of the years 1999 to 2010.

(This also provides me an opportunity to showcase one of the capabilities - strip-charts - of a sparklines tool that can be downloaded for free and used with Excel.)

In the chart, each row shows the MARS Ratings that the 16 teams had as at the end of Round 22 in a particular season. Every strip in the chart corresponds to the Rating of a single team, and the horizontal position of that strip is based on the team's Rating - the further to the right the strip is, the higher the Rating.

The red strip in each row corresponds to a Rating of 1,000, which is always the average team Rating.

While the strips provide a visual guide to the spread of MARS Ratings for a particular season, the data in the columns at right offer another, more quantitative view. The first column is the average Rating of the 8 highest-rated teams, the middle column the average Rating of the 8 lowest-rated teams, and the right column is the difference between the two averages. Larger values in this right column indicate bigger differences in the MARS Ratings of teams rated highest compared to those rated lowest.
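(For anyone who'd rather not install the Excel add-in, here's a quick sketch of the same idea in R. The Ratings below are invented purely for illustration - they're not actual MARS Ratings - and base R's stripchart() stands in for the sparklines tool.)

  # A rough R analogue of one strip-chart row, using made-up Ratings
  set.seed(42)
  ratings <- round(rnorm(16, mean = 1000, sd = 25), 1)   # 16 hypothetical end-of-Round-22 Ratings

  # One strip per team; the red line marks the all-team average Rating of 1,000
  stripchart(ratings, method = "overplot", pch = "|", xlab = "MARS Rating")
  abline(v = 1000, col = "red")

  # The three summary columns: average of the 8 highest-rated teams,
  # average of the 8 lowest-rated teams, and the difference between them
  top8    <- mean(sort(ratings, decreasing = TRUE)[1:8])
  bottom8 <- mean(sort(ratings)[1:8])
  round(c(Top.8 = top8, Bottom.8 = bottom8, Difference = top8 - bottom8), 1)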

(I should note that the 8 highest-rated teams will not always be the 8 finalists, but the differences in the composition of these two sets of eight teams don't appear to be material enough to prevent us from talking about them as if they were interchangeable.)

What we see immediately is that the difference in the average Rating of the top and bottom teams this year is the greatest that it's been during the period I've covered. Furthermore, the difference has come about because this year's top 8 has the highest-ever average Rating and this year's bottom 8 has the lowest-ever average Rating.

The season that produced the smallest difference in average Ratings was 1999, which was the year in which 3 teams finished just one game out of the eight and another finished just two games out. That season also produced the all-time lowest rated top 8 and highest rated bottom 8.

While we're on MARS Ratings and adopting an historical perspective (and creating sparklines), here's another chart, this one mapping the ladder and MARS performances of the 16 teams as at the end of the home-and-away seasons of 1999 to 2010.

One feature of this chart that's immediately obvious is the strong relationship between the trajectory of each team's MARS Rating history and its ladder fortunes, which is as it should be if the MARS Ratings mean anything at all.

Other aspects that I find interesting are the long-term decline of the Dons, the emergence of Collingwood, Geelong and St Kilda, and the precipitous rise and fall of the Eagles.

I'll finish this blog with one last chart, this one showing the MARS Ratings of the teams finishing in each of the 16 ladder positions across seasons 1999 to 2010.

As you'd expect - and as we saw in the previous chart on a team-by-team basis - lower ladder positions are generally associated with lower MARS Ratings.

But the "weather" (ie the results for any single year) is different from the "climate" (ie the overall correlation pattern). Put another way, for some teams in some years, ladder position and MARS Rating are measuring something different. Whether either, or neither, is measuring what it purports to -relative team quality - is a judgement I'll leave in the reader's hands.

Monday, Aug 30, 2010

The Eight We Had To Have?

This blog addresses a single topic: amongst the eight teams that won't be taking part in the weekend's festivities, are there any that can legitimately claim that they should be?

In short: I don't think so, though the Roos do have a prima facie case.

Exhibit A: a summary of the bottom 8 teams' performances against the finalists.

None of the non-finalists defeated finalists during the season even close to half the time. In fact, amongst the eight of them they mustered only 9.5 wins from 44 games during the second half of the season.

Essendon has the best overall record, 5 wins and 9 losses, but four of the five wins came in the first 10 rounds of the season, after which the Dons went 1 and 6 for the remainder.

The Dons do have some justification for feeling a little aggrieved, however, in that they faced teams from the top 8 on 14 occasions, which is twice more than any other team in the competition. But even if you swapped two or three of their tougher fixtures for more winnable encounters and assumed that they won them all, they'd still fall short of the 44 points needed for a spot in the eight.

(By the way, Essendon's difficult draw was something that we noted before the season even commenced.)

Adelaide have the next best record against the finalists - which is one of the reasons their MARS Rating is above 1,000 - but a win percentage of 27% hardly screams "injustice". No other team racked up better than a 1 in 4 performance.

The generally dismal performance of the bottom 8 teams when playing teams from the top 8 hints at a fairly strong divide between the finalists and the non-finalists. This next graphic, I think, provides additional supporting evidence for this view.

Each pie depicts the win percentage that the relevant team recorded when playing teams from within the top 8 (left-hand pie) and from outside it (right-hand pie).

Scanning the left-hand pies you can see how much stronger, generally, is the performance of teams from the top 8 when playing other teams from the top 8 than is the performance of teams from outside the top 8 when playing the finalists.

The comparative performances of the two teams on either side of the finals barrier - Carlton and the Roos - are interesting. The Roos, apparently, have a better win percentage than Carlton when playing teams from within the 8 and when playing teams from outside the 8. How can that be when you consider that both teams finished with 11 wins and 11 losses this season and so must have the same overall win percentage?

It all comes down to ... the unbalanced draw (there's a topic I've not railed about for a while). Carlton have a 20% record against top 8 teams and a 75% record against bottom 8 teams, but met top 8 teams on only 10 occasions and bottom 8 teams on 12 occasions. The Roos, on the other hand, have a 25% record against top 8 teams and an 80% record against bottom 8 teams. But their proportions are reversed compared to the Blues'. The Roos played teams from the top 8 on 12 occasions and teams from the bottom 8 on only 10 occasions, and this difference in mix was just enough to have them finish on 11 wins, the same as the Blues.
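Here's that arithmetic laid out as a quick check, using nothing but the win percentages and game counts quoted above:

  # Win counts implied by the quoted win percentages and game counts
  carlton_wins <- 0.20 * 10 + 0.75 * 12   # 2 wins v top 8 plus 9 wins v bottom 8
  roos_wins    <- 0.25 * 12 + 0.80 * 10   # 3 wins v top 8 plus 8 wins v bottom 8
  c(Carlton = carlton_wins, Roos = roos_wins)   # both come out at 11 wins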

(The relative difficulty of the Roos' draw when compared to the Blues' was also noted in that same blog.)

To borrow a topical term, does this give the Roos a "mandate" for a spot in the eight? Well they certainly have a stronger case for inclusion than the Blues - particularly when you add in the fact that the Roos defeated the Blues 97-68 on the only occasion that they met this season - but does any team deserve a place in the 8 whose record against the finalists is no better than that of the teams that finished 13th and 14th?

I'd argue that neither Carlton nor the Roos truly deserve a spot in the eight - and neither does any other of the non-finalists. Leave them both out, I say, and give Sydney a bye in the 1st week of the finals.

(I did register some disappointment when the Roos missed a spot in the 8. For some time I've hoped that they would face Sydney in an important game one day and that they would pip the Swans in a tight contest on the basis of some clever tactical subterfuge. Next day, I imagined, an alert sub-editor would seize the opportunity and pen the following unforgettable headline: "Roos rues Roos' ruse". But, alas, that can never happen now.)

Sunday, Aug 29, 2010

Finalist v Finalist: Who Has the Best Record in 2010?

Twenty-three weeks of footy is over and the AFL's binary division has begun, with the sixteen teams now cleaved in two.

Let's take a look at how the finalists have performed when they've met one of their own.

Collingwood

They finished atop the ladder after overcoming what turned out to be the toughest draw amongst the finalists. In 12 of the home-and-away season's 22 rounds the Pies met another team from the top 8 - that's only two fewer than a maximum possible 14, and at least one more than every other team.

The Pies won 75% of these encounters and outscored their competitors by 28% in them, winning each contest by an average of almost four goals. As well, they recorded the best 1st, 2nd and 3rd quarter performances of all the finalists, blemishing their record only with a relatively poor 5 and 7 performance in final terms, ranking them 4th.

One other concern for Pies fans - aside from their alleged Collywobbledom - will be their scoring shot conversion rate (aka their ability to kick straight). At just 48.9%, their conversion rate is the poorest amongst the eight finalists. They've got away with this wastefulness by generating so many more scoring shots than their opponents - 30.2 per game, which is almost 3 shots per game more than Geelong, who are next best, and is at least 5 shots per game better than any other finalist.
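(If you want to see what those two numbers imply, here's the back-of-the-envelope arithmetic - nothing from the official stats, just goals valued at 6 points and behinds at 1:)

  # Points per game implied by scoring shots and conversion rate
  # (a goal is worth 6 points, a behind 1)
  points_per_game <- function(shots, conversion) shots * (conversion * 6 + (1 - conversion) * 1)
  points_per_game(30.2, 0.489)   # Collingwood: roughly 104 points per game despite the poor conversion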

Geelong

The Cats played other finalists 11 times this season, winning seven and losing four. Four of these contests took place in Rounds 18 to 21, during which Geelong went 3 and 1, which surely must have provided some level of confidence going into the season's main games.

In these 11 contests the Cats outscored their opponents by almost 25%, scoring about 20 points more than their rivals in each game. They've generated over 27 scoring shots per game but converted these at a relatively poor 54.5%, and conceded 24.5 scoring shots per game, which they've allowed their opponents to convert into goals at an impressively miserly rate of just 46.7% - the lowest amongst the finalists.

They've been strong across every quarter, though they have struggled on some occasions to generate points in the first term where, despite winning 64% of these quarters, overall they've actually been narrowly outscored by their opponents.

St Kilda

If you're a Saints fan, you've reasons to worry.

They met fellow-finalists on only nine occasions this season - the fewest of any finalist - and met none in the last five rounds. Perhaps more worrying is the fact that they've recorded a loss and a draw in their last two encounters with top 8 teams, meaning that they've not beaten a finalist since Round 14. 

Still, their early season form against the main contenders was strong - so strong in fact that, overall, they have the second best win-loss record of all eight teams in the finals.

I've noted before how much trouble the Saints have had scoring points and this affliction is very much in evidence in their performances against the other finalists. In these games the Saints have managed to score only 76 points per game, comfortably the worst performance of any of the teams. Their scoring deficiency has two causes: few scoring shots per game (21.0, also the worst amongst the finalists) and poor conversion (52.4%, better only than the Pies).

What's kept them in the hunt has been their defence. They've allowed just 21.3 scoring shots per game, which ranks them 1st on this statistic, and permitted their opponents to convert these scant opportunities at a rate of just 52.6%, which ranks them 3rd.

They've been slow starters in games against finalists, winning only one-third of 1st and 2nd terms while being outscored by 10-15% in each. Their second halves have been better, though not spectacular, which sees them rank 4th and 5th on quarter 3 and quarter 4 performances respectively.

Western Bulldogs

As you know, MARS Ratings suggest that the Dogs are being significantly underrated, but there's not a lot to support this assessment in a review of the Dogs' performance against other teams in the eight.

They've played 10 matches against other finalists, winning just six, and they've both scored and conceded about 93 points per game in those contests. Most recently, they met finalists in Rounds 20 and 21, losing on both occasions, which is part of what's contributed to their significant re-rating by the bookies.

Their statistics for scoring shot production, scoring shot conversion, and scoring shot concession all rank them mid-pack. What's hurt them has been the rate at which opponents have converted scoring shot opportunities: 58.5% of the time, easily the highest amongst their peers.

Similarly, their quarter-by-quarter performance is marred only by a single statistic - their final term results. They've won only 40% of final terms against finalists and been outscored by around 15%.

Sydney

Swans fans will take heart from their recent performances against the teams they're likely to meet in coming weeks.

Across Rounds 16 to 21 they met fellow-finalists on five occasions, winning four of these encounters and losing only one. Before this purple patch of form, the Swans had gone 0 and 6 against finalists, so their combined season record is an unremarkable 4 wins and 7 losses.

In these 11 encounters the Swans have been outscored by about 10%, scoring 85 points per game and conceding 94. On all the scoring related metrics - shots scored and conceded per game, own and opponent scoring shot conversion - they're ranked either 5th or 6th.

Their quarter-by-quarter performance is curious. In win-loss terms they've the worst 1st quarter performance of any of the teams remaining, having won 2 and lost 9 1st terms. However, they've narrowly outscored their opponents in these quarters.

In 3rd quarters their performance has been worse. They've won only a single 3rd term, drawn another, and lost the remaining nine, scoring 194 points and conceding 324 in doing so.

But they have the best final term record of all the finalists: won 8, lost 3, percentage 123. I'm not sure, though, that you want to make a habit of relying on barnstorming finishes in finals.

Fremantle

Freo have, I'd say, just done enough to get into the eight. Their MARS Rating is only 995.5, which makes them only the 3rd team in the last seven seasons to make the finals with a sub-1000 MARS Rating and the lowest-rated team to finish 6th on the competition ladder across all 12 seasons for which I've calculated MARS Ratings.

They've played 10 games against fellow-finalists, winning only four and being outscored by an average of about 22 points per game. Four of these contests came in the last six weeks of the season; they went 1 and 3 in these matches.

Fremantle has conceded more scoring shots per game (28.5) than any other team in the eight, though they've reduced the consequences of this profligacy somewhat by allowing these opportunities to be converted only 56.5% of the time - the third-best performance amongst all the finalists.

Their quarter-by-quarter performances have been fairly consistent and generally below average, brightened only a little by their 50% win-loss record and 104 percentage performance in 2nd terms.

Hawthorn

A glance at their performance record against fellow-finalists suggests that their 7th placed finish might be a tad misleading. Their 50% win-loss record is the 4th-best amongst the finalists and is underpinned by a 104 percentage in games against other teams in the eight.

They've performed well on two aspects of scoring performance: scoring shot production, where their 25.0 scoring shots per game ranks them 3rd, and opponent conversion rate, where their 49.1% rate ranks them 2nd.

But they've stumbled on the other two scoring dimensions. Their own conversion rate of 53.5% is only good enough for 5th spot, and their concession of 25.5 scoring shots per game ranks them 6th on this metric.

The Hawks have consistently started well in games involving other finalists. They boast a 7 and 4 record in 1st terms, which is 2nd best amongst all the finalists, and they've outscored their opponents 248-221 in these quarters. They've not generally been able to sustain this level of performance, however, and have won only about 40% of 2nd, 3rd and final terms, recording percentages of around 100 - a little less in the case of final terms - in doing so.

Carlton

Carlton have done reasonably well this year - whenever they've been playing a team from outside the eight.

Their record against top 8 teams is 2 and 8, with both wins recorded in the space of three weeks way back in Rounds 5 and 7. Since then the Blues have gone 0 and 7 against the finalists including back-to-back losses against fellow-finalists in Rounds 21 and 22.

During their 10 clashes with finalists, the Blues have been outscored by almost 4 goals per game as they've struggled both to create scoring shot opportunities (21.6 per game, ranked 7th) and to prevent their opponents from creating them (27.9 conceded per game, also ranked 7th). They have, though, been markedly respectful of the chances they've had, converting them at a rate of 56.9%, the highest rate amongst all the finalists. But they've also allowed their opponents to convert at a relatively high rate of 55.2%, ranking them 6th on this metric.

Quarters 1 through 3 have, on average, been best forgotten by Blues supporters. The win-loss percentages for the Blues for these quarters have been 30%, 20% and 40% respectively, and the scoring percentages of 75, 58 and 88 would read better were they cricketing rather than football related results.

The Blues have, though, generally finished well against their peers, winning 60% of final terms to rank 3rd on performances in this term, albeit by only scoring 3% more points than they've conceded.

Still, if you win a Grand Final by securing only 50.7% of the points ...

Sunday, Aug 22, 2010

Why Sydney Won't Finish Fourth

As the ladder now stands, Sydney trail the Dogs by 4 competition points but they have a significantly inferior percentage. The Dogs have scored 2,067 points and conceded 1,656, giving them a percentage of 124.8, while Sydney have scored 1,911 points and conceded 1,795, giving them a percentage of 106.5, some 18.3 percentage points lower.

If Sydney were to win next week against the Lions, and the Dogs were to roll over and play dead against the Dons, then fourth place would be awarded to the team with the better percentage. Barring something apocalyptic, that'll be the Dogs.

Here's why. A few blogs back I noted that you could calculate the change in a team's percentage resulting from the outcome of a single game by using the following expression:

(1) Change in Percentage = (%S - %C)/(1 + %C) * Old Percentage

where %S = the points scored by the team in the current game as a percentage of the points it had already scored in the season,
and %C = the points conceded by the team in the current game as a percentage of the points it had already conceded in the season.

Now at this stage of the season, a big win or loss for a team will be one where the difference between %S and %C is in the 6-8% range, bearing in mind that a single game now represents about 1/20th or 5% of the season, so a 'typical' %S or %C would be about 5%. Scoring twice as many points as 'expected' then would give a %S of 10%, and conceding half as many as 'expected' would give a %C of 2.5%, a difference of 7.5%.

Okay, so consider a big loss for the Dogs, say 30-150. That gives a (%S - %C) of around -7.5%, which, from (1), means the Dogs' percentage will change by about -7.5%/1.1 x 1.25, a drop of roughly 8.5 percentage points. That drops the Dogs' percentage to about 116.

Next consider a big win for the Swans, again say 150-30. For them, that's a (%S - %C) of 6%, which gives them a percentage boost of 6%/1.02 x 1.06, which is about 6 percentage points. That lifts their percentage to about 112.5, still 3.5 percentage points short of the Dogs'.
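If you'd like to check those two calculations yourself, here's expression (1) as a small R function, fed with the season totals quoted at the top of this blog:

  # Expression (1): change in a team's percentage from a single game's result.
  # points_for / points_against are season totals before the game;
  # game_for / game_against are the points scored and conceded in the game itself.
  pct_change <- function(points_for, points_against, game_for, game_against) {
    old_pct <- 100 * points_for / points_against
    pS <- game_for / points_for          # %S
    pC <- game_against / points_against  # %C
    (pS - pC) / (1 + pC) * old_pct       # change, in percentage points
  }

  pct_change(2067, 1656, 30, 150)   # Dogs lose 30-150: a drop of around 8.5 to 9 percentage points
  pct_change(1911, 1795, 150, 30)   # Swans win 150-30: a gain of about 6.5 percentage points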

To completely close the gap, Sydney needs its percentage gain plus the Dogs' percentage drop to exceed 18.3 percentage points, the percentage chasm it currently faces. Using this fact and the expression in (1) above for both teams, you can show that, to lift its percentage above the Dogs', Sydney needs the following to be true:

Sydney's (%S - %C) > 18.3% + 1.15 times the Dogs' (%S - %C)

(Remember that the Dogs' (%S - %C) will be negative if they lose, so that second term reduces what Sydney needs.)

Now my worst case 30-150 loss for the Dogs gives them a (%S - %C) of -7.6%. That means Sydney needs its (%S - %C) to be about 9.5%. So even if Sydney were to concede no points at all to the Lions - making %C equal to 0 - they'd need to score about 180 points to achieve this.
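Or, putting numbers to that:

  # Plugging the worst-case Dogs loss into the inequality above
  dogs_term    <- -0.076                    # the Dogs' (%S - %C) from a 30-150 loss
  required_syd <- 0.183 + 1.15 * dogs_term  # about 0.095, ie roughly 9.5%
  required_syd * 1911                       # roughly 180 points, even with %C equal to 0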

More generally still, Sydney need the sum of their victory margin and the Dogs' margin of defeat to be around 300 points if they're to grab fourth.

Sydney won't finish fourth.

Monday, Aug 16, 2010

Letting the Computer Do (Most of) the Work

Around this time of year it's traditional to work through the remaining matches for each team and attempt to codify what each needs to do in order to secure a particular finish - minor premiership, top 4, top 8 or Spoon.

This year, rather than work through all the combinations manually, I've decided to be lazy - purely for instructional purposes, I should add - and enlist the help of rule induction, a mathematical technique for deducing from a dataset statements in the form If A and B then C that describe key variables in that data.

So, for example, if you were to apply the technique to help describe the use of heating and cooling appliances by a household over the course of a few years you might collect information several times each day about who was home, what the outside temperature was, what day of the week and time of day it was, and whether or not a heating or a cooling appliance was turned on.

Using a rule induction algorithm, you'd be able to come up with statements such as this one: 

  • If Number of People Home is greater than 0 AND Outside Temperature is less than 15 degrees AND Time of Day is between 5:30pm and 11:30pm AND Day of Week is not Saturday or Sunday then Heating = ON (Probability 92%)

For this blog I provided a rule induction algorithm (the JRip Weka algorithm running in R, if you're curious) with the outputs from 10,000 of the simulations I used in my earlier blog, which included for each simulation:

  • The results of each of the remaining 16 games
  • The final ladder positions of each team if these were the actual results of each game

To simplify matters a little, and recognising that the main interest is not in exact ladder position finishes, I summarised each team's finishing position as either "1st", "2nd to 4th", "5th to 8th", "9th to 15th", or "16th".

The goal was that the rule induction algorithm would output rules of the form:

  • If X beats Y AND X beats Z AND ... then X finishes 5th to 8th
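(For the curious, here's a minimal sketch - not my actual script - of how JRip can be called from R via the RWeka package. The data frame, its column names and the toy labelling rule below are all invented purely so that the example runs end-to-end.)

  library(RWeka)   # provides JRip(), an R interface to Weka's RIPPER rule learner

  # Hypothetical stand-in for the simulation output: one row per simulation, one
  # column per remaining game (the margin from the first-named team's viewpoint),
  # plus the banded finishing position of the team of interest.
  set.seed(1)
  n <- 1000
  sims <- data.frame(
    Collingwood..v..Adelaide = round(rnorm(n, 20, 36)),
    Hawthorn..v..Collingwood = round(rnorm(n, -15, 36)),
    Carlton..v..Geelong      = round(rnorm(n, -25, 36))
  )
  # Toy labels standing in for the ladder position implied by each simulated set
  # of results (the real labels come from recomputing the ladder for each simulation)
  sims$Collingwood.Finish <- factor(ifelse(
    sims$Collingwood..v..Adelaide <= 0 & sims$Hawthorn..v..Collingwood >= 0,
    "2nd to 4th", "1st"))

  # Induce rules of the form (conditions) => finishing band (covered/errors)
  collingwood_rules <- JRip(Collingwood.Finish ~ ., data = sims)
  print(collingwood_rules)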

Rule induction worked remarkably well. Here are a few real examples of the rules that the algorithm offered up for Collingwood's fate:

  • Rule 1: (Collingwood..v..Adelaide <= 0) and (Hawthorn..v..Collingwood >= 0) and (Carlton..v..Geelong <= 0) => Collingwood = 2nd to 4th (168.0/2.0)
  • Rule 2: => Collingwood = 1st (9832.0/0.0)

Rule 1 can be interpreted as follows: 

  • If Collingwood loses to or draws with Adelaide (ie the margin in that game, couched in terms of Collingwood, is less than or equal to zero) AND Collingwood loses to or draws with Hawthorn AND Geelong beats or draws with Carlton, then Collingwood finish 2nd to 4th.

What's implicit here is that Geelong also beats West Coast but since, in the simulations, this always occurred when the other conditions in the rule were met, the algorithm didn't realise that this was an additional required condition.

As well, Collingwood can't be allowed to draw both its games otherwise Geelong can't overhaul them. Again, this situation didn't occur in the simulations I provided the algorithm, and not even the smartest algorithm can intuit instances that it's never seen.

I could probably have fixed both of these shortcomings by providing the algorithm with more than 10,000 simulations, though I'd pay a price in terms of computation time. Note though the (168.0 / 2.0) annotation at the end of this rule. That tells you that the rule could be applied to 168 of the simulations, but that it was wrong for 2 of them. Maybe the two simulations for which the rule applied but was incorrect included a Geelong loss to the Eagles or two draws for Collingwood.

Rule creation algorithms include what's called a "stopping rule" to prevent them from creating a unique rule for every simulation result, which might make the rules highly accurate but also makes them completely impractical.

Rule 2 is the "otherwise" rule and is interpreted as the predicted outcome if no earlier rule has its full set of conditions met. For Collingwood, "otherwise" is that they finish 1st.

The rules provided for other teams were generally quite similar, although they became more complex for teams when percentages were required to determine crucial ladder positions. Here, for example, are a few of the rules where the algorithm is attempting to model Hawthorn getting bumped into 9th by Melbourne: 

  • (Hawthorn..v..Fremantle <= -7) and (Port.Adelaide..v..Melbourne <= -14) and (Melbourne..v..Kangaroos >= 20) and (Hawthorn..v..Collingwood <= -39) and (Melbourne..v..Kangaroos <= …) => Hawthorn = 9th to 15th (54.0/2.0)

  • (Hawthorn..v..Fremantle <= -4) and (Port.Adelaide..v..Melbourne <= -7) and (Melbourne..v..Kangaroos >= 11) and (Hawthorn..v..Collingwood <= -59) and (Port.Adelaide..v..Melbourne >= -32) => Hawthorn = 9th to 15th (41.0/3.0)

Granted that's a mite convoluted, but nothing that a human can't recognise fairly quickly, which nicely illustrates my experience with this type of algorithm: its output almost always contains some useful insight, but extracting that insight requires human interpretation.

What follows then are the rules that man and machine have crafted for each team (note that I've chosen to ignore the possibility of draws to reduce complexity).

Collingwood 

  • Finish 2nd to 4th if Collingwood lose to Adelaide and Hawthorn AND Geelong beat Carlton and West Coast
  • Otherwise finish 1st

Geelong

  • Finish 1st if Collingwood lose to Adelaide and Hawthorn AND Geelong beat Carlton and West Coast
  • Otherwise finish 2nd to 4th

 St Kilda

  • Finish 2nd to 4th

 Western Bulldogs

  • Finish 5th to 8th if Dogs lose to Essendon and to Sydney AND Fremantle beat Hawthorn and Carlton
  • Otherwise finish 2nd to 4th

 Fremantle

  • Finish 2nd to 4th if Dogs lose to Essendon and to Sydney AND Fremantle beat Hawthorn and Carlton
  • Otherwise finish 5th to 8th 

Carlton and Sydney

  • Finish 5th to 8th

Hawthorn 

  • Finish 9th to 15th if Hawthorn lose to Fremantle and Collingwood AND Roos beat West Coast and Melbourne
  • Also Finish 9th to 15th if Hawthorn lose to Fremantle and Collingwood AND Melbourne beat Port and Roos sufficient to raise Melbourne's percentage above Hawthorn's
  • Otherwise finish 5th to 8th

Kangaroos

  • Finish 5th to 8th if Hawthorn lose to Fremantle and Collingwood AND Roos beat West Coast and Melbourne
  • Otherwise finish 9th to 15th 

Melbourne 

  • Finish 5th to 8th if Hawthorn lose to Fremantle and Collingwood AND Melbourne beat Port and Roos sufficient to raise Melbourne's percentage above Hawthorn's
  • Otherwise finish 9th to 15th 

Adelaide, Port Adelaide and Essendon

  • Finish 9th to 15th 

Brisbane Lions 

  • Finish 16th if Lions lose to Essendon and Sydney AND West Coast beat Geelong and Roos sufficient to lift West Coast's percentage above the Lions' AND Richmond beat St Kilda or Port (or both)
  • Otherwise finish 9th to 15th 

Richmond 

  • Finish 16th if West Coast beat Geelong and Roos AND Richmond lose to St Kilda and Port
  • Otherwise finish 9th to 15th

West Coast 

  • Finish 9th to 15th if West Coast beat Geelong and Roos AND Richmond lose to St Kilda and Port
  • Finish 9th to 15th if West Coast beat Geelong and Roos AND Lions lose to Essendon and Sydney sufficient to lift West Coast's percentage above the Lions'
  • Otherwise finish 16th

 

As a final comment I'll note that the rules don't allow for the possibility of Sydney or Carlton slipping into 4th. Although this is mathematically possible, it's so unlikely that it didn't occur in the simulations provided to the algorithm. (Actually, it didn't occur in any of the 100,000 simulations from which the 10,000 were chosen either.)

A quick bit of probability shows why.

Consider what's needed for Sydney to finish fourth.
1. The Dogs lose to Essendon and Sydney (I'd rate this at roughly a 15% chance)
2. Sydney also beat the Lions (roughly a 60% chance)
3. Fremantle don't win both their games (roughly an 80% chance)

Furthermore, combined, Sydney and the Dogs' results have to close the percentage gap between the two teams, which currently stands at over 25 percentage points.

But the 15% and 60% figures just relate to the probability of the required result, not the probability that the wins and losses will be big enough to lift Sydney's percentage above the Dogs'. If Sydney were to trounce the Lions by 100 points and Essendon were to do likewise to the Dogs, then Sydney would still need to beat the Dogs by about 91 points to achieve such a lift.

So let's revise the probability of 1 down to 0.01% (which is probably generous) and the probability of 2 down to 5% (which is also generous). Then the overall probability is 0.01% x 5% x 80%, or about 1 in 250,000. Not gonna happen.
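In other words:

  # The combined probability quoted above
  0.0001 * 0.05 * 0.80        # 0.01% x 5% x 80% = 4e-06
  1 / (0.0001 * 0.05 * 0.80)  # ie about 1 chance in 250,000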

(For similar reasons there are also no rules for Fremantle dropping a game but still grabbing 4th from the Dogs on the basis of a superior percentage.)