### 1. Introduction: Assessing football clubs' quality

Football results are extremely hard to predict. Due to the low-scoring nature of the game, surprising results are more common than in other sports, making the exact prediction of single matches' outcomes particularly difficult. In the long run, however, quality tends to pay off, and better teams finish ahead of clubs of lower quality in the final table. Seasons of 34 or 38 matchdays, as in the two leagues studied here, are not long enough to completely rule out statistical noise and the role of coincidence and luck, but tables based on them are certainly far more meaningful than single games. The fact that Barcelona were crowned Spanish champions surely did not surprise many people, and neither did Paderborn's relegation.

But how can the competing teams' quality be assessed? And how do these assessments translate into predictions? There is no generally accepted measure of individual players' or complete squads' footballing quality. Yet there are some attempts to provide objective evaluations that also make their results public. Some of these are assessed in this blog entry and compared according to their ability to predict teams' performances correctly. The three measures I chose are Market Value, Goalimpact and Club Elo. The main reason for this choice was public availability and comparability across teams and leagues. These factors also ruled out all European football leagues but two, since I found Goalimpact data only for the German Bundesliga and the Premier League. The number of observations is therefore limited to 38 (18 German and 20 English clubs from the 2014/15 season).

All three indicators share some features, but also have different characteristics which in turn might influence their predictive power. Goalimpact and Market Value are assessments of individual players, whose values are averaged in order to calculate a value for the whole team. Averaging is potentially a biasing step: the resulting value contains no information on the composition of the team, so two teams with equal values might in fact have very different preconditions to work with. Consider the extreme examples of one team which comprises players of exactly the same quality and one whose value is driven by outliers, especially if the squad is small. On the other hand, Market Value and Goalimpact are more flexible indicators than Elo, since their values change immediately if the overall quality of a team is altered through transfers. Elo itself is, to my understanding, not susceptible to transfers.
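The averaging problem can be illustrated with a minimal Python sketch; the squad values below are made-up numbers for illustration, not real player valuations:

```python
from statistics import mean, stdev

# Hypothetical squads: player market values in million EUR (invented figures).
balanced = [10, 10, 10, 10, 10]    # every player of equal quality
outlier_driven = [42, 2, 2, 2, 2]  # one star inflates the average

def squad_value(players):
    """Average player value, as used by Market Value and Goalimpact."""
    return mean(players)

print(squad_value(balanced))        # 10.0
print(squad_value(outlier_driven))  # 10.0 -- identical averages...
print(stdev(balanced), stdev(outlier_driven))  # ...very different spread
```

Both squads report the same team value, yet the second one depends entirely on a single outlier, which is exactly the information the average throws away.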

Goalimpact and especially Elo are based above all on past performances, i.e. results. Results themselves are an imperfect, yet acceptable predictor of future results. Goalimpact tries to overcome this issue by taking players' age and their expected peak performance into account, while Elo weights results according to opposition strength, which makes them sounder assessments of quality. The other side of the coin is that the two are, at least in theory, more objective than Market Values, which are based on users' subjective evaluations of players' quality and future potential, although this weakness might be mitigated by the wisdom of the crowd.

In spite of these differences, the values of the three indices for the 38 clubs in the sample correlate highly with each other. As can be seen in Graph 1, the strongest correlation exists between Market Value and Goalimpact. This might be because both are based on assessments of individual players which take into account past performances as well as future potential. Although the correlations of these two with Club Elo are lower, they are still quite impressive (an r² of around 0.8).
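As a rough sketch of how such pairwise correlations can be computed, here is a plain-Python Pearson correlation; the five club values are invented toy numbers, not the actual index data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy values for five hypothetical clubs:
market_value = [35.0, 20.0, 12.0, 8.0, 5.0]   # average value, million EUR
goalimpact   = [150, 135, 120, 112, 105]      # squad Goalimpact score

r = pearson_r(market_value, goalimpact)
print(round(r * r, 2))  # r² between the two indicators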

*Graph 1*

In order to examine which of these three indicators works best at predicting teams' sporting performances, the correlations of the respective values with the teams' points per game are presented in the following graphs. Data were gathered from the indicators' websites. In order to avoid endogeneity, i.e. values influenced by the footballing performances themselves, timing has to be taken into account. Market Value data were therefore taken from October 2014. This value is arguably the most convincing, since it is the only one updated between the two transfer periods. Taking the July value would bias results, since a lot of transfers happen in the weeks before deadline; consider only Louis van Gaal's massive spending on the likes of Di María, Rojo and Blind, which raised Manchester United's average Market Value from 14.4 million € on July 10th to 16.4 million € on October 23rd. Since the majority of games is played after the transfer deadline, the market value after the last transfers should be a more precise predictor than an earlier one. Goalimpact data were collected from the link above, and the value taken for Club Elo was the one before the first league game of the season.

### 2. Which indicator works best?

Each of the following three graphs shows the relationship between one of the indicators (on the x-axis) and the teams' performance (points per game on the y-axis). The points are coloured according to the league each team belongs to: red spots represent German teams and green ones English teams. The grey shades represent 95% confidence intervals, the blue line an OLS regression. All three correlations are statistically highly significant (p<0.001 in each case).
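The OLS fit behind these graphs can be sketched in a few lines of Python; the Elo and points-per-game figures below are toy numbers standing in for the real data:

```python
def ols_fit(xs, ys):
    """Least-squares line ys ≈ a + b*xs, plus the r² of the fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx                 # slope
    a = my - b * mx               # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot

# Toy data: indicator value vs. points per game (illustrative only).
elo = [1900, 1800, 1700, 1650, 1600]
ppg = [2.4, 1.9, 1.5, 1.3, 1.1]
a, b, r2 = ols_fit(elo, ppg)
print(round(r2, 2))
```

The r² returned here is the quantity compared across the three indicators in the graphs below.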


Graph 2 shows the relationship between Average Market Value and the sporting performance of the 38 teams during the 2014/15 season. The relationship is the second strongest, reaching an r² of 0.66. There are, however, also some interesting outliers: the teams which ended up highest above the regression line and outside the confidence interval are all German, while the underperformers are predominantly English.

*Graph 2*

The highest-rated teams, on the other hand, are assessed with great precision and all ended within the confidence interval, i.e. they performed within the expected range of points per game. One conclusion might be that lowly rated teams are about equally likely to over- and underperform overall, although different mechanisms seem to be at work in the two countries. The best teams perform more or less exactly as expected, although some had disappointing years (Liverpool, Dortmund). Then again, it is generally very unlikely to overperform if expectations are already very high; if the data predict that your team will collect around 2.2 or 2.3 points per game, there is little room to finish even better, so we should not be too surprised that none of the high-quality teams in terms of Market Value finished far above the regression line in Graph 2.

*Graph 3*

Goalimpact, as shown in Graph 3, displays some similar characteristics, although the values are much more dispersed, i.e. the standard deviation is much higher than in the case of Market Value. The level of correlation is slightly lower than in the former case and hence the lowest of the three indicators, although the relationship is still very strong in overall terms. There is, however, no such clear relationship between the league of a club and whether it finished better or worse than predicted; over- and underperformers are found in both leagues. Similarly, there is no relationship between a club's level of Goalimpact and its over- or underperformance, as there is in the case of Market Value, although the differences between expected and actual performance are again larger for lower-quality teams.

*Graph 4*

Finally, Club Elo (see Graph 4) turned out to be the best predictor of final performance during the 2014/15 season in England and Germany. The correlation reaches an r² of 0.71. The final performance in terms of points per game is well predicted especially for teams with very high and very low Elo values, although some mid-range teams over- or underperformed expectations significantly (plus Chelsea, which finished far above the regression line and confidence interval).

The advantage of Club Elo over the other indicators persists even if we include all of them in one single equation. For this purpose, a multivariate regression was calculated, with all three indicators as independent variables explaining the outcome, i.e. points per game. This procedure enables us to directly compare the explanatory power of each index against the others, but also to check whether one index compensates for the omissions of the others. The results (displayed in Graph 5) show that Club Elo is the only indicator which remains significant when the other two are controlled for. This means that the variance of teams' performance outcomes not explained by Club Elo is likely not a result of shortcomings of the index itself, but rather due to actual over- or underperformance of teams. Note also that the adjusted r², i.e. the overall predictive power of the model, is around 0.7, which means that Market Value and Goalimpact do not offer additional predictive value.
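A minimal sketch of such a multivariate regression, solving the normal equations directly, is shown below. The data rows are invented; in practice a statistics package would also report the p-values and adjusted r² discussed above:

```python
def multiple_ols(X, y):
    """Solve the normal equations (X'X)b = X'y by Gaussian elimination.
    Each row of X must include a leading 1 for the intercept."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    v = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):                      # forward elimination, partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            v[r] -= f * v[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):            # back substitution
        beta[r] = (v[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Toy rows: [intercept, market value, Goalimpact, Elo] -> points per game.
X = [[1, 35, 150, 1900], [1, 20, 140, 1800], [1, 12, 118, 1700],
     [1, 8, 112, 1650], [1, 5, 106, 1600], [1, 15, 125, 1720]]
y = [2.4, 2.0, 1.5, 1.3, 1.1, 1.6]
print(multiple_ols(X, y))  # intercept plus one coefficient per indicator
```

Each coefficient then measures one indicator's contribution while holding the other two constant, which is exactly the comparison made in Graph 5.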

*Graph 5*

### 3. Which teams are the outliers?

Based on these analyses, we can take the issue to the next level and see which teams actually performed the way the pre-season numbers would lead us to expect and which did not. To do so, I took the strongest predictor, Club Elo, and present the difference between the predicted points per game and the actual points per game each team achieved (in technical terms, these differences are called residuals). Graph 6 shows each team's difference between expected and actual points per game. Teams are ordered according to the size of the difference, and the dots are also coloured accordingly.
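Computing and ranking such residuals can be sketched as follows; the team names, Elo values and regression coefficients are hypothetical placeholders, not the fitted values behind Graph 6:

```python
# residual = actual ppg - predicted ppg from the Elo regression line.
# Toy inputs: (Elo before the season, actual points per game).
teams = {"Team A": (1900, 2.4), "Team B": (1800, 2.1), "Team C": (1700, 1.2)}
a, b = -6.0, 0.0044  # assumed intercept/slope of the Elo regression

residuals = {name: round(ppg - (a + b * elo), 2)
             for name, (elo, ppg) in teams.items()}

# Rank from biggest overperformer to biggest underperformer.
for name, res in sorted(residuals.items(), key=lambda kv: -kv[1]):
    print(name, res)
```

A positive residual marks an overperformer, a negative one an underperformer, which is the ordering shown in Graph 6.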

*Graph 6*

As can be seen, overperforming actually pays off. The biggest overperformers, Wolfsburg and Mönchengladbach, qualified directly for the Champions League group stage. Third in the list is Chelsea, which even won the title, collecting more than 0.3 points per game more than expected (in total terms, this amounts to 12 points more over the whole season than predicted, without which Chelsea would have finished second or third, depending on goal difference).
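Converting a points-per-game residual into season points is simple arithmetic; the exact residual of 0.32 below is an assumed illustrative figure consistent with the "more than 0.3" mentioned above:

```python
matches = 38          # Premier League season length
ppg_residual = 0.32   # assumed per-game overperformance

extra_points = ppg_residual * matches  # total points gained over the season
print(round(extra_points))
```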

On the other end, three of the four worst achievers also made it into European competitions. Dortmund will participate in the Europa League in spite of gaining a total of twenty points fewer than predicted. This shows that grave underachievement is a real threat for highly, but not highest, rated teams. The next underachievers, however, were all relegated (Freiburg, Burnley, QPR) or in severe danger (Hannover, Sunderland).

Over- or underachievement itself is in any case not related to overall season results. Consider Bayern München and Paderborn, which are among the teams with the lowest residuals. While Bayern won the league weeks in advance of the final matchday, Paderborn finished last and were relegated, i.e. were not able to upset predictions.

### 4. Conclusions

Club Elo turned out to be the best predictor of a club's league points. Does this mean that the other two indicators are worse? Well, not necessarily. The answer depends on what you are looking for. First of all, the results are based on averages over all clubs; how one single club will finish is still not easy to predict. The fact that Club Elo has the highest correlation with points per game expresses only that the residuals, i.e. the differences between predicted and actual points, even out more than in the case of Average Market Value and Goalimpact. Secondly, if you are a bettor, you will most certainly not be interested in the average residual, but rather in individual teams, for instance which team will win the title. In the case of the two leagues analysed here, only Average Market Value predicted both champions correctly: while all indices assessed Bayern München as the strongest German club, Goalimpact and Club Elo saw Manchester City ahead of Chelsea. Then again, Club Elo did the best job of predicting relegated teams (four out of six; I count Hamburg as relegated, although they managed to avoid relegation through their win over Karlsruhe in the play-off). The other two indicators predicted only two out of six relegated teams correctly. League position itself also depends not only on the performance of one particular team, but also on the clubs it is competing with; individual final positions are therefore an even more difficult outcome to predict. Thirdly, as stated above, Average Market Value and Goalimpact are averages of individual assessments. They provide information about which Club Elo is completely silent and are hence much more useful if you are looking for individual players' quality. Fourthly, a more general caveat concerns the sample these results are based on. Its size (38) is reasonably large for drawing conclusions using methods of quantitative data analysis, although a larger sample would of course be desirable. One possible but unlikely danger is that the sample is biased, i.e. that including only English and German teams influences the results. Repeating the analyses with clubs from more countries would therefore shed further light on the question and make the results more robust.

*Tip: Click on graphs to enlarge them.*
