Missed goal: Germany’s fate in the 2018 World Cup demonstrates the limits of AI

Tim Gordon
Published in Good Audience · 4 min read · Jun 28, 2018


Ten things we can learn from the Artificial Intelligence (AI) predictions about the World Cup

It is hard to overstate the shock. Germany is heading home, humiliated by a 2–0 defeat to South Korea. This comes a mere four years after they beat Brazil 7–1 in Brazil on the way to winning the 2014 World Cup. There will doubtless be questions to answer.

Image from the 2015 Robot Football World Cup (image source: @HowWeGetToNext)

But it was not only the fans who expected Germany to do well. At least two separate organisations produced artificial intelligence (AI) models that predicted that Germany would make it to the finals.

According to Vice, a team from the Technische Universität Dortmund, the Technical University of Munich and Ghent University in Belgium crunched 100,000 scenarios to predict that Germany had the best chance of winning. Goldman Sachs, never knowingly outdone, crunched 1 million scenarios to come up with a Brazil win, after a final against Germany.
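To make “crunching 100,000 scenarios” concrete, here is a minimal Monte Carlo sketch in Python. It is emphatically not the academic team’s or Goldman Sachs’ model: the team strengths below are invented placeholders, and a real model would estimate them from data such as FIFA rankings and bookmakers’ odds rather than hard-code them.

```python
# Minimal Monte Carlo sketch of "crunching scenarios": assign each team an
# illustrative strength, simulate a knockout bracket many times, and count
# how often each team ends up as champion. Strengths are made up for
# illustration only; they are not outputs of any real model.
import random
from collections import Counter

strengths = {"Germany": 1.30, "Brazil": 1.28, "Spain": 1.27, "France": 1.25,
             "Argentina": 1.20, "Belgium": 1.18, "England": 1.10, "Portugal": 1.08}

def play(team_a, team_b):
    """One simulated match: win probability proportional to relative strength."""
    p_a = strengths[team_a] / (strengths[team_a] + strengths[team_b])
    return team_a if random.random() < p_a else team_b

def simulate_knockout(teams):
    """Run a single-elimination bracket until one team is left."""
    while len(teams) > 1:
        teams = [play(teams[i], teams[i + 1]) for i in range(0, len(teams), 2)]
    return teams[0]

N = 100_000  # number of simulated tournament scenarios
wins = Counter(simulate_knockout(list(strengths)) for _ in range(N))

# The output is a set of probabilities, not a certainty: even the team that
# wins most often loses in the large majority of simulated scenarios.
for team, count in wins.most_common():
    print(f"{team}: {count / N:.1%} of simulations won")
```

The point of the sketch is the shape of the output: a list of teams, each with a modest probability of winning, which is exactly the kind of result that then gets compressed into a headline about “the predicted winner”.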

Unless you took their advice to a betting site, no damage done. But not dissimilar machine learning technology is being used every day to make decisions in fields that do matter to you, and it will matter even more as AI takes off. So what might this teach us about the practical application of AI?

  1. AI makes predictions. To massively over-simplify, it looks for patterns in data and then uses these patterns to predict what might come next. The computer has not understood what it is processing; it has simply predicted the next potential fact.
  2. Probability is key: any data scientist will refer to model outcomes in terms of probabilities and the confidence levels attached to them. Certainty is not their trademark. In the academic paper, Germany had an 86.5% chance of making it out of the group stage and a 17.1% chance of winning the tournament (against 17.8% for Spain).
  3. Most humans don’t process probabilities well: most of us like to see things in binary terms, as ‘yes’ or ‘no’. Shades of grey, coloured in by maths, require considered thinking. So a 17.8% chance of victory becomes “the predicted winner”.
  4. This is reflected in our very language about decision-making and predictions. No politician ever won an election by discussing the probabilities of something happening. We like, and respond to, certainty.
  5. It is possible to make predictions when the factors and process can be clearly laid out and the system is ultimately a closed loop. Everything can be modelled and laid out for a computer to process. It is worth noting that the academic study got the broad results of 6 of the 8 groups right, or at least the scenarios with the highest probability ended up occurring.
Germany’s projected route to victory (Image source: Vice)

6. The data that is available will get priority, and this (obviously) drives the output: the academic team crunched “the FIFA rankings, each country’s population and their gross domestic product (GDP), bookmakers’ odds, how many of the national team players play together in a club, the player’s average age, and how many Champions Leagues they’ve won.” This data is available for all the teams in a standardised and comparable format, all presumably clean and appropriately labelled. Hence that is where the focus went (a minimal sketch of what such a feature table looks like follows this list).

7. But there is always more data than can be processed. What is not included here is effectively unlimited: how the players slept, personal chemistry, the state of the grass, the impact of the fans’ enthusiasm on the day… what gets processed is what is available. If there is no absolute formula for what makes a football team win, then it becomes impossible to model. This will not necessarily be true of many processes to which AI is applied; think of the optimisation of a factory process to make the perfect widget.

8. However, unexpected events can throw the smartest model — and indeed make its entire premise pointless or even harmful. If a fire damaged a portion of the widget factory then an AI management system would no longer be able to cope.

9. This is why humans will be needed in jobs even as the machines take over more and more roles. Complexity, dynamic change, and even the common sense required to deal with them remain major barriers to machine success.

10. The media can over-sell any story involving AI. The drum roll on AI stories often takes what is essentially an Excel spreadsheet on steroids and confers mysterious, magical powers on it because it is “AI”.
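Returning to point 6, here is a minimal sketch of the kind of standardised, labelled feature table described there. The column names follow the features listed in the Vice quote, but every number is an invented placeholder rather than a value the academic team actually used; the point is simply that the model only ever sees whatever columns happen to be available and comparable.

```python
# A hypothetical feature table: one row per team, one column per available,
# standardised feature. All values are placeholders for illustration.
import pandas as pd

features = pd.DataFrame(
    {
        "fifa_rank": [1, 2, 10],               # FIFA ranking
        "population_millions": [83, 209, 47],  # country population
        "gdp_per_capita_usd": [44000, 9800, 28000],
        "bookmaker_odds": [5.5, 5.0, 7.0],
        "club_teammates": [7, 4, 6],           # squad players at the same club
        "average_age": [27.1, 28.3, 27.5],
        "champions_league_wins": [10, 6, 14],
    },
    index=["Germany", "Brazil", "Spain"],
)

# Whatever sits in these columns is what gets modelled; how the players
# slept or the state of the grass never makes it into the table.
print(features)
```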

As an England fan, this leaves me with only one real concern: Goldman Sachs’ updated model suggests that England will make it to the Final.
