Were gamblers more accurate than the forecasting models in 2020?

Election forecasting models took a hit to their credibility for missing Donald Trump’s victory in 2016 but were generally accurate in 2020.  Because these models are dependent on polling, we can probably expect to see future issues with forecasting elections as pollster response rates drop, households disconnect their landlines, and social trust declines.  At the same time, prediction markets and election gambling continue to grow in popularity.  Have we reached the point where gambling is more accurate at predicting outcomes than election forecasting models?

I compared FiveThirtyEight’s 2020 presidential election model output to PredictIt share prices on a state-by-state basis.  FiveThirtyEight is the most respected of the election forecasting models, while PredictIt is the most popular prediction market in the United States (and one of the few places for Americans to legally gamble on election results).  To learn more about how PredictIt operates, please see their website or a previous post.

PredictIt and FiveThirtyEight did not just trade shares and publish forecasts, respectively, for all 50 states and D.C.; they also covered the five congressional districts that allocate their own electoral college votes: Maine-1, Maine-2, Nebraska-1, Nebraska-2, and Nebraska-3.  With two presidential candidates in each of these 56 contests (50 states, D.C., and five congressional districts), we end up with 112 predictions to measure.  I compared their predictions in two ways:

  • Brier score, which measures the squared difference between predicted probabilities and actual outcomes (lower is more accurate).
  • Whether each contest’s winner (state, D.C., or congressional district) was correctly called by FiveThirtyEight or PredictIt, treating any probability above 0.5 as a predicted win.
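Both measures are easy to compute mechanically.  A minimal sketch of each, using a handful of made-up probabilities rather than the full 112-prediction data set:

```python
# Toy illustration of the two accuracy measures; the probabilities and
# outcomes below are hypothetical examples, not the real 112 predictions.
def brier_score(probs, outcomes):
    """Mean squared difference between probabilities and 0/1 outcomes (lower is better)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def winners_called(probs, outcomes, threshold=0.5):
    """Count contests where the side given > threshold probability actually won."""
    return sum((p > threshold) == bool(o) for p, o in zip(probs, outcomes))

# Example: one candidate's probability in three contests, 1 = that candidate won.
probs = [0.625, 0.309, 0.955]
outcomes = [1, 1, 1]
print(brier_score(probs, outcomes))      # ~0.207
print(winners_called(probs, outcomes))   # 2 of 3 winners called
```

The second contest illustrates how the two measures can disagree: a 0.309 probability is a missed call under the 0.5 threshold, but it also inflates the Brier score far more than a near-miss like 0.955 would.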

Comparing FiveThirtyEight’s model output and PredictIt prices on the morning of the 2020 election, FiveThirtyEight had a marginally lower – meaning more accurate – Brier score than PredictIt:

Platform          Brier score   State winner
PredictIt         0.0405        109 / 112
FiveThirtyEight   0.0400        106 / 112

However, PredictIt gamblers were slightly more accurate than Nate Silver in terms of calling outcomes.  PredictIt picked the correct winner in 109 out of 112 predictions while FiveThirtyEight was correct in 106.  PredictIt gamblers missed only Georgia, and their prices implied both Trump and Biden winning Arizona.  (PredictIt treats each candidate separately, so ‘Will the Democrat candidate win Arizona in the 2020 Presidential race?’ and ‘Will the Republican candidate win Arizona in the 2020 Presidential race?’ are two distinct wagers.)  FiveThirtyEight incorrectly predicted three winners: North Carolina, Florida, and Maine-2.

Both sites predicted the same winner in 103 out of 112 predictions.  The state where the probabilities of PredictIt and FiveThirtyEight’s model differed by the greatest amount was Florida:

Platform          Trump    Biden
PredictIt         0.625    0.405
FiveThirtyEight   0.309    0.691
Margin            0.316    0.286

The contest with the closest predicted probabilities between PredictIt and FiveThirtyEight was Trump’s probability in Utah:

Platform          Trump
PredictIt         0.955
FiveThirtyEight   0.9544
Margin            0.0006
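Finding the widest and narrowest gaps is a simple min/max over per-contest margins.  A minimal sketch using only the contests quoted above (the contest labels are my own, and this is a tiny hypothetical subset, not the full data set):

```python
# Probabilities from the tables above; labels are made up for illustration.
predictit = {"FL-Trump": 0.625, "FL-Biden": 0.405, "UT-Trump": 0.955}
fivethirtyeight = {"FL-Trump": 0.309, "FL-Biden": 0.691, "UT-Trump": 0.9544}

# Absolute gap between the two platforms' probabilities for each contest.
margins = {k: abs(predictit[k] - fivethirtyeight[k]) for k in predictit}

widest = max(margins, key=margins.get)    # largest disagreement
closest = min(margins, key=margins.get)   # near-identical probabilities
print(widest, closest)                    # FL-Trump UT-Trump
```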

In the end, the similar predictions aren’t surprising: PredictIt gamblers were operating on much the same information as FiveThirtyEight in the 2020 presidential election, and they had access to the FiveThirtyEight forecast itself.  This raises a question: if gamblers and FiveThirtyEight both make decisions based on the same polls, are gamblers interpreting those polls just as efficiently as the FiveThirtyEight model?

However, PredictIt gamblers shouldn’t feel too much pride in besting (in state-level outcomes) Nate Silver and FiveThirtyEight’s 2020 presidential election forecast; we’ve already seen how PredictIt gamblers are inferior to their British counterparts.