
Hot Hand

The idea that a player's scoring success follows a pattern rather than being random

Originated in basketball

Reason

Previous success can change the player's psychological attitude, and hence their subsequent success rate

Research

There is research that supports the Hot Hand.

However, there is also research suggesting that it is just a ‘fallacy’:

  • People have a tendency to find patterns within randomness
  • Misconception that the law of large numbers applies to small samples

There could also be a tactical explanation: if a player scores, the opponent defends them better on the next play, which would offset any hot-hand effect.

Thahir’s Opinion

I believe that the Hot Hand idea is true. As an athlete myself, I find that my confidence goes up when I make shots, and I end up scoring more subsequently, unless I get over-confident.

This is like Classical Economics (people are rational) vs. Behavioral Economics (people are not always rational). Technically, player performance should be random, but since we are not mathematical machines, but humans with our own complexities, the Hot Hand can hold.

Conditional Probability

\[ P(A|B) = \frac{ P(A \cap B) }{ P(B) } \]
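As a minimal sketch of this formula in code (the shot sequence below is made up purely for illustration), the conditional probability of a hit given that the previous shot was a hit can be estimated directly from counts:

# Hypothetical shot sequence: 1 = hit, 0 = miss
shots = [1, 1, 0, 1, 1, 1, 0, 0, 1, 0]

# Pairs of (previous shot, current shot)
pairs = list(zip(shots[:-1], shots[1:]))

# P(B): previous shot was a hit
p_b = sum(prev == 1 for prev, _ in pairs) / len(pairs)

# P(A and B): previous shot was a hit AND the current shot is a hit
p_ab = sum(prev == 1 and cur == 1 for prev, cur in pairs) / len(pairs)

# P(A|B) = P(A and B) / P(B)
print(p_ab / p_b)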

Independence

If \(A\) and \(B\) are independent

  1. \(P(A \cap B) = P(A) \cdot P(B)\)
  2. \(P(A|B) = P(A)\)
  3. \(P(B|A) = P(B)\)

That is, the occurrence of A does not affect the occurrence of B, and vice versa.
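As a quick simulated check of independence (made-up Bernoulli shots with a fixed 45% hit probability), the conditional hit rate given a previous hit should come out close to the overall hit rate:

import numpy as np

rng = np.random.default_rng(0)
# 100,000 independent shots, each a hit with probability 0.45
shots = rng.random(100_000) < 0.45

prev, cur = shots[:-1], shots[1:]

p_a = cur.mean()                # P(A): current shot is a hit
p_a_given_b = cur[prev].mean()  # P(A|B): hit given the previous shot was a hit

print(p_a, p_a_given_b)         # roughly equal, since the shots are independent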

T Test

We can do a t-test to determine whether two distributions differ.

import scipy.stats as sp

# Independent two-sample t-test comparing the two sets of player-level values
sp.ttest_ind(
    Player_Stats["conditional_prob"],
    Player_Stats["average_hit"],
)

The null hypothesis is that the two samples have equal means. So, if the p-value < 0.05, we can say the two distributions are different.
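For reference, scipy returns the test statistic and p-value on the result object (Player_Stats here is just the placeholder DataFrame from these notes):

result = sp.ttest_ind(
    Player_Stats["conditional_prob"],
    Player_Stats["average_hit"],
)

# Reject the null hypothesis of equal means at the 5% level
if result.pvalue < 0.05:
    print("The two distributions differ:", result.statistic, result.pvalue)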

Autocorrelation (Serial Correlation)

Linear relationship between adjacent values of the same variable.

\[ y_t = a + b y_{t-1} \]

eg: Relationship between performance this year and performance last year.
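A minimal sketch of estimating \(a\) and \(b\) from data (hypothetical yearly performance values, statsmodels formula API):

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical yearly performance series
perf = pd.DataFrame({"y": [0.45, 0.48, 0.50, 0.47, 0.52, 0.49, 0.53, 0.51]})
perf["y_lag"] = perf["y"].shift(1)  # y_{t-1}

# Fit y_t = a + b * y_{t-1} by ordinary least squares
model = smf.ols("y ~ y_lag", data=perf.dropna()).fit()
print(model.params)  # intercept a and slope b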

Autocorrelation Coefficient

\[ r_a = \frac{ \text{cov}(x_t, x_{t-1}) }{ \sigma_{x_t} \, \sigma_{x_{t-1}} } \]

\(-1 \le r_a \le 1\)

\(r_a\) is independent of unit of measurement

|                        | \(r_a > 0\)   | \(r_a < 0\)   |
|------------------------|---------------|---------------|
| Above average follows  | Above average | Below average |
| Below average follows  | Below average | Above average |
| Nature of graph        | Smooth        | Zigzag        |
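One way to compute the lag-1 autocorrelation coefficient in practice (toy series, purely illustrative) is pandas' built-in helper, which matches the covariance formula above:

import pandas as pd

x = pd.Series([0.45, 0.48, 0.50, 0.47, 0.52, 0.49, 0.53, 0.51])

# pandas computes the lag-1 (Pearson) autocorrelation directly
r_a = x.autocorr(lag=1)

# same thing via cov(x_t, x_{t-1}) / (sigma_{x_t} * sigma_{x_{t-1}})
x_t, x_lag = x[1:], x.shift(1)[1:]
r_manual = x_t.cov(x_lag) / (x_t.std() * x_lag.std())

print(r_a, r_manual)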

WLS Regression

Weighted Least Squares

Weights each observation in proportion to the reciprocal of its error variance. This helps overcome the problem of non-constant variance (heteroscedasticity).

For example, some players may have taken 1000 shots while others took only 10 shots.

If we are weighting by the number of shots per game, then the weight \(= \frac{1}{\text{Shots per Game}}\)

import statsmodels.formula.api as smf

# WLS regression, weighting each observation by 1 / shots_per_game
reg = smf.wls(
    formula="error ~ lagerror + player_position",
    weights=1 / ShotLog["shots_per_game"],
    data=ShotLog,
).fit()
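Since the call above ends with .fit(), reg is a fitted results object; reg.summary() then shows the estimated coefficients and their standard errors.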

Notebooks