EZ Statistics

Law of Large Numbers Simulation

Simulations

Coin Flip Simulation

Parameters

Expected Probability: 50.00%

Results

Understanding the Results

  • The blue line shows the probability of getting heads as trials increase
  • The red dashed line shows the theoretical probability (50%)
  • Notice how the experimental probability converges to 50% over time
  • This demonstrates the Law of Large Numbers in action
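The coin-flip simulation can be sketched in a few lines of Python. This is a minimal illustration, not the site's actual implementation; the function name, the fixed seed, and the use of `random.Random` are assumptions made for the example:

```python
import random

def coin_flip_probabilities(n_trials, seed=0):
    """Return the running proportion of heads after each flip (hypothetical helper)."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    heads = 0
    proportions = []
    for trial in range(1, n_trials + 1):
        if rng.random() < 0.5:  # fair coin: heads with probability 0.5
            heads += 1
        proportions.append(heads / trial)
    return proportions

probs = coin_flip_probabilities(10_000)
# Early estimates swing widely; later estimates settle near 0.5.
print(probs[9], probs[-1])
```

Plotting `proportions` against the trial number reproduces the blue line described above; the red dashed line is simply the constant 0.5.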

Dice Roll Simulation

Parameters

Expected Average: 3.50

Results

Understanding the Results

  • The blue line shows the average dice roll value over time
  • The red dashed line shows the expected average (3.5)
  • Notice how the experimental average converges to 3.5 as trials increase
  • This demonstrates how sample means converge to the true population mean
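A sketch of the dice simulation in Python, under the same caveats as before (the function name and seed are assumptions, not the site's code):

```python
import random

def dice_running_average(n_trials, seed=0):
    """Running mean of fair six-sided die rolls (hypothetical helper)."""
    rng = random.Random(seed)
    total = 0
    averages = []
    for trial in range(1, n_trials + 1):
        total += rng.randint(1, 6)  # each face 1..6 is equally likely
        averages.append(total / trial)
    return averages

avgs = dice_running_average(20_000)
print(avgs[-1])  # settles close to the expected value (1+2+3+4+5+6)/6 = 3.5
```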

Card Draw Simulation

Parameters

Expected Probability: 7.69%

Results

Understanding the Results

  • The blue line shows the probability of drawing an ace over time
  • The red dashed line shows the theoretical probability (4/52 ≈ 7.69%)
  • Notice how the experimental probability converges to 7.69% as trials increase
  • This demonstrates the Law of Large Numbers for a less intuitive probability
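The card-draw simulation can be sketched the same way. This version assumes each trial draws one card from a full 52-card deck with replacement (so trials stay independent); whether the site replaces the card is not stated, and the names here are illustrative:

```python
import random

RANKS = [str(r) for r in range(2, 11)] + ["J", "Q", "K", "A"]
SUITS = ["hearts", "diamonds", "clubs", "spades"]
DECK = [(rank, suit) for rank in RANKS for suit in SUITS]  # 52 cards, 4 aces

def ace_draw_probabilities(n_trials, seed=0):
    """Running proportion of aces drawn, sampling with replacement (hypothetical helper)."""
    rng = random.Random(seed)
    aces = 0
    proportions = []
    for trial in range(1, n_trials + 1):
        rank, _suit = rng.choice(DECK)
        if rank == "A":
            aces += 1
        proportions.append(aces / trial)
    return proportions

probs = ace_draw_probabilities(50_000)
print(probs[-1])  # settles near 4/52 ≈ 0.0769
```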

Learn More

The Law of Large Numbers: Definition, Formula, and Applications

What is the Law of Large Numbers?

The Law of Large Numbers is a fundamental principle in probability theory and statistics that describes how the average of results obtained from repeating an experiment many times will converge to the expected value. Think of it as nature's way of revealing its true probabilities through repeated observations.

"As the number of trials increases, the experimental probability approaches the theoretical probability."

Mathematical Foundation

Consider a sequence of independent and identically distributed trials X_1, X_2, ..., X_n with expected value μ. The sample average is:

\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i

As the number of trials (n) approaches infinity:

P(|\bar{X}_n - \mu| < \epsilon) \to 1

This means that, for any small tolerance ε, the probability that the sample average lies within ε of the true mean approaches one as the sample size grows; equivalently, the probability of a deviation larger than ε shrinks to zero.
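When the trials also have a finite variance σ², this convergence follows from Chebyshev's inequality, a standard argument not spelled out on this page:

```latex
\operatorname{Var}(\bar{X}_n) = \frac{\sigma^2}{n}
\quad\Longrightarrow\quad
P\bigl(|\bar{X}_n - \mu| \ge \epsilon\bigr)
\;\le\; \frac{\operatorname{Var}(\bar{X}_n)}{\epsilon^2}
\;=\; \frac{\sigma^2}{n\,\epsilon^2}
\;\xrightarrow{\,n \to \infty\,}\; 0.
```

Since the deviation probability is squeezed to zero, its complement P(|\bar{X}_n - \mu| < ε) tends to one, which is exactly the statement above.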

Key Principles

Convergence

The average results of repeated random trials will stabilize around the expected value as the number of trials increases.

Independence

Each trial must be independent - the outcome of one trial cannot influence the outcomes of other trials.

Sample Size Effect

Larger sample sizes lead to more reliable and precise estimates of the true probability.

Variability

While individual results may vary significantly, the average becomes more stable with more trials.
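This shrinking variability can be checked numerically: repeat a coin-flip experiment many times at two sample sizes and compare the spread of the resulting sample means. The function name and the choice of 500 repetitions are assumptions made for this sketch:

```python
import random
import statistics

def sample_mean_spread(n_per_sample, n_samples=500, seed=0):
    """Std. dev. of the sample mean across many repeated coin-flip experiments."""
    rng = random.Random(seed)
    means = [
        sum(rng.random() < 0.5 for _ in range(n_per_sample)) / n_per_sample
        for _ in range(n_samples)
    ]
    return statistics.stdev(means)

# The spread shrinks roughly like 1/sqrt(n): quadrupling n about halves it.
print(sample_mean_spread(100), sample_mean_spread(400))
```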

Real-World Applications

Business & Finance
  • Insurance premium calculations
  • Investment risk assessment
  • Quality control in manufacturing
Research & Science
  • Clinical trial design
  • Experimental validation
  • Population studies
Daily Life
  • Weather predictions
  • Political polling
  • Sports analytics

Understanding Our Simulations

The three interactive simulations above demonstrate different aspects of the Law of Large Numbers:

Coin Flip Simulation

Shows convergence to a simple theoretical probability of 0.5 (50%) for heads.

Dice Roll Simulation

Demonstrates convergence to the expected value of 3.5 for a fair six-sided die.

Card Draw Simulation

Illustrates convergence for more complex probability scenarios involving card draws.