What happens to the sample standard deviation when the sample size is increased?

Jan 22, 2017

It depends on the actual data added to the sample, but in general, the sample S.D. will approach the population S.D. as the sample size grows.

Explanation:

The formula for sample standard deviation is

#s=sqrt((sum_(i=1)^n (x_i-bar x)^2)/(n-1))#

while the formula for the population standard deviation is

#sigma=sqrt((sum_(i=1)^N(x_i-mu)^2)/N)#

where

  • #n# is the sample size,
  • #N# is the population size,
  • #bar x# is the sample mean, and
  • #mu# is the population mean.
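
As a quick illustration of the two formulas, here is a minimal Python sketch (not part of the original answer; the data values are made up) that computes each one directly from its definition. Note the only differences: the sample version uses #bar x# and divides by #n-1#, while the population version uses #mu# and divides by #N#.

```python
import math

def sample_sd(xs):
    """Sample standard deviation: divide by n - 1 (Bessel's correction)."""
    n = len(xs)
    xbar = sum(xs) / n
    return math.sqrt(sum((x - xbar) ** 2 for x in xs) / (n - 1))

def population_sd(xs):
    """Population standard deviation: divide by N."""
    N = len(xs)
    mu = sum(xs) / N
    return math.sqrt(sum((x - mu) ** 2 for x in xs) / N)

population = [2, 4, 4, 4, 5, 5, 7, 9]   # a made-up "population"
sample = population[:4]                  # a small sample taken from it

print(population_sd(population))  # sigma for the whole population
print(sample_sd(sample))          # s for the small sample
```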

As #n# increases towards #N#, the sample mean #bar x# approaches the population mean #mu#, and the difference between dividing by #n-1# and dividing by #N# becomes negligible, so the formula for #s# gets closer and closer to the formula for #sigma#.

Thus, as #n->N, s -> sigma#.
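
To see this convergence numerically, here is a hedged Python sketch, assuming a made-up Gaussian "population" of 10,000 values: it computes #s# for progressively larger samples and compares each against #sigma#. In the standard library, `statistics.stdev` uses the #n-1# denominator and `statistics.pstdev` the #N# denominator, matching the two formulas above.

```python
import random
import statistics

random.seed(1)

# A made-up finite "population" of N values.
N = 10_000
population = [random.gauss(50, 10) for _ in range(N)]
sigma = statistics.pstdev(population)   # population S.D. (divides by N)

# Put the whole population in random order, then treat the first n values
# as a growing sample and watch s approach sigma as n increases.
shuffled = random.sample(population, N)
for n in (10, 100, 1_000, 10_000):
    s = statistics.stdev(shuffled[:n])  # sample S.D. (divides by n - 1)
    print(f"n = {n:>6}:  s = {s:6.3f}   (sigma = {sigma:.3f})")
```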

Note:

When #n# is small compared to #N#, the sample mean #bar x# may behave very erratically, darting around #mu# like an archer's aim at a target very far away. Adding a single new data point is like a single step forward for the archer—his aim should technically be better, but he could still be off by a wide margin. Thus, incrementing #n# by 1 may shift #bar x# enough that #s# may actually get further away from #sigma#.

It is only over time, as the archer keeps stepping forward—and as we continue adding data points to our sample—that our aim gets better, and the accuracy of #barx# increases, to the point where #s# should stabilize very close to #sigma#.
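
As a rough check of this note, the following sketch (again using a made-up random population) counts how often adding a single data point to a small sample actually moves #s# further away from #sigma#. It is only an illustration of the idea, not a rigorous result.

```python
import random
import statistics

random.seed(2)

population = [random.gauss(50, 10) for _ in range(10_000)]
sigma = statistics.pstdev(population)

worse = 0
trials = 1_000
for _ in range(trials):
    xs = random.sample(population, 6)        # a small sample: n = 5 plus one extra point
    before = abs(statistics.stdev(xs[:5]) - sigma)
    after = abs(statistics.stdev(xs) - sigma)
    if after > before:                       # the extra point pushed s further from sigma
        worse += 1

print(f"s moved further from sigma in {worse} of {trials} trials")
```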