31 March, 02:17

A sample of 100 independent random numbers is taken from this distribution, and its average is used to estimate the mean of the distribution. What is the standard error of this estimate?

Answers (1)
  1. 31 March, 02:29
    The standard error in estimating the mean = (0.1 * standard deviation of the distribution)

    Step-by-step explanation:

The standard error of the mean of a sample, σₓ, is related to the standard deviation of the distribution, σ, and the sample size, n, by

    σₓ = σ / (√n)

    n = sample size = 100

    σₓ = σ / (√100)

    σₓ = (σ/10) = 0.1σ

    Hence, the standard error in estimating the mean = (0.1 * standard deviation of the distribution)
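
    As a quick sanity check, here is a minimal Python sketch. The question does not specify the distribution, so a normal distribution with σ = 2 is assumed purely for illustration; it compares σ/√n with the observed spread of sample means over many repeated trials.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    sigma = 2.0   # assumed standard deviation (the question leaves it unspecified)
    n = 100       # sample size from the question

    # Theoretical standard error of the sample mean: sigma / sqrt(n) = 0.1 * sigma
    theoretical_se = sigma / np.sqrt(n)

    # Empirical check: repeat the experiment many times and measure
    # the spread of the resulting sample means.
    num_trials = 10_000
    sample_means = rng.normal(loc=5.0, scale=sigma, size=(num_trials, n)).mean(axis=1)
    empirical_se = sample_means.std(ddof=1)

    print(f"theoretical SE: {theoretical_se:.4f}")  # 0.2000
    print(f"empirical SE:   {empirical_se:.4f}")    # ≈ 0.20
    ```

    With σ = 2 and n = 100 the theoretical value is 0.1σ = 0.2, and the empirical spread of the sample means comes out very close to it, consistent with the result above.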