8 August, 04:28

IQ scores based on the Stanford-Binet IQ test are normally distributed with a mean of 100 and standard deviation 15. If you were to obtain 100 different simple random samples of size 20 from the population of all adult humans and determine 95% confidence intervals for each of them, how many of the intervals would you expect to include 100? One would expect 95 of the 100 intervals to include the mean 100.

Answers (1)
  1. 8 August, 04:50
    95 intervals

    Step-by-step explanation:

    Given that

    population mean = 100

    standard deviation = 15

    the expected number of intervals containing the mean at a 95% confidence level is found as follows.

    By definition, a 95% confidence level means that, under repeated sampling, about 95% of the intervals constructed will contain the true population mean. For instance, if we constructed 99% confidence intervals, we would expect 99 out of 100 of them to contain the population mean.

    Similarly, for 95% confidence intervals, we expect 95 out of 100 intervals to contain the population mean of 100. Therefore, the answer is 95.
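    The repeated-sampling idea above can be checked with a quick simulation: draw 100 samples of size 20 from a normal population with mean 100 and standard deviation 15, build a 95% z-interval from each (using the known σ = 15, a simplifying assumption), and count how many intervals contain 100. This is a hedged illustration, not part of the original answer; the variable names are mine.

    ```python
    import math
    import random

    random.seed(0)  # fixed seed so the run is reproducible

    MU, SIGMA = 100, 15   # population parameters from the problem
    N, TRIALS = 20, 100   # sample size and number of intervals
    Z = 1.96              # critical z-value for 95% confidence

    covered = 0
    for _ in range(TRIALS):
        sample = [random.gauss(MU, SIGMA) for _ in range(N)]
        xbar = sum(sample) / N
        # Known-sigma z-interval: xbar +/- z * sigma / sqrt(n)
        half_width = Z * SIGMA / math.sqrt(N)
        if xbar - half_width <= MU <= xbar + half_width:
            covered += 1

    print(covered)  # typically near 95, varying a little run to run
    ```

    Any single run will not give exactly 95 (the count is binomial with p = 0.95), but it will cluster around 95, which is what "expect 95 of 100 intervals" means.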