3 July, 19:29

The least squares regression line minimizes the sum of the:

(A) Differences between actual and predicted Y values.

(B) Absolute deviations between actual and predicted Y values.

(C) Absolute deviations between actual and predicted X values.

(D) Squared differences between actual and predicted Y values.

(E) Squared differences between actual and predicted X values.

Answers (1)
  1. 3 July, 19:49
(D) Squared differences between actual and predicted Y values.

    Step-by-step explanation:

    This line is called the "least squares" regression line. It takes the form Ŷ = a + b*X, where a and b are constants, X is a specific value of the independent variable, and Ŷ is the predicted value of the dependent variable. The formula can be used to generate a predicted Y for any given value of X.

    For example,

    suppose a = 10 and b = 5. If X is 7, then the predicted value of Y is 45 (from 10 + 5*7). It turns out that for any two variables X and Y there is one equation that produces the best, or most accurate, predictions of Y given X. Any other equation would not fit as well and would predict Y with more error. That equation is called the least squares regression equation.

    It minimizes the sum of the squared differences between the actual and predicted Y values.
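
    A minimal sketch in Python (not part of the original answer; the data and numbers are made up) that fits the least squares line to sample data and checks that any other line gives a larger sum of squared differences between actual and predicted Y values:

    import numpy as np

    # Made-up sample data
    X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    Y = np.array([12.0, 21.0, 24.0, 33.0, 38.0])

    # Closed-form least squares estimates of slope b and intercept a
    b = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
    a = Y.mean() - b * X.mean()

    def sse(intercept, slope):
        """Sum of squared differences between actual and predicted Y values."""
        predicted = intercept + slope * X
        return np.sum((Y - predicted) ** 2)

    print("a =", a, "b =", b, "SSE =", sse(a, b))

    # Perturbing the fitted line in either coefficient increases the SSE,
    # illustrating that the least squares line minimizes it.
    print(sse(a + 1.0, b) > sse(a, b))    # True
    print(sse(a, b + 0.5) > sse(a, b))    # True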