Adding and subtracting yields
$$\begin{align}
\sum_{i=1}^n (y_i-\bar y)^2 &= \sum_{i=1}^n (y_i-\hat y_i+\hat y_i-\bar y)^2 \\
&= \sum_{i=1}^n (y_i-\hat y_i)^2 + 2\sum_{i=1}^n (y_i-\hat y_i)(\hat y_i-\bar y) + \sum_{i=1}^n (\hat y_i-\bar y)^2
\end{align}$$

So we need to show that $\sum_{i=1}^n (y_i-\hat y_i)(\hat y_i-\bar y)=0$. Write

$$\sum_{i=1}^n (y_i-\hat y_i)(\hat y_i-\bar y) = \sum_{i=1}^n (y_i-\hat y_i)\hat y_i - \bar y\sum_{i=1}^n (y_i-\hat y_i)$$

So we need that (a) the residuals $e_i=y_i-\hat y_i$ are orthogonal to the fitted values, $\sum_{i=1}^n (y_i-\hat y_i)\hat y_i=0$, and (b) the sum of the fitted values equals the sum of the dependent variable, $\sum_{i=1}^n y_i=\sum_{i=1}^n \hat y_i$.
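Since the whole decomposition hinges on (a) and (b), a quick numerical sanity check may make the claim concrete before proving it. The following is a minimal sketch, not part of the argument itself, using simulated data and plain numpy; the variable names (`x`, `y`, `alpha_hat`, `y_hat`, ...) are purely illustrative.

```python
import numpy as np

# Simulated data; names are illustrative only.
rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

# OLS with an intercept: regress y on [1, x]
X = np.column_stack([np.ones(n), x])
alpha_hat, beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
y_hat = alpha_hat + beta_hat * x
resid = y - y_hat

ss_tot = np.sum((y - y.mean()) ** 2)       # sum_i (y_i - ybar)^2
ss_res = np.sum(resid ** 2)                # sum_i (y_i - yhat_i)^2
ss_exp = np.sum((y_hat - y.mean()) ** 2)   # sum_i (yhat_i - ybar)^2

print(np.isclose(ss_tot, ss_res + ss_exp))     # the decomposition
print(np.isclose(np.sum(resid * y_hat), 0.0))  # (a) residuals orthogonal to fitted values
print(np.isclose(np.sum(y), np.sum(y_hat)))    # (b) sums of y_i and yhat_i agree
```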
Actually, I think (a) is easier to show in matrix notation for general multiple regression, of which the single-variable case is a special case:
$$\begin{align}
e'X\hat\beta &= (y-X\hat\beta)'X\hat\beta \\
&= \left(y-X(X'X)^{-1}X'y\right)'X\hat\beta \\
&= y'\left(X-X(X'X)^{-1}X'X\right)\hat\beta \\
&= y'(X-X)\hat\beta = 0
\end{align}$$
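For readers who prefer to see this numerically, here is a small sketch (again only an illustration with simulated data, assuming $X$ has full column rank so that $(X'X)^{-1}$ exists) that computes $\hat\beta$ explicitly and confirms $e'X=0$, and hence $e'X\hat\beta=0$:

```python
import numpy as np

# Simulated design matrix with a constant column; full column rank assumed.
rng = np.random.default_rng(1)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, -0.5, 2.0]) + rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)   # (X'X)^{-1} X'y
e = y - X @ beta_hat                           # residual vector

print(np.allclose(X.T @ e, 0.0))               # e'X = 0 (the normal equations)
print(np.isclose(e @ (X @ beta_hat), 0.0))     # hence e'X beta_hat = 0
```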
As for (b), the derivative of the OLS criterion function with respect to the constant (so you need one in the regression for this to be true!), aka the normal equation, is
$$\frac{\partial \text{SSR}}{\partial \hat\alpha} = -2\sum_i (y_i-\hat\alpha-\hat\beta x_i) = 0,$$
which can be rearranged to
$$\sum_i y_i = n\hat\alpha + \hat\beta\sum_i x_i.$$
The right-hand side of this equation is evidently also $\sum_{i=1}^n \hat y_i$, as $\hat y_i = \hat\alpha + \hat\beta x_i$.
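To illustrate the remark that the regression needs a constant for this to hold, here is a brief sketch on simulated data (illustrative only, names are not from the answer above): with an intercept the fitted values sum to $\sum_i y_i$, while a regression through the origin generally does not.

```python
import numpy as np

# Simulated data with a nonzero intercept; names are illustrative only.
rng = np.random.default_rng(2)
n = 80
x = rng.normal(loc=3.0, size=n)
y = 1.0 + 0.5 * x + rng.normal(size=n)

# With a constant: sum(y) = n*alpha_hat + beta_hat*sum(x) = sum(y_hat)
X = np.column_stack([np.ones(n), x])
alpha_hat, beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.isclose(y.sum(), n * alpha_hat + beta_hat * x.sum()))   # the normal equation
print(np.isclose(y.sum(), (alpha_hat + beta_hat * x).sum()))     # sums agree

# Without a constant, the normal equation for alpha_hat is gone and the
# fitted values generally do not sum to sum(y).
b0 = (x @ y) / (x @ x)                      # regression through the origin
print(np.isclose(y.sum(), (b0 * x).sum()))  # typically False here
```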