# What are variance statistics

## Is R-Squared Really an Invalid Metric for Nonlinear Models?

I've read that R-squared is invalid for nonlinear models because the identity SSR + SSE = SSTotal no longer holds. Can someone explain why that is?

SSR and SSE are just the squared norms of the regression and residual vectors, whose $i$-th components are $(\hat{Y}_i - \bar{Y})$ and $(Y_i - \hat{Y}_i)$, respectively. As long as these two vectors are orthogonal to one another, shouldn't the identity above always hold, regardless of the type of function used to map predictor values to fitted values?
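To make the role of orthogonality explicit, here is the standard expansion of the total sum of squares (ordinary algebra, not specific to any model):

$$
\sum_i (Y_i - \bar{Y})^2
= \sum_i \big((Y_i - \hat{Y}_i) + (\hat{Y}_i - \bar{Y})\big)^2
= \underbrace{\sum_i (Y_i - \hat{Y}_i)^2}_{SSE}
+ \underbrace{\sum_i (\hat{Y}_i - \bar{Y})^2}_{SSR}
+ 2\sum_i (Y_i - \hat{Y}_i)(\hat{Y}_i - \bar{Y})
$$

so SST = SSR + SSE holds exactly when the cross term, i.e. the inner product of the residual and regression vectors, is zero.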

Also, shouldn't the regression and residual vectors of any least-squares model be orthogonal by the very definition of least squares? The residual vector is the difference between the vector with components $(Y_i - \bar{Y})$ and the regression vector. If the regression vector is such that this difference vector is not orthogonal to it, the regression vector could be multiplied by a constant so that it becomes orthogonal to the difference vector, and that rescaling would also reduce the norm of the residual vector.
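A quick numerical check may help; this is a minimal sketch using NumPy and SciPy, with a made-up exponential model and synthetic data. For an ordinary linear fit with an intercept, the residuals are orthogonal to the whole column space of the design matrix, which includes the constant vector, so the cross term vanishes and SST = SSR + SSE. For a nonlinear least-squares fit, the residuals are only orthogonal to the tangent directions (Jacobian columns) of the model at the solution, which generally do not include the constant vector, so the decomposition fails:

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic data (illustrative only): y roughly follows a*exp(b*x) plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.1, 2.0, 50)
y = 2.0 * np.exp(0.8 * x) + rng.normal(0.0, 0.3, x.size)

def sums_of_squares(y, yhat):
    """Return (SST, SSR, SSE) for observations y and fitted values yhat."""
    ybar = y.mean()
    sst = np.sum((y - ybar) ** 2)
    ssr = np.sum((yhat - ybar) ** 2)
    sse = np.sum((y - yhat) ** 2)
    return sst, ssr, sse

# Linear fit with an intercept: residuals are orthogonal to the column
# space of [1, x], which contains the constant vector, so SST = SSR + SSE.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
sst_lin, ssr_lin, sse_lin = sums_of_squares(y, X @ beta)
print(f"linear:    SST - (SSR + SSE) = {sst_lin - (ssr_lin + sse_lin):.2e}")

# Nonlinear least squares: residuals are orthogonal only to the Jacobian
# columns of the model, which need not span the constant vector, so the
# cross term (and hence the gap between SST and SSR + SSE) is nonzero.
popt, _ = curve_fit(lambda t, a, b: a * np.exp(b * t), x, y, p0=[1.0, 1.0])
sst_nl, ssr_nl, sse_nl = sums_of_squares(y, popt[0] * np.exp(popt[1] * x))
print(f"nonlinear: SST - (SSR + SSE) = {sst_nl - (ssr_nl + sse_nl):.2e}")
```

Note that simply rescaling the nonlinear fitted vector by a constant would change which vector plays the role of "regression vector", so the quantity being decomposed would no longer be the fitted values of the original model.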

If I explained it badly, please tell me and I'll try to clear it up.