Answer:

False. See explanation below.

Explanation:

Training error by definition is the "error that you get when you run the trained model back on the training data."
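To make the definition concrete, here is a minimal sketch (with made-up data) that fits an ordinary least squares model and then evaluates it back on the same data it was trained on, which is exactly what "training error" means:

```python
import numpy as np

# Hypothetical data: a simple linear relationship with noise.
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 3.0 * x + rng.normal(scale=0.5, size=50)

# Fit ordinary least squares on the training data.
A = np.column_stack([np.ones(len(x)), x])  # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Training error: run the fitted model back on the SAME training data.
y_hat = A @ coef
training_mse = np.mean((y - y_hat) ** 2)
print(f"training MSE: {training_mse:.4f}")
```

Because the model was fitted to minimize exactly this quantity, the training error is optimistic and should not be used as an estimate of how the model performs on new data.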

False

Sometimes, if we have more predictors than necessary, we introduce bias and other problems such as multicollinearity among the independent variables. The idea is to have a parsimonious model with the ideal number of variables, not too many or too few.

For example, we can have a linear model with just one predictor that fits the response variable perfectly, and another model for the same response variable with 10 predictors that achieves the same correlation and significance.
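The one-predictor-vs-ten-predictors point above can be sketched with simulated data (all names and numbers here are illustrative): ten nearly identical predictors fit the training data essentially no better than one, while the multicollinearity shows up as an enormous condition number of the design matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(scale=0.5, size=n)

def train_r2(X, y):
    """R^2 of an OLS fit, evaluated on the training data itself."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - resid.var() / y.var()

# Parsimonious model: one predictor.
X1 = x[:, None]

# Bloated model: the same signal copied 10 times with tiny perturbations,
# i.e. highly collinear predictors.
X10 = np.column_stack([x + rng.normal(scale=0.01, size=n) for _ in range(10)])

print(f"R^2, one predictor : {train_r2(X1, y):.3f}")
print(f"R^2, ten predictors: {train_r2(X10, y):.3f}")  # barely any better

# Multicollinearity shows up as a huge condition number of X'X,
# meaning the individual coefficient estimates are unstable.
print(f"cond(X1'X1)   = {np.linalg.cond(X1.T @ X1):.1e}")
print(f"cond(X10'X10) = {np.linalg.cond(X10.T @ X10):.1e}")
```

The extra nine predictors buy almost nothing in fit but make the coefficient estimates nearly undetermined, which is the practical cost of an unparsimonious model.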

It is always important to understand the context of the problem when selecting the predictors used to estimate the response variable, so that we do not overestimate the number of parameters needed.
