Answer to Question 1
For lack of a better term, normal regression analysis is the approach that includes all of the independent variables in the regression model at one time. The regression equation is calculated so that the sum of squared errors (SSE) is minimized. However, there are other approaches for constructing regression models. Forward selection is one in which we start with no x variables in the model. The first x variable added is the one most highly correlated with the dependent variable. Each subsequent variable added is the one that does the most to explain the still-unexplained variation in the dependent variable, given the variables already in the model. The process continues until either all x variables are included or none of the remaining variables can meet the entry criterion.
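The greedy add-one-variable-at-a-time logic above can be sketched in a few lines. This is a minimal illustration, not a production implementation: the helper names (`sse`, `forward_selection`) are my own, and for simplicity the sketch adds variables until none remain rather than checking a formal entry criterion (e.g., an F-to-enter threshold) as statistical software would.

```python
import numpy as np

def sse(X, y, cols):
    # Sum of squared errors for a least-squares fit (with intercept)
    # using only the columns of X listed in cols.
    A = np.column_stack([np.ones(len(y))] + [X[:, j] for j in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

def forward_selection(X, y):
    # Greedy forward selection: at each step, add the remaining
    # variable that most reduces SSE given the variables already in.
    # (With one variable in the model, minimizing SSE is equivalent
    # to picking the x most highly correlated with y.)
    remaining = list(range(X.shape[1]))
    order = []
    while remaining:
        best = min(remaining, key=lambda j: sse(X, y, order + [j]))
        order.append(best)
        remaining.remove(best)
    return order
```

With simulated data in which, say, column 2 has the strongest relationship to y, `forward_selection` would return column 2 first, then the next most explanatory column, and so on.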
Standard stepwise regression, also called forward stepwise regression, enters variables in the same manner as the forward selection approach. The difference between the two is that with the standard stepwise approach, a variable entered on a previous step can be removed if its contribution is diminished after other independent variables are included. This approach has an entry criterion and a removal criterion that the software checks at each step to determine which variables to add and which to remove.
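The add-then-reconsider loop can be sketched as follows. This is a simplified illustration under assumed F-to-enter and F-to-remove thresholds (4.0 and 3.9 are common defaults in some packages, but the exact values and tie-breaking rules vary by software); the helper names are hypothetical, and real implementations also guard against cycling.

```python
import numpy as np

def sse(X, y, cols):
    # SSE of a least-squares fit (with intercept) on the listed columns;
    # an empty column list gives the intercept-only model.
    A = np.column_stack([np.ones(len(y))] + [X[:, j] for j in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

def stepwise(X, y, f_enter=4.0, f_remove=3.9):
    # Forward stepwise: try to enter the best remaining variable if its
    # partial F exceeds f_enter, then try to drop the weakest included
    # variable if its partial F falls below f_remove.
    n, p = X.shape
    selected, remaining = [], list(range(p))
    changed = True
    while changed:
        changed = False
        # Entry step: candidate that most reduces SSE.
        if remaining:
            cand = min(remaining, key=lambda j: sse(X, y, selected + [j]))
            full = sse(X, y, selected + [cand])
            reduced = sse(X, y, selected)
            df = n - len(selected) - 2
            if (reduced - full) / (full / df) > f_enter:
                selected.append(cand)
                remaining.remove(cand)
                changed = True
        # Removal step: included variable whose removal hurts least.
        if len(selected) > 1:
            worst = min(selected,
                        key=lambda j: sse(X, y, [k for k in selected if k != j]))
            full = sse(X, y, selected)
            reduced = sse(X, y, [k for k in selected if k != worst])
            df = n - len(selected) - 1
            if (reduced - full) / (full / df) < f_remove:
                selected.remove(worst)
                remaining.append(worst)
                changed = True
    return selected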
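placeholder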
The best subsets regression constructs the regression models for all combinations of regression models starting with all the models with one x variable, then all the possible models with two independent variables and so forth. Criteria such as highest R-square, lowest standard error of the estimate, and the Cp are used to determine which of the possible models is preferred.
Answer to Question 2
A