File: Ch13, Chapter 13: Multiple Regression Analysis
True/False
1. Regression analysis with two dependent variables and two or more independent variables is called multiple regression.
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
2. The model y = β0 + β1x1 + β2x2 + ε is a second-order regression model.
Response: See section 13.1 The Multiple Regression Model
Difficulty: Medium
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns
3. The model y = β0 + β1x1 + β2x2 + β3x3 + ε is a first-order regression model.
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns
4. In the multiple regression model y = β0 + β1x1 + β2x2 + β3x3 + ε, the coefficients of the x variables are called partial regression coefficients.
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns
5. In the model y = β0 + β1x1 + β2x2 + β3x3 + ε, y is the independent variable.
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns
6. In a multiple regression model, the partial regression coefficient of an independent variable represents the increase in the y variable when that independent variable is increased by one unit if the values of all other independent variables are held constant.
Response: See section 13.1 The Multiple Regression Model
Difficulty: Medium
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns
7. In the model y = β0 + β1x1 + β2x2 + β3x3 + ε, ε is a constant.
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
8. A slope in a multiple regression model is known as a partial slope because it ignores the effects of other explanatory variables.
Response: See section 13.1 The Multiple Regression Model
Difficulty: Hard
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
9. Multiple t-tests are used to determine whether the independent variables in the regression model are significant.
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
10. The F test is used to determine whether the overall regression model is significant.
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
11. The F value that is used to test for the overall significance of a multiple regression model is calculated by dividing the mean square regression (MSreg) by the mean square error (MSerr).
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
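For a concrete check, the F statistic in the ANOVA table used in questions 37-44 below follows directly from its two mean squares. A minimal Python sketch (values taken from that table; SciPy assumed available for the p-value):
from scipy import stats

ms_reg = 60891.48                            # mean square regression (MSreg)
ms_err = 4125.112                            # mean square error (MSerr)
f_obs = ms_reg / ms_err                      # F = MSreg / MSerr
p_value = stats.f.sf(f_obs, 2, 15)           # upper-tail p-value with df = (2, 15)
print(round(f_obs, 2), round(p_value, 6))    # about 14.76 and 0.000286, matching that table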
12. The F value that is used to test for the overall significance of a multiple regression model is calculated by dividing the sum of squares regression (SSreg) by the sum of squares error (SSerr).
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
13. The mean square error (MSerr) is calculated by dividing the sum of squares error (SSerr) by the number of observations in the data set (N).
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Medium
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
14. The mean square error (MSerr) is calculated by dividing the sum of squares error (SSerr) by the number of degrees of freedom in the error (dferr).
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
15. In a multiple regression analysis with N observations and k independent variables, the degrees of freedom for the residual error is given by (N – k – 1).
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Medium
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
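Both definitions can be illustrated with the ANOVA table used in questions 37-44 below (N = 18 observations, k = 2 independent variables). A minimal Python sketch:
N, k = 18, 2                     # observations and independent variables
ss_err = 61876.68                # sum of squares error (SSerr)
df_err = N - k - 1               # error degrees of freedom = 15
ms_err = ss_err / df_err         # mean square error = SSerr / dferr = 4125.112
print(df_err, round(ms_err, 3))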
16. In a multiple regression analysis with N observations and k independent variables, the degrees of freedom for the residual error is given by (N – k).
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Medium
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
17. If we reject H0: β1 = β2 = 0 using the F-test, then we should conclude that both slopes are different from zero.
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Hard
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
18. The standard error of the estimate of a multiple regression model is essentially the standard deviation of the residuals for the regression model.
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Easy
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
19. The standard error of the estimate of a multiple regression model is computed by taking the square root of the SSE divided by the degrees of freedom of error for the model.
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
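A minimal Python sketch of this computation, using the same SSE and error degrees of freedom as in the ANOVA table referenced above:
import math

ss_err, df_err = 61876.68, 15
se = math.sqrt(ss_err / df_err)   # standard error of the estimate = sqrt(SSE / dferr)
print(round(se, 2))               # about 64.23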
20. In a multiple regression model, the proportion of the variation of the dependent variable, y, accounted for by the independent variables in the regression model is given by the coefficient of multiple correlation.
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
21. The value of R2 always goes up when a nontrivial explanatory variable is added to a regression model.
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
22. The value of adjusted R2 always goes up when a nontrivial explanatory variable is added to a regression model.
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model
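The contrast between R2 and adjusted R2 is easiest to see in formula form. A minimal Python sketch (the illustrative numbers are again the ANOVA values from questions 37-44 below):
def r2(ss_reg, ss_tot):
    return ss_reg / ss_tot

def adj_r2(ss_err, ss_tot, n, k):
    # adjusted R2 penalizes added predictors through the error degrees of freedom
    return 1 - (ss_err / (n - k - 1)) / (ss_tot / (n - 1))

print(round(r2(121783, 183659.6), 4))                 # about 0.6631
print(round(adj_r2(61876.68, 183659.6, 18, 2), 4))    # about 0.6182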
23. Minitab and Excel output for a multiple regression model show the F test for the overall model, but do not provide the t tests for the regression coefficients.
Response: See section 13.4 Interpreting Multiple Regression Computer Output.
Difficulty: Easy
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
24. Minitab and Excel output for a multiple regression model show the t tests for the regression coefficients but do not provide a t test for the regression constant.
Response: See section 13.4 Interpreting Multiple Regression Computer Output.
Difficulty: Easy
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
Multiple Choice
25. A cost accountant is developing a regression model to predict the total cost of producing a batch of printed circuit boards as a linear function of batch size (the number of boards produced in one lot or batch), production plant (Kingsland and Yorktown), and production shift (day and evening). The response variable in this model is ______.
a) batch size
b) production shift
c) production plant
d) total cost
e) variable cost
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
26. A cost accountant is developing a regression model to predict the total cost of producing a batch of printed circuit boards as a linear function of batch size (the number of boards produced in one lot or batch), production plant (Kingsland and Yorktown), and production shift (day and evening). In this model, "shift" is ______.
a) a response variable
b) an independent variable
c) a quantitative variable
d) a dependent variable
e) a constant
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
27. A cost accountant is developing a regression model to predict the total cost of producing a batch of printed circuit boards as a linear function of batch size (the number of boards produced in one lot or batch), production plant (Kingsland and Yorktown), and production shift (day and evening). In this model, "batch size" is ______.
a) a response variable
b) an indicator variable
c) a dependent variable
d) a qualitative variable
e) an independent variable
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
28. A market analyst is developing a regression model to predict monthly household expenditures on groceries as a function of family size, household income, and household neighborhood (urban, suburban, and rural). The response variable in this model is _____.
a) family size
b) expenditures on groceries
c) household income
d) suburban
e) household neighborhood
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
29. A market analyst is developing a regression model to predict monthly household expenditures on groceries as a function of family size, household income, and household neighborhood (urban, suburban, and rural). The "neighborhood" variable in this model is ______.
a) an independent variable
b) a response variable
c) a quantitative variable
d) a dependent variable
e) a constant
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
30. A market analyst is developing a regression model to predict monthly household expenditures on groceries as a function of family size, household income, and household neighborhood (urban, suburban, and rural). The "income" variable in this model is ____.
a) an indicator variable
b) a response variable
c) a qualitative variable
d) a dependent variable
e) an independent variable
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
31. A human resources analyst is developing a regression model to predict electricity plant manager compensation as a function of production capacity of the plant, number of employees at the plant, and plant technology (coal, oil, and nuclear). The response variable in this model is ______.
a) plant manager compensation
b) plant capacity
c) number of employees
d) plant technology
e) nuclear
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
32. A human resources analyst is developing a regression model to predict electricity plant manager compensation as a function of production capacity of the plant, number of employees at the plant, and plant technology (coal, oil, and nuclear). The "plant technology" variable in this model is ______.
a) a response variable
b) a dependent variable
c) a quantitative variable
d) an independent variable
e) a constant
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
33. A human resources analyst is developing a regression model to predict electricity plant manager compensation as a function of production capacity of the plant, number of employees at the plant, and plant technology (coal, oil, and nuclear). The "number of employees at the plant" variable in this model is ______.
a) a qualitative variable
b) a dependent variable
c) a response variable
d) an indicator variable
e) an independent variable
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
34. A real estate appraiser is developing a regression model to predict the market value of single-family residential houses as a function of heated area, number of bedrooms, number of bathrooms, age of the house, and central heating (yes, no). The response variable in this model is _______.
a) heated area
b) number of bedrooms
c) market value
d) central heating
e) residential houses
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
35. A real estate appraiser is developing a regression model to predict the market value of single-family residential houses as a function of heated area, number of bedrooms, number of bathrooms, age of the house, and central heating (yes, no). The "central heating" variable in this model is _______.
a) a response variable
b) an independent variable
c) a quantitative variable
d) a dependent variable
e) a constant
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
36. The multiple regression formulas used to estimate the regression coefficients are designed to ________________.
a) minimize the total sum of squares (SST)
b) minimize the sum of squares of error (SSE)
c) maximize the standard error of the estimate
d) maximize the p-value for the calculated F value
e) minimize the mean error
Response: See section 13.1 The Multiple Regression Model
Difficulty: Medium
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
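The least-squares criterion can be demonstrated numerically. A minimal Python/NumPy sketch (the data below are simulated purely for illustration):
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.uniform(0, 10, 25)
x2 = rng.uniform(0, 10, 25)
y = 5 + 2 * x1 - 3 * x2 + rng.normal(0, 1, 25)      # simulated response

X = np.column_stack([np.ones_like(x1), x1, x2])     # design matrix with intercept
b, sse, _, _ = np.linalg.lstsq(X, y, rcond=None)    # coefficients chosen to minimize SSE

print(b)      # estimates of b0, b1, b2 (near 5, 2, -3)
print(sse)    # sum of squares of error at the least-squares solution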
37. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 616.6849 | 154.5534 | 3.990108 | 0.000947 |
x1 | -3.33833 | 2.333548 | -1.43058 | 0.170675 |
x2 | 1.780075 | 0.335605 | 5.30407 | 5.83E-05 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 121783 | 60891.48 | 14.76117 | 0.000286 |
Residual | 15 | 61876.68 | 4125.112 | ||
Total | 17 | 183659.6 |
The regression equation for this analysis is ____________.
a) ŷ = 616.6849 + 3.33833 x1 + 1.780075 x2
b) ŷ = 154.5535 - 1.43058 x1 + 5.30407 x2
c) ŷ = 616.6849 - 3.33833 x1 - 1.780075 x2
d) ŷ = 154.5535 + 2.333548 x1 + 0.335605 x2
e) ŷ = 616.6849 - 3.33833 x1 + 1.780075 x2
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
38. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 616.6849 | 154.5534 | 3.990108 | 0.000947 |
x1 | -3.33833 | 2.333548 | -1.43058 | 0.170675 |
x2 | 1.780075 | 0.335605 | 5.30407 | 5.83E-05 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 121783 | 60891.48 | 14.76117 | 0.000286 |
Residual | 15 | 61876.68 | 4125.112 | ||
Total | 17 | 183659.6 |
The sample size for this analysis is ____________.
a) 19
b) 17
c) 34
d) 15
e) 18
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
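Both quantities in questions 37 and 38 come straight off the printout: the regression equation is read from the Coefficients column, and the sample size is the total degrees of freedom plus one. A minimal Python sketch:
b0, b1, b2 = 616.6849, -3.33833, 1.780075    # Coefficients column

def y_hat(x1, x2):
    # fitted equation: y-hat = 616.6849 - 3.33833 x1 + 1.780075 x2
    return b0 + b1 * x1 + b2 * x2

n = 17 + 1                                   # total df + 1 = 18 observations
print(n, round(y_hat(50, 200), 2))           # the point (50, 200) is arbitrary, for illustration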
39. A multiple regression analysis produced the following tables.
For x1= 360 and x2 = 220, the predicted value of y is ____________.
a) 1314.70
b) 1959.71
c) 1077.58
d) 2635.19
e) 2265.57
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
40. A multiple regression analysis produced the following tables.
The regression equation for this analysis is ____________.
a) ŷ = 1959.71 + 0.46 x1 + 2.16 x2
b) ŷ = 1959.71 - 0.46 x1 + 2.16 x2
c) ŷ = 1959.71 - 0.46 x1 - 2.16 x2
d) ŷ = 1959.71 + 0.46 x1 - 2.16 x2
e) ŷ = -0.46 x1 - 2.16 x2
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
41. A multiple regression analysis produced the following tables.
The sample size for this analysis is ____________.
a) 12
b) 15
c) 17
d) 18
e) 24
Response: See section 13.1 The Multiple Regression Model
Difficulty: Easy
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
42. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 616.6849 | 154.5534 | 3.990108 | 0.000947 |
x1 | -3.33833 | 2.333548 | -1.43058 | 0.170675 |
x2 | 1.780075 | 0.335605 | 5.30407 | 5.83E-05 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 121783 | 60891.48 | 14.76117 | 0.000286 |
Residual | 15 | 61876.68 | 4125.112 | ||
Total | 17 | 183659.6 |
Using α = 0.01 to test the null hypothesis H0: β1 = β2 = 0, the critical F value is ____.
a) 8.68
b) 6.36
c) 8.40
d) 6.11
e) 3.36
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
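The critical value comes from the F distribution with (2, 15) degrees of freedom. A minimal SciPy sketch (SciPy assumed available):
from scipy import stats

alpha, df_reg, df_err = 0.01, 2, 15
f_crit = stats.f.ppf(1 - alpha, df_reg, df_err)   # upper-tail critical value
print(round(f_crit, 2))                           # about 6.36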
43. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 616.6849 | 154.5534 | 3.990108 | 0.000947 |
x1 | -3.33833 | 2.333548 | -1.43058 | 0.170675 |
x2 | 1.780075 | 0.335605 | 5.30407 | 5.83E-05 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 121783 | 60891.48 | 14.76117 | 0.000286 |
Residual | 15 | 61876.68 | 4125.112 | ||
Total | 17 | 183659.6 |
Using α = 0.05 to test the null hypothesis H0: β1 = 0, the critical t value is ____.
a) ± 1.753
b) ± 2.110
c) ± 2.131
d) ± 1.740
e) ± 2.500
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
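The two-tailed critical value comes from the t distribution with dferr = 15. A minimal SciPy sketch:
from scipy import stats

alpha, df_err = 0.05, 15
t_crit = stats.t.ppf(1 - alpha / 2, df_err)   # two-tailed, so alpha is split across both tails
print(round(t_crit, 3))                       # about 2.131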
44. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 616.6849 | 154.5534 | 3.990108 | 0.000947 |
x1 | -3.33833 | 2.333548 | -1.43058 | 0.170675 |
x2 | 1.780075 | 0.335605 | 5.30407 | 5.83E-05 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 121783 | 60891.48 | 14.76117 | 0.000286 |
Residual | 15 | 61876.68 | 4125.112 | ||
Total | 17 | 183659.6 |
These results indicate that ____________.
a) none of the predictor variables are significant at the 5% level
b) each predictor variable is significant at the 5% level
c) x1 is significant at the 5% level
d) x2 is significant at the 5% level
e) the intercept is not significant at 5% level
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Medium
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
45. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 752.0833 | 336.3158 | 2.236241 | 0.042132 |
x1 | 11.87375 | 5.32047 | 2.231711 | 0.042493 |
x2 | 1.908183 | 0.662742 | 2.879226 | 0.01213 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 203693.3 | 101846.7 | 6.745406 | 0.010884 |
Residual | 12 | 181184.1 | 15098.67 | ||
Total | 14 | 384877.4 |
Using α = 0.05 to test the null hypothesis H0: β1 = β2 = 0, the critical F value is ____.
a) 3.74
b) 3.89
c) 4.75
d) 4.60
e) 2.74
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
46. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 752.0833 | 336.3158 | 2.236241 | 0.042132 |
x1 | 11.87375 | 5.32047 | 2.231711 | 0.042493 |
x2 | 1.908183 | 0.662742 | 2.879226 | 0.01213 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 203693.3 | 101846.7 | 6.745406 | 0.010884 |
Residual | 12 | 181184.1 | 15098.67 | ||
Total | 14 | 384877.4 |
Using α = 0.10 to test the null hypothesis H0: β2 = 0, the critical t value is ____.
a) ±1.345
b) ±1.356
c) ±1.761
d) ±2.782
e) ±1.782
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
47. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 752.0833 | 336.3158 | 2.236241 | 0.042132 |
x1 | 11.87375 | 5.32047 | 2.231711 | 0.042493 |
x2 | 1.908183 | 0.662742 | 2.879226 | 0.01213 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 203693.3 | 101846.7 | 6.745406 | 0.010884 |
Residual | 12 | 181184.1 | 15098.67 | ||
Total | 14 | 384877.4 |
These results indicate that ____________.
a) none of the predictor variables are significant at the 5% level
b) each predictor variable is significant at the 5% level
c) x1 is the only predictor variable significant at the 5% level
d) x2 is the only predictor variable significant at the 5% level
e) the intercept is not significant at the 5% level
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
48. A multiple regression analysis produced the following tables.
Using α = 0.01 to test the model, these results indicate that ____________.
a) at least one of the regression variables is a significant predictor of y
b) none of the regression variables are significant predictors of y
c) y cannot be sufficiently predicted using these data
d) y is a good predictor of the regression variables in the model
e) the y intercept in this model is the best predictor variable
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
49. A multiple regression analysis produced the following tables.
Using α = 0.05 to test the null hypothesis H0: β1 = 0, the correct decision is ____.
a) fail to reject the null hypothesis
b) reject the null hypothesis
c) fail to reject the alternative hypothesis
d) reject the alternative hypothesis
e) there is not enough information provided to make a decision
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
50. A multiple regression analysis produced the following tables.
Using α = 0.05 to test the null hypothesis H0: β2 = 0, the correct decision is ____.
a) fail to reject the null hypothesis
b) reject the null hypothesis
c) fail to reject the alternative hypothesis
d) reject the alternative hypothesis
e) there is not enough information provided to make a decision
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
51. A multiple regression analysis produced the following tables.
These results indicate that ____________.
a) none of the predictor variables are significant at the 10% level
b) each predictor variable is significant at the 10% level
c) x1 is significant at the 10% level
d) x2 is significant at the 10% level
e) the intercept is not significant at 10% level
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Medium
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
52. In regression analysis, outliers may be identified by examining the ________.
a) coefficient of determination
b) coefficient of correlation
c) p-values for the partial coefficients
d) residuals
e) R-squared value
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Easy
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
53. The following ANOVA table is from a multiple regression analysis with n = 35 and four independent variables.
Source | df | SS | MS | F | p |
Regression | | 700 |
Error |
Total | | 1000 |
The number of degrees of freedom for this regression is __________.
a) 1
b) 4
c) 34
d) 30
e) 35
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Easy
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
54. The following ANOVA table is from a multiple regression analysis with n = 35 and four independent variables.
Source | df | SS | MS | F | p |
Regression | | 700 |
Error |
Total | | 1000 |
The number of degrees of freedom for error is __________.
a) 1
b) 4
c) 34
d) 30
e) 35
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Easy
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
55. The following ANOVA table is from a multiple regression analysis with n = 35 and four independent variables.
Source | df | SS | MS | F | p |
Regression | | 700 |
Error |
Total | | 1000 |
The MSR value is __________.
a) 700.00
b) 350.00
c) 233.33
d) 175.00
e) 275.00
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
56. The following ANOVA table is from a multiple regression analysis with n = 35 and four independent variables.
Source | df | SS | MS | F | p |
Regression | | 700 |
Error |
Total | | 1000 |
The MSE value is __________.
a) 8.57
b) 8.82
c) 10.00
d) 75.00
e) 20.00
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Easy
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
57. The following ANOVA table is from a multiple regression analysis with n = 35 and four independent variables.
Source | df | SS | MS | F | p |
Regression | | 700 |
Error |
Total | | 1000 |
The observed F value is __________.
a) 17.50
b) 2.33
c) 0.70
d) 0.43
e) 0.50
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
58. The following ANOVA table is from a multiple regression analysis with n = 35 and four independent variables.
Source | df | SS | MS | F | p |
Regression | | 700 |
Error |
Total | | 1000 |
The value of the standard error of the estimate se is __________.
a) 13.23
b) 3.16
c) 17.32
d) 26.46
e) 10.00
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Easy
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
59. The following ANOVA table is from a multiple regression analysis with n = 35 and four independent variables.
Source | df | SS | MS | F | p |
Regression | | 700 |
Error |
Total | | 1000 |
The R2 value is __________.
a) 0.80
b) 0.70
c) 0.66
d) 0.76
e) 0.30
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
60. The following ANOVA table is from a multiple regression analysis with n = 35 and four independent variables.
Source | df | SS | MS | F | p |
Regression | | 700 |
Error |
Total | | 1000 |
The adjusted R2 value is __________.
a) 0.80
b) 0.70
c) 0.66
d) 0.76
e) 0.30
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
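All of the quantities asked for in questions 53-60 follow from n = 35, k = 4, SSreg = 700, and SStot = 1000. A minimal Python sketch of the arithmetic:
import math

n, k = 35, 4
ss_reg, ss_tot = 700.0, 1000.0
ss_err = ss_tot - ss_reg                                # 300

df_reg, df_err = k, n - k - 1                           # 4 and 30
ms_reg = ss_reg / df_reg                                # 175.0
ms_err = ss_err / df_err                                # 10.0
f_obs = ms_reg / ms_err                                 # 17.5
se = math.sqrt(ms_err)                                  # about 3.16
r2 = ss_reg / ss_tot                                    # 0.70
adj_r2 = 1 - (ss_err / df_err) / (ss_tot / (n - 1))     # about 0.66

print(df_reg, df_err, ms_reg, ms_err, f_obs, round(se, 2), r2, round(adj_r2, 2))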
61. The following ANOVA table is from a multiple regression analysis.
Source | df | SS | MS | F | p |
Regression | 3 | 1500 | |||
Error | 26 | ||||
Total | | 2300 |
The sample size for the analysis is __________.
a) 30
b) 26
c) 3
d) 29
e) 31
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Easy
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
62. The following ANOVA table is from a multiple regression analysis.
Source | df | SS | MS | F | p |
Regression | 3 | 1500 | |||
Error | 26 | ||||
Total | | 2300 |
The number of independent variables in the analysis is __________.
a) 30
b) 26
c) 1
d) 3
e) 2
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
63. The following ANOVA table is from a multiple regression analysis.
Source | df | SS | MS | F | p |
Regression | 3 | 1500 | |||
Error | 26 | ||||
Total | | 2300 |
The MSR value is __________.
a) 1500
b) 50
c) 2300
d) 500
e) 31
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
64. The following ANOVA table is from a multiple regression analysis.
Source | df | SS | MS | F | p |
Regression | 3 | 1500 | |||
Error | 26 | ||||
Total | | 2300 |
The SSE value is __________.
a) 30
b) 1500
c) 500
d) 800
e) 2300
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Easy
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
65. The following ANOVA table is from a multiple regression analysis.
Source | df | SS | MS | F | p |
Regression | 3 | 1500 | |||
Error | 26 | ||||
Total | | 2300 |
The MSE value is closest to __________.
a) 31
b) 500
c) 16
d) 2300
e) 8.7
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
66. The following ANOVA table is from a multiple regression analysis.
Source | df | SS | MS | F | p |
Regression | 3 | 1500 | |||
Error | 26 | ||||
Total | | 2300 |
The observed F value is __________.
a) 16.25
b) 30.77
c) 500
d) 0.049
e) 0.039
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
67. The following ANOVA table is from a multiple regression analysis.
Source | df | SS | MS | F | p |
Regression | 3 | 1500 | |||
Error | 26 | ||||
Total | | 2300 |
The value of the standard error of the estimate se is __________.
a) 30.77
b) 5.55
c) 4.03
d) 3.20
e) 0.73
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
68. The following ANOVA table is from a multiple regression analysis.
Source | df | SS | MS | F | p |
Regression | 3 | 1500 | |||
Error | 26 | ||||
Total | | 2300 |
The R2 value is __________.
a) 0.65
b) 0.53
c) 0.35
d) 0.43
e) 1.37
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
69. The following ANOVA table is from a multiple regression analysis.
Source | df | SS | MS | F | p |
Regression | 3 | 1500 | |||
Error | 26 | ||||
Total | | 2300 |
The adjusted R2 value is __________.
a) 0.65
b) 0.39
c) 0.61
d) 0.53
e) 0.78
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
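As with the previous table, every quantity asked for in questions 61-69 follows from the printed degrees of freedom and sums of squares. A minimal Python sketch:
import math

df_reg, df_err = 3, 26
n = df_reg + df_err + 1                                 # sample size = 30
ss_reg, ss_tot = 1500.0, 2300.0
ss_err = ss_tot - ss_reg                                # 800

ms_reg = ss_reg / df_reg                                # 500.0
ms_err = ss_err / df_err                                # about 30.77
f_obs = ms_reg / ms_err                                 # about 16.25
se = math.sqrt(ms_err)                                  # about 5.55
r2 = ss_reg / ss_tot                                    # about 0.65
adj_r2 = 1 - (ss_err / df_err) / (ss_tot / (n - 1))     # about 0.61

print(n, ms_reg, round(ms_err, 2), round(f_obs, 2), round(se, 2), round(r2, 2), round(adj_r2, 2))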
70. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 624.5369 | 78.49712 | 7.956176 | 6.88E-06 |
x1 | 8.569122 | 1.652255 | 5.186319 | 0.000301 |
x2 | 4.736515 | 0.699194 | 6.774248 | 3.06E-05 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 1660914 | 830457.1 | 58.31956 | 1.4E-06 |
Residual | 11 | 156637.5 | 14239.77 | ||
Total | 13 | 1817552 |
These results indicate that ____________.
a) none of the predictor variables are significant at the 5% level
b) each predictor variable is significant at the 5% level
c) x1 is the only predictor variable significant at the 5% level
d) x2 is the only predictor variable significant at the 5% level
e) the intercept is not significant at 5% level
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
71. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 624.5369 | 78.49712 | 7.956176 | 6.88E-06 |
x1 | 8.569122 | 1.652255 | 5.186319 | 0.000301 |
x2 | 4.736515 | 0.699194 | 6.774248 | 3.06E-05 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 1660914 | 830457.1 | 58.31956 | 1.4E-06 |
Residual | 11 | 156637.5 | 14239.77 | ||
Total | 13 | 1817552 |
For x1= 30 and x2 = 100, the predicted value of y is ____________.
a) 753.77
b) 1,173.00
c) 1,355.26
d) 615.13
e) 6153.13
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
72. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 624.5369 | 78.49712 | 7.956176 | 6.88E-06 |
x1 | 8.569122 | 1.652255 | 5.186319 | 0.000301 |
x2 | 4.736515 | 0.699194 | 6.774248 | 3.06E-05 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 1660914 | 830457.1 | 58.31956 | 1.4E-06 |
Residual | 11 | 156637.5 | 14239.77 | ||
Total | 13 | 1817552 |
The coefficient of multiple determination is ____________.
a) 0.0592
b) 0.9138
c) 0.1149
d) 0.9559
e) 1.0000
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
73. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 624.5369 | 78.49712 | 7.956176 | 6.88E-06 |
x1 | 8.569122 | 1.652255 | 5.186319 | 0.000301 |
x2 | 4.736515 | 0.699194 | 6.774248 | 3.06E-05 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 1660914 | 830457.1 | 58.31956 | 1.4E-06 |
Residual | 11 | 156637.5 | 14239.77 | ||
Total | 13 | 1817552 |
The adjusted R2 is ____________.
a) 0.9138
b) 0.9408
c) 0.8981
d) 0.8851
e) 0.8891
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
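The prediction, the coefficient of multiple determination, and the adjusted R2 asked for in questions 70-73 can be reproduced directly from the output above. A minimal Python sketch:
b0, b1, b2 = 624.5369, 8.569122, 4.736515
ss_reg, ss_err, ss_tot = 1660914.0, 156637.5, 1817552.0
df_err, df_tot = 11, 13

y_hat = b0 + b1 * 30 + b2 * 100                         # prediction at x1 = 30, x2 = 100
r2 = ss_reg / ss_tot                                    # coefficient of multiple determination
adj_r2 = 1 - (ss_err / df_err) / (ss_tot / df_tot)      # adjusted R2

print(round(y_hat, 2), round(r2, 4), round(adj_r2, 3))  # about 1355.26, 0.9138, 0.898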
74. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | -139.609 | 2548.989 | -0.05477 | 0.957154 |
x1 | 24.24619 | 22.25267 | 1.089586 | 0.295682 |
x2 | 32.10171 | 17.44559 | 1.840105 | 0.08869 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 302689 | 151344.5 | 1.705942 | 0.219838 |
Residual | 13 | 1153309 | 88716.07 | ||
Total | 15 | 1455998 |
The regression equation for this analysis is ____________.
a) ŷ = 302689 + 1153309 x1 + 1455998 x2
b) ŷ = -139.609 + 24.24619 x1 + 32.10171 x2
c) ŷ = 2548.989 + 22.25267 x1 + 17.44559 x2
d) ŷ = -0.05477 + 1.089586 x1 + 1.840105 x2
e) ŷ = 0.05477 + 1.089586 x1 + 1.840105 x2
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Easy
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
75. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | -139.609 | 2548.989 | -0.05477 | 0.957154 |
x1 | 24.24619 | 22.25267 | 1.089586 | 0.295682 |
x2 | 32.10171 | 17.44559 | 1.840105 | 0.08869 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 302689 | 151344.5 | 1.705942 | 0.219838 |
Residual | 13 | 1153309 | 88716.07 | ||
Total | 15 | 1455998 |
The sample size for this analysis is ____________.
a) 17
b) 13
c) 16
d) 11
e) 15
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Easy
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
76. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | -139.609 | 2548.989 | -0.05477 | 0.957154 |
x1 | 24.24619 | 22.25267 | 1.089586 | 0.295682 |
x2 | 32.10171 | 17.44559 | 1.840105 | 0.08869 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 302689 | 151344.5 | 1.705942 | 0.219838 |
Residual | 13 | 1153309 | 88716.07 | ||
Total | 15 | 1455998 |
Using α = 0.01 to test the null hypothesis H0: β1 = β2 = 0, the critical F value is ____.
a) 5.99
b) 5.70
c) 1.96
d) 4.84
e) 6.70
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
77. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | -139.609 | 2548.989 | -0.05477 | 0.957154 |
x1 | 24.24619 | 22.25267 | 1.089586 | 0.295682 |
x2 | 32.10171 | 17.44559 | 1.840105 | 0.08869 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 302689 | 151344.5 | 1.705942 | 0.219838 |
Residual | 13 | 1153309 | 88716.07 | ||
Total | 15 | 1455998 |
Using α = 0.01 to test the null hypothesis H0: β2 = 0, the critical t value is ____.
a) ± 1.174
b) ± 2.093
c) ± 2.131
d) ± 4.012
e) ± 3.012
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
78. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | -139.609 | 2548.989 | -0.05477 | 0.957154 |
x1 | 24.24619 | 22.25267 | 1.089586 | 0.295682 |
x2 | 32.10171 | 17.44559 | 1.840105 | 0.08869 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 302689 | 151344.5 | 1.705942 | 0.219838 |
Residual | 13 | 1153309 | 88716.07 | ||
Total | 15 | 1455998 |
These results indicate that ____________.
a) none of the predictor variables are significant at the 5% level
b) each predictor variable is significant at the 5% level
c) x1 is the only predictor variable significant at the 5% level
d) x2 is the only predictor variable significant at the 5% level
e) all variables are significant at 5% level
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
79. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | -139.609 | 2548.989 | -0.05477 | 0.957154 |
x1 | 24.24619 | 22.25267 | 1.089586 | 0.295682 |
x2 | 32.10171 | 17.44559 | 1.840105 | 0.08869 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 302689 | 151344.5 | 1.705942 | 0.219838 |
Residual | 13 | 1153309 | 88716.07 | ||
Total | 15 | 1455998 |
For x1= 40 and x2 = 90, the predicted value of y is ____________.
a) 753.77
b) 1,173.00
c) 1,355.26
d) 3,719.39
e) 1,565.75
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
80. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | -139.609 | 2548.989 | -0.05477 | 0.957154 |
x1 | 24.24619 | 22.25267 | 1.089586 | 0.295682 |
x2 | 32.10171 | 17.44559 | 1.840105 | 0.08869 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 302689 | 151344.5 | 1.705942 | 0.219838 |
Residual | 13 | 1153309 | 88716.07 | ||
Total | 15 | 1455998 |
The coefficient of multiple determination is ____________.
a) 0.2079
b) 0.0860
c) 0.5440
d) 0.7921
e) 0.5000
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
81. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | -139.609 | 2548.989 | -0.05477 | 0.957154 |
x1 | 24.24619 | 22.25267 | 1.089586 | 0.295682 |
x2 | 32.10171 | 17.44559 | 1.840105 | 0.08869 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 302689 | 151344.5 | 1.705942 | 0.219838 |
Residual | 13 | 1153309 | 88716.07 | ||
Total | 15 | 1455998 |
The adjusted R2 is ____________.
a) 0.2079
b) 0.0860
c) 0.5440
d) 0.7921
e) 1.0000
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.
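The same kind of check works for questions 74-81; a minimal Python sketch using the coefficients and sums of squares printed above:
b0, b1, b2 = -139.609, 24.24619, 32.10171
ss_reg, ss_err, ss_tot = 302689.0, 1153309.0, 1455998.0
df_err, df_tot = 13, 15

n = df_tot + 1                                          # sample size = 16
y_hat = b0 + b1 * 40 + b2 * 90                          # prediction at x1 = 40, x2 = 90
r2 = ss_reg / ss_tot                                    # coefficient of multiple determination
adj_r2 = 1 - (ss_err / df_err) / (ss_tot / df_tot)      # adjusted R2

print(n, round(y_hat, 2), round(r2, 4), round(adj_r2, 3))   # 16, about 3719.39, 0.2079, 0.086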
82. A multiple regression analysis produced the following output from Minitab.
Regression Analysis: Y versus x1 and x2
Predictor Coef SE Coef T P
Constant -0.0626 0.2034 -0.31 0.762
x1 1.1003 0.5441 2.02 0.058
x2 -0.8960 0.5548 -1.61 0.124
S = 0.179449 R-Sq = 89.0% R-Sq(adj) = 87.8%
Analysis of Variance
Source DF SS MS F P
Regression 2 4.7013 2.3506 73.00 0.000
Residual Error 18 0.5796 0.0322
Total 20 5.2809
These results indicate that ____________.
a) none of the predictor variables are significant at the 5% level
b) each predictor variable is significant at the 5% level
c) x1 is the only predictor variable significant at the 5% level
d) x2 is the only predictor variable significant at the 5% level
e) at least one of the variables is significant at 5% level
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Hard
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs
83. A multiple regression analysis produced the following output from Minitab.
Regression Analysis: Y versus x1 and x2
Predictor Coef SE Coef T P
Constant -0.0626 0.2034 -0.31 0.762
x1 1.1003 0.5441 2.02 0.058
x2 -0.8960 0.5548 -1.61 0.124
S = 0.179449 R-Sq = 89.0% R-Sq(adj) = 87.8%
Analysis of Variance
Source DF SS MS F P
Regression 2 4.7013 2.3506 73.00 0.000
Residual Error 18 0.5796 0.0322
Total 20 5.2809
The overall proportion of variation of y accounted for by x1 and x2 is _______.
a) 0.179
b) 0.89
c) 0.878
d) 0.203
e) 0.5441
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs
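Minitab reports R-Sq and R-Sq(adj) directly, but both can be recovered from the Analysis of Variance block. A minimal Python sketch:
ss_reg, ss_err, ss_tot = 4.7013, 0.5796, 5.2809
df_err, df_tot = 18, 20

r2 = ss_reg / ss_tot                                    # R-Sq = 89.0%
adj_r2 = 1 - (ss_err / df_err) / (ss_tot / df_tot)      # R-Sq(adj) = 87.8%
print(round(r2, 3), round(adj_r2, 3))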
84. A multiple regression analysis produced the following output from Excel.
The overall proportion of variation of y accounted for by x1 and x2 is _______.
a) 0.9787
b) 0.9579
c) 0.9523
d) 67.671
e) 0.0489
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs
85. A multiple regression analysis produced the following output from Excel.
The coefficient of multiple determination is ____________.
a) 0.9787
b) 0.9579
c) 0.9523
d) 67.671
e) 0.0489
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs
86. A multiple regression analysis produced the following output from Excel.
The correlation coefficient is ____________.
a) 0.9787
b) 0.9579
c) 0.9523
d) 67.671
e) 0.0489
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs
87. In the regression equation ŷ = 1959.71 − 0.46 x1 + 2.16 x2, suppose that you are considering the point (x1, x2) = (1.5, 0.5), and furthermore, suppose that the variable x1 increases by a factor of 2 (i.e., it doubles). What must be the change in the variable x2 so that y remains unchanged?
a) 2.16
b) −2.16
c) 0.319
d) −0.319
e) 0.638
Ans.: c
Response: See section 13.1 The Multiple Regression Model
Difficulty: Medium
AACSB: Reflective thinking
Bloom’s level: Application
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
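A minimal Python sketch of the reasoning behind this item: the change in y caused by doubling x1 must be offset exactly by the change contributed through x2:
b1, b2 = -0.46, 2.16
x1 = 1.5

dx1 = 2 * x1 - x1                 # doubling x1 raises it by 1.5
dy_from_x1 = b1 * dx1             # effect on y: -0.69
dx2 = -dy_from_x1 / b2            # offset needed from x2 so that y is unchanged
print(round(dx2, 3))              # about 0.319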
88. Suppose that the regression equation ŷ = 16.99 + 0.32 x1 + 0.41 x2 + 5.31 x3 predicts an adult’s height (y) given the individual’s mother’s height (x1), his or her father’s height (x2), and whether the individual is male (x3 = 1) or female (x3 = 0). All heights are measured in inches. In this equation, the coefficient of ______ means that ______.
a) x2; if two individuals have fathers whose heights differ by 1 inch, then the individuals’ heights will differ by 0.41 inches.
b) x2; if two individuals have mothers whose heights differ by 1 inch, then the individuals’ heights will differ by 0.41 inches.
c) x3; a brother is expected to be 5.31 inches taller than his sister
d) x1; if two individuals have mothers whose heights differ by 0.32 inch, then the individuals’ heights will differ by 1 inch.
e) x1; if two individuals have mothers whose heights differ by 0.5 inch, then the individuals’ heights will differ by 0.32 inch.
Ans.: c
Response: See section 13.1 The Multiple Regression Model
Difficulty: Hard
AACSB: Reflective thinking
Bloom’s level: Application
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
89. Suppose that the regression equation ŷ = 16.99 + 0.32 x1 + 0.41 x2 + 5.31 x3 predicts an adult’s height (y) given the individual’s mother’s height (x1), his or her father’s height (x2), and whether the individual is male (x3 = 1) or female (x3 = 0). All heights are measured in inches. Assume also that this equation is stable through time, that the average adult female height is currently 63.8 inches, and that the average adult male height is 69.7 inches. Approximately what would be the average female height in two generations? You can assume that each individual has parents of average height.
a) There is not enough information to determine the average female height in two generations.
b) 66 inches.
c) 67.25 inches
d) 67.33 inches
e) 67.82 inches
Ans.: d
Response: See section 13.1 The Multiple Regression Model
Difficulty: Hard
AACSB: Reflective thinking
Bloom’s level: Application
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
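A minimal Python sketch of the two-generation calculation (each generation's average parents are the previous generation's average female and male heights):
def daughter(mother, father):
    return 16.99 + 0.32 * mother + 0.41 * father         # x3 = 0 for females

def son(mother, father):
    return daughter(mother, father) + 5.31               # x3 = 1 adds 5.31 inches

f0, m0 = 63.8, 69.7                                      # current averages
f1, m1 = daughter(f0, m0), son(f0, m0)                   # next generation's averages
f2 = daughter(f1, m1)                                    # two generations out
print(round(f2, 2))                                      # about 67.33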
90. Suppose that the regression equation ŷ = c1 + 0.32 x1 + 0.41 x2 + 5.31 x3 predicts an adult’s height (y) given the individual’s mother’s height (x1), his or her father’s height (x2), and whether the individual is male (x3 = 1) or female (x3 = 0). All heights are measured in inches. Assume also that this equation is stable through time, that the average adult female height is currently 63.8 inches, and that the average adult male height is 69.7 inches. If the average female height is stable through time (daughters are on average exactly as tall as their mothers), then c1 = ______.
a) There is not enough information to determine c1.
b) 16.515
c) 15.751
d) 14.807
e) 13.155
Ans.: d
Response: See section 13.1 The Multiple Regression Model
Difficulty: Hard
AACSB: Reflective thinking
Bloom’s level: Application
Learning Objective: 13.1: Explain how, by extending the simple regression model to a multiple regression model with two independent variables, it is possible to determine the multiple regression equation for any number of unknowns.
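A minimal Python sketch of the stability condition, solving 63.8 = c1 + 0.32(63.8) + 0.41(69.7) for c1:
f, m = 63.8, 69.7                      # average female and male heights
c1 = f - 0.32 * f - 0.41 * m           # intercept that keeps daughters at 63.8 inches
print(round(c1, 3))                    # about 14.807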
91. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 752.0833 | 336.3158 | 2.236241 | 0.042132 |
x1 | 11.87375 | 5.32047 | 2.231711 | 0.042493 |
x2 | 1.908183 | 0.662742 | 2.879226 | 0.01213 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 203693.3 | 101846.7 | 6.745406 | 0.010884 |
Residual | 12 | 181184.1 | 15098.67 | ||
Total | 14 | 384877.4 |
Using α = 0.10 to test the null hypothesis H0: β1 = β2 = 0, the critical F value is ______.
a) 2.57
b) 2.81
c) 3.23
d) 3.89
e) 3.95
Response: See section 13.2 Significance Tests of the Regression Model and its Coefficients
Difficulty: Easy
AACSB: Reflective thinking
Bloom’s level: Application
Learning Objective: 13.2: Examine significance tests of both the overall regression model and the regression coefficients.
92. The following ANOVA table is from a multiple regression analysis.
Source | df | SS | MS | F | p |
Regression | 3 | 8157.7 | 4068.5 | 27.57 | 0.000 |
Error | 22 | 135.1 | |||
Total | | 11018.4 |
The adjusted R2 value is closest to __________.
a) 0.65
b) 0.67
c) 0.68
d) 0.70
e) 0.73
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
AACSB: Reflective thinking
Bloom’s level: Application
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
93. The following ANOVA table is from a multiple regression analysis.
Source | df | SS | MS | F | p |
Regression | 3 | 1728 | |||
Error | 25 | ||||
Total | | 2571 |
The R2 value is __________.
a) 0.65
b) 0.67
c) 0.69
d) 0.71
e) 0.73
Response: See section 13.3 Residuals, Standard Error of the Estimate, and R2
Difficulty: Medium
AACSB: Reflective thinking
Bloom’s level: Application
Learning Objective: 13.3: Calculate the residual, standard error of the estimate, coefficient of multiple determination, and adjusted coefficient of multiple determination of a regression model.
94. A multiple regression analysis produced the following tables.
Predictor | Coefficients | Standard Error | t Statistic | p-value |
Intercept | 512.2359 | 78.49712 | 7.956176 | 6.88E-06 |
x1 | 7.1525 | 1.652255 | 5.186319 | 0.000301 |
x2 | 2.0208 | 0.699194 | 6.774248 | 3.06E-05 |
Source | df | SS | MS | F | p-value |
Regression | 2 | 1660914 | 830457.1 | 58.31956 | 1.4E-06 |
Residual | 11 | 156637.5 | 14239.77 | ||
Total | 13 | 1817552 |
If x1= 25 and x2 = 85, then the predicted value of y is ____________.
a) 803.891
b) 807.255
c) 812.025
d) 825.517
e) 862.816
Response: See section 13.4 Interpreting Multiple Regression Computer Output
Difficulty: Medium
AACSB: Reflective thinking
Bloom’s level: Application
Learning Objective: 13.4: Use a computer to find and interpret multiple regression outputs.