Ch.14 Complete Test Bank Building Multiple Regression Models - Business Stats Contemporary Decision 10e | Test Bank by Ken Black.

Ch.14 Complete Test Bank Building Multiple Regression Models

File: Ch14, Chapter 14: Building Multiple Regression Models

True/False

1. Regression models in which the highest power of any predictor variable is 1 and in which there are no cross product terms are referred to as first-order models.

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Easy

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

2. The regression model y = β0 + β1x1 + β2x2 + β3x1x2 + ε is a first-order model.

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Easy

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

3. The regression model y = β0 + β1x1 + β2x2 + β3x3 + ε is a third-order model.

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Easy

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

4. The regression model y = β0 + β1x1 + β2x1² + ε is called a quadratic model.

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Easy

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

5. A linear regression model cannot be used to explore the possibility that a quadratic relationship may exist between two variables.

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

6. A linear regression model can be used to explore the possibility that a quadratic relationship may exist between two variables by suitably transforming the independent variable.

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.
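The idea behind questions 5 and 6 can be sketched in code: adding the transformed column x² to the design matrix lets ordinary linear least squares capture a quadratic relationship, because the model remains linear in its coefficients. A minimal illustration with made-up data (the coefficients 2, 3, and 0.5 are arbitrary):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = 2.0 + 3.0 * x + 0.5 * x ** 2   # an exact quadratic, for illustration

# design matrix: intercept column, x, and the recoded predictor x**2
X = np.column_stack([np.ones_like(x), x, x ** 2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(b, 4))  # recovers the coefficients 2, 3, 0.5
```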

7. Recoding data cannot improve the fit of a regression model.

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Easy

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

8. A logarithmic transformation may be applied to both positive and negative numbers.

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

9. If a square-transformation is applied to a series of positive numbers, all greater than 1, the numerical values of the numbers in the transformed series will be smaller than the corresponding numbers in the original series.

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

10. If the effect of an independent variable (e.g., square footage) on a dependent variable (e.g., price) is affected by different ranges of values for a second independent variable (e.g., age), the two independent variables are said to interact.

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

11. The interaction between two independent variables can be examined by including a new variable, which is the sum of the two independent variables, in the regression model.

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.
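For reference on question 11: interaction is modeled with the product x1·x2 of the two predictors, not their sum. A short sketch of building such a design matrix (the data values are illustrative):

```python
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0])

# the interaction term is the elementwise PRODUCT of the two predictors
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
print(X[:, 3])  # [ 2.  2. 12. 12.]
```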

12. Qualitative data can be incorporated into linear regression models using indicator variables.

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

13. A qualitative variable which represents categories such as geographical territories or job classifications may be included in a regression model by using indicator or dummy variables.

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

14. If a qualitative variable has c categories, then c dummy variables must be included in the regression model, one for each category.

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

15. If a qualitative variable has c categories, then only (c – 1) dummy variables must be included in the regression model.

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

16. If a data set contains k independent variables, the “all possible regressions” search procedure will determine 2^k different models.

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Medium

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.

17. If a data set contains k independent variables, the “all possible regressions” search procedure will determine 2^k – 1 different models.

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Medium

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.
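The count behind questions 16 and 17 can be verified by enumerating every nonempty subset of k predictors; with k = 4 there are 2^4 − 1 = 15 candidate models:

```python
from itertools import combinations

k = 4
predictors = [f"x{i}" for i in range(1, k + 1)]
# every nonempty subset of the predictors defines one candidate model
models = [c for r in range(1, k + 1) for c in combinations(predictors, r)]
print(len(models))  # 15, i.e. 2**k - 1
```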

18. Stepwise regression is one of the ways to prevent the problem of multicollinearity.

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Medium

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.

19. If two or more independent variables are highly correlated, the regression analysis is unlikely to suffer from the problem of multicollinearity.

Response: See section 14.4 Multicollinearity

Difficulty: Easy

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.

20. If each pair of independent variables is weakly correlated, there is no problem of multicollinearity.

Response: See section 14.4 Multicollinearity

Difficulty: Medium

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.

21. If the variance inflation factor is bigger than 10, the regression analysis might suffer from the problem of multicollinearity.

Response: See section 14.4 Multicollinearity

Difficulty: Easy

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.
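The variance inflation factor in question 21 can be computed by regressing one predictor on the others and applying VIF = 1/(1 − R²). A rough sketch with simulated nearly collinear data (the helper function and data are illustrative, not from the text):

```python
import numpy as np

def vif(X, j):
    """Variance inflation factor for column j of predictor matrix X:
    regress x_j on the remaining predictors; VIF = 1 / (1 - R^2)."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(y)), others])
    b, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ b
    r2 = 1.0 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.1, size=100)  # nearly collinear with x1
X = np.column_stack([x1, x2])
print(round(vif(X, 0), 1))  # well above the rule-of-thumb cutoff of 10
```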

22. We may use logistic regression when the dependent variable is a dummy variable, coded 0 or 1.

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Easy

Learning Objective: 14.5: Explain when to use logistic regression, and interpret its results.

23. The logistic regression model constrains the estimated probabilities to lie between 0 and 100.

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Easy

Learning Objective: 14.5: Explain when to use logistic regression, and interpret its results.

24. When structuring a logistic regression model, only one independent or predictor variable can be used.

Ans: False

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Easy

Learning Objective: 14.5: Explain when to use logistic regression, and interpret its results.

25. To test the overall effectiveness of a logistic regression, a chi-squared statistic is used.

Ans: True

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Easy

Learning Objective: 14.5: Explain when to use logistic regression, and interpret its results.

Multiple Choice

26. Multiple linear regression models can handle certain nonlinear relationships by ________.

a) biasing the sample

b) recoding or transforming variables

c) adjusting the resultant ANOVA table

d) adjusting the observed t and F values

e) performing nonlinear regression

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

27. The following scatter plot indicates that _________.

a) a log x transform may be useful

b) a y² transform may be useful

c) an x² transform may be useful

d) no transform is needed

e) a 1/x transform may be useful

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

28. The following scatter plot indicates that _________.

a) a log x transform may be useful

b) a log y transform may be useful

c) an x² transform may be useful

d) no transform is needed

e) a 1/y transform may be useful

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

29. The following scatter plot indicates that _________.

a) a log x transform may be useful

b) a log y transform may be useful

c) an x² transform may be useful

d) no transform is needed

e) a (– x) transform may be useful

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

30. The following scatter plot indicates that _________.

a) an x² transform may be useful

b) a log y transform may be useful

c) an x⁴ transform may be useful

d) no transform is needed

e) an x³ transform may be useful

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

31. A multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   1411.876       762.1533         1.852483      0.074919
x1          35.18215       96.8433          0.363289      0.719218
x1²         7.721648       3.007943         2.567086      0.016115

            df   SS         MS         F
Regression  2    58567032   29283516   57.34861
Residual    25   12765573   510622.9
Total       27   71332605

The regression equation for this analysis is ____________.

a) ŷ = 762.1533 + 96.8433 x1 + 3.007943 x1²

b) ŷ = 1411.876 + 762.1533 x1 + 1.852483 x1²

c) ŷ = 1411.876 + 35.18215 x1 + 7.721648 x1²

d) ŷ = 762.1533 + 1.852483 x1 + 0.074919 x1²

e) ŷ = 762.1533 - 1.852483 x1 + 0.074919 x1²

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Easy

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

32. A multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   1411.876       762.1533         1.852483      0.074919
x1          35.18215       96.8433          0.363289      0.719218
x1²         7.721648       3.007943         2.567086      0.016115

            df   SS         MS         F
Regression  2    58567032   29283516   57.34861
Residual    25   12765573   510622.9
Total       27   71332605

The sample size for this analysis is ____________.

a) 28

b) 25

c) 30

d) 27

e) 2

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Easy

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

33. A multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   1411.876       762.1533         1.852483      0.074919
x1          35.18215       96.8433          0.363289      0.719218
x1²         7.721648       3.007943         2.567086      0.016115

            df   SS         MS         F
Regression  2    58567032   29283516   57.34861
Residual    25   12765573   510622.9
Total       27   71332605

Using α = 0.05 to test the null hypothesis H0: β1 = β2 = 0, the critical F value is ____.

a) 4.24

b) 3.39

c) 5.57

d) 3.35

e) 2.35

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

34. A multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   1411.876       762.1533         1.852483      0.074919
x1          35.18215       96.8433          0.363289      0.719218
x1²         7.721648       3.007943         2.567086      0.016115

            df   SS         MS         F
Regression  2    58567032   29283516   57.34861
Residual    25   12765573   510622.9
Total       27   71332605

Using α = 0.10 to test the null hypothesis H0: β1 = 0, the critical t value is ____.

a) ± 1.316

b) ± 1.314

c) ± 1.703

d) ± 1.780

e) ± 1.708

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

35. A multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   1411.876       762.1533         1.852483      0.074919
x1          35.18215       96.8433          0.363289      0.719218
x1²         7.721648       3.007943         2.567086      0.016115

            df   SS         MS         F
Regression  2    58567032   29283516   57.34861
Residual    25   12765573   510622.9
Total       27   71332605

Using α = 0.10 to test the null hypothesis H0: β2 = 0, the critical t value is ____.

a) ± 1.316

b) ± 1.314

c) ± 1.703

d) ± 1.780

e) ± 1.708

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

36. A multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   1411.876       762.1533         1.852483      0.074919
x1          35.18215       96.8433          0.363289      0.719218
x1²         7.721648       3.007943         2.567086      0.016115

            df   SS         MS         F
Regression  2    58567032   29283516   57.34861
Residual    25   12765573   510622.9
Total       27   71332605

For x1 = 10, the predicted value of y is ____________.

a) 8.88

b) 2,031.38

c) 2,538.86

d) 262.19

e) 2,535.86

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.
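The prediction in question 36 is just substitution into the fitted quadratic equation; a quick arithmetic check:

```python
# coefficients copied from the table in question 36
b0, b1, b2 = 1411.876, 35.18215, 7.721648
x1 = 10
y_hat = b0 + b1 * x1 + b2 * x1 ** 2
print(round(y_hat, 2))  # 2535.86 -- option (e)
```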

37. A multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   1411.876       762.1533         1.852483      0.074919
x1          35.18215       96.8433          0.363289      0.719218
x1²         7.721648       3.007943         2.567086      0.016115

            df   SS         MS         F
Regression  2    58567032   29283516   57.34861
Residual    25   12765573   510622.9
Total       27   71332605

For x1 = 20, the predicted value of y is ____________.

a) 5,204.18

b) 2,031.38

c) 2,538.86

d) 6,262.19

e) 6,535.86

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

38. A local parent group was concerned with the increasing cost of school for families with school-aged children. The parent group was interested in understanding the relationship between the academic grade level of the child and the total cost spent per child per academic year. They performed a multiple regression analysis using total cost as the dependent variable and academic year (x1) as the independent variable. The multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   707.9144       435.1183         1.626947      0.114567
x1          2.903307       81.62802         0.035568      0.971871
x1²         11.91297       3.806211         3.129878      0.003967

            df   SS         MS         F          p-value
Regression  2    32055153   16027577   47.34557   1.49E-09
Residual    27   9140128    338523.3
Total       29   41195281

The regression equation for this analysis is ____________.

a) ŷ = 707.9144 + 2.903307 x1 + 11.91297 x1²

b) ŷ = 707.9144 + 435.1183 x1 + 1.626947 x1²

c) ŷ = 435.1183 + 81.62802 x1 + 3.806211 x1²

d) ŷ = 1.626947 + 0.035568 x1 + 3.129878 x1²

e) ŷ = 1.626947 + 0.035568 x1 - 3.129878 x1²

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

39. A local parent group was concerned with the increasing cost of school for families with school-aged children. The parent group was interested in understanding the relationship between the academic grade level of the child and the total cost spent per child per academic year. They performed a multiple regression analysis using total cost as the dependent variable and academic year (x1) as the independent variable. The multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   707.9144       435.1183         1.626947      0.114567
x1          2.903307       81.62802         0.035568      0.971871
x1²         11.91297       3.806211         3.129878      0.003967

            df   SS         MS         F          p-value
Regression  2    32055153   16027577   47.34557   1.49E-09
Residual    27   9140128    338523.3
Total       29   41195281

The sample size for this analysis is ____________.

a) 27

b) 29

c) 30

d) 25

e) 28

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

40. A local parent group was concerned with the increasing cost of school for families with school-aged children. The parent group was interested in understanding the relationship between the academic grade level of the child and the total cost spent per child per academic year. They performed a multiple regression analysis using total cost as the dependent variable and academic year (x1) as the independent variable. The multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   707.9144       435.1183         1.626947      0.114567
x1          2.903307       81.62802         0.035568      0.971871
x1²         11.91297       3.806211         3.129878      0.003967

            df   SS         MS         F          p-value
Regression  2    32055153   16027577   47.34557   1.49E-09
Residual    27   9140128    338523.3
Total       29   41195281

Using α = 0.01 to test the null hypothesis H0: β1 = β2 = 0, the critical F value is ____.

a) 5.42

b) 5.49

c) 7.60

d) 3.35

e) 2.49

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

41. A local parent group was concerned with the increasing cost of school for families with school-aged children. The parent group was interested in understanding the relationship between the academic grade level of the child and the total cost spent per child per academic year. They performed a multiple regression analysis using total cost as the dependent variable and academic year (x1) as the independent variable. The multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   707.9144       435.1183         1.626947      0.114567
x1          2.903307       81.62802         0.035568      0.971871
x1²         11.91297       3.806211         3.129878      0.003967

            df   SS         MS         F          p-value
Regression  2    32055153   16027577   47.34557   1.49E-09
Residual    27   9140128    338523.3
Total       29   41195281

Using α = 0.05 to test the null hypothesis H0: β1 = 0, the critical t value is ____.

a) ± 1.311

b) ± 1.699

c) ± 1.703

d) ± 2.502

e) ± 2.052

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

42. A local parent group was concerned with the increasing cost of school for families with school-aged children. The parent group was interested in understanding the relationship between the academic grade level of the child and the total cost spent per child per academic year. They performed a multiple regression analysis using total cost as the dependent variable and academic year (x1) as the independent variable. The multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   707.9144       435.1183         1.626947      0.114567
x1          2.903307       81.62802         0.035568      0.971871
x1²         11.91297       3.806211         3.129878      0.003967

            df   SS         MS         F          p-value
Regression  2    32055153   16027577   47.34557   1.49E-09
Residual    27   9140128    338523.3
Total       29   41195281

Using α = 0.05 to test the null hypothesis H0: β2 = 0, the critical t value is ____.

a) ± 1.311

b) ± 1.699

c) ± 1.703

d) ± 2.052

e) ± 2.502

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

43. A local parent group was concerned with the increasing cost of school for families with school-aged children. The parent group was interested in understanding the relationship between the academic grade level of the child and the total cost spent per child per academic year. They performed a multiple regression analysis using total cost as the dependent variable and academic year (x1) as the independent variable. The multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   707.9144       435.1183         1.626947      0.114567
x1          2.903307       81.62802         0.035568      0.971871
x1²         11.91297       3.806211         3.129878      0.003967

            df   SS         MS         F          p-value
Regression  2    32055153   16027577   47.34557   1.49E-09
Residual    27   9140128    338523.3
Total       29   41195281

These results indicate that ____________.

a) none of the predictor variables is significant at the 5% level

b) each predictor variable is significant at the 5% level

c) x1 is the only predictor variable significant at the 5% level

d) x12 is the only predictor variable significant at the 5% level

e) each predictor variable is insignificant at the 5% level

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

44. A local parent group was concerned with the increasing cost of school for families with school-aged children. The parent group was interested in understanding the relationship between the academic grade level of the child and the total cost spent per child per academic year. They performed a multiple regression analysis using total cost as the dependent variable and academic year (x1) as the independent variable. The multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   707.9144       435.1183         1.626947      0.114567
x1          2.903307       81.62802         0.035568      0.971871
x1²         11.91297       3.806211         3.129878      0.003967

            df   SS         MS         F          p-value
Regression  2    32055153   16027577   47.34557   1.49E-09
Residual    27   9140128    338523.3
Total       29   41195281

For a child in grade 5 (x1 = 5), the predicted value of y is ____________.

a) 707.91

b) 1,020.26

c) 781.99

d) 840.06

e) 1,078.32

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Easy

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

45. A local parent group was concerned with the increasing cost of school for families with school-aged children. The parent group was interested in understanding the relationship between the academic grade level of the child and the total cost spent per child per academic year. They performed a multiple regression analysis using total cost as the dependent variable and academic year (x1) as the independent variable. The multiple regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   707.9144       435.1183         1.626947      0.114567
x1          2.903307       81.62802         0.035568      0.971871
x1²         11.91297       3.806211         3.129878      0.003967

            df   SS         MS         F          p-value
Regression  2    32055153   16027577   47.34557   1.49E-09
Residual    27   9140128    338523.3
Total       29   41195281

For a child in grade 10 (x1 = 10), the predicted value of y is ____________.

a) 707.91

b) 1,117.38

c) 856.08

d) 2,189.54

e) 1,928.24

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Easy

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

46. After transforming the y-variable values into log y, a regression analysis produced the following tables.

            Coefficients   Standard Error   t Statistic   p-value
Intercept   2.005349       0.097351         20.59923      4.81E-18
x           0.027126       0.009518         2.849843      0.008275

            df   SS         MS         F          p-value
Regression  1    0.196642   0.196642   8.121607   0.008447
Residual    26   0.629517   0.024212
Total       27   0.826159

For x = 10, the predicted value of y is ____________.

a) 155.79

b) 1.25

c) 2.42

d) 189.06

e) 18.90

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Easy

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.
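Question 46 requires undoing the log transform: compute the fitted value on the log scale, then raise 10 to that power (assuming a base-10 log, which matches answer choice d):

```python
# coefficients copied from the table in question 46 (model: log10(y) = b0 + b1*x)
b0, b1 = 2.005349, 0.027126
log_y_hat = b0 + b1 * 10   # fitted value on the log scale
y_hat = 10 ** log_y_hat    # back-transform to the original scale
print(round(y_hat, 2))  # approximately 189.06 -- option (d)
```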

47. In multiple regression analysis, qualitative variables are sometimes referred to as ___.

a) dummy variables

b) quantitative variables

c) dependent variables

d) performance variables

e) cardinal variables

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis

48. If a qualitative variable has 4 categories, how many dummy variables must be created and used in the regression analysis?

a) 3

b) 4

c) 5

d) 6

e) 7

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis

49. If a qualitative variable has "c" categories, how many dummy variables must be created and used in the regression analysis?

a) c - 1

b) c

c) c + 1

d) c - 2

e) 4 + c

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis
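The c − 1 rule in questions 14, 15, 48, and 49 can be sketched as a small encoding helper (the function name and data are illustrative): one category is omitted and serves as the reference level.

```python
def dummy_columns(values, categories):
    """Encode a qualitative variable with c categories as c - 1 indicator
    columns; the first category is omitted as the reference level."""
    reference, *kept = categories
    return {cat: [1 if v == cat else 0 for v in values] for cat in kept}

regions = ["East", "West", "South", "East", "South"]  # illustrative data
dummies = dummy_columns(regions, ["East", "West", "South"])
print(dummies)  # 2 columns for 3 categories; "East" rows are all zeros
```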

50. Yvonne Yang, VP of Finance at Discrete Components, Inc. (DCI), wants a regression model which predicts the average collection period on credit sales. Her data set includes two qualitative variables: sales discount rates (0%, 2%, 4%, and 6%), and total assets of credit customers (small, medium, and large). The number of dummy variables needed for "sales discount rate" in Yvonne's regression model is ________.

a) 1

b) 2

c) 3

d) 4

e) 7

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

51. Yvonne Yang, VP of Finance at Discrete Components, Inc. (DCI), wants a regression model which predicts the average collection period on credit sales. Her data set includes two qualitative variables: sales discount rates (0%, 2%, 4%, and 6%), and total assets of credit customers (small, medium, and large). The number of dummy variables needed for "total assets of credit customer" in Yvonne's regression model is ________.

a) 1

b) 2

c) 3

d) 4

e) 7

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

52. Hope Hernandez is the new regional Vice President for a large gasoline station chain. She wants a regression model to predict sales in the convenience stores. Her data set includes two qualitative variables: the gasoline station location (inner city, freeway, and suburbs), and curb appeal of the convenience store (low, medium, and high). The number of dummy variables needed for "curb appeal" in Hope's regression model is ______.

a) 1

b) 2

c) 3

d) 4

e) 5

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

53. Hope Hernandez is the new regional Vice President for a large gasoline station chain. She wants a regression model to predict sales in the convenience stores. Her data set includes two qualitative variables: the gasoline station location (inner city, freeway, and suburbs), and curb appeal of the convenience store (low, medium, and high). The number of dummy variables needed for Hope's regression model is ______.

a) 2

b) 4

c) 6

d) 8

e) 9

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.
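The "c − 1" dummy-variable rule in the questions above can be checked with a short sketch. This is a minimal illustration, not library code; the function name `dummy_code` is our own, and the categories are the curb-appeal levels from question 52:

```python
def dummy_code(value, categories):
    """Encode a qualitative variable with c categories as c - 1 dummy
    variables; the first category serves as the reference level."""
    return [1 if value == cat else 0 for cat in categories[1:]]

curb_appeal = ["low", "medium", "high"]     # c = 3 categories
print(dummy_code("medium", curb_appeal))    # [1, 0] -> only 2 dummies needed
print(dummy_code("low", curb_appeal))       # [0, 0] -> the reference level
```

The reference level needs no dummy of its own: it is identified by all dummies being 0, which is why c categories require only c − 1 indicator variables.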

54. Alan Bissell, a market analyst for City Sound Online Mart, is analyzing sales from heavy metal song downloads. Alan’s dependent variable is annual heavy metal song download sales (in $1,000,000's), and his independent variables are website visitors (in 1,000's) and type of download format requested (0 = MP3, 1 = other). Regression analysis of the data yielded the following tables.

                        Coefficients   Standard Error   t Statistic   p-value
Intercept               1.7            0.384212         4.424638      0.00166
x1 (website visitors)   0.04           0.014029         2.851146      0.019054
x2 (download format)    -1.5666667     0.20518          -7.63558      3.21E-05

Alan's model is ________________.

a) ŷ = 1.7 + 0.384212 x1 + 4.424638 x2 + 0.00166 x3

b) ŷ = 1.7 + 0.04 x1 + 1.5666667 x2

c) ŷ = 0.384212 + 0.014029 x1 + 0.20518 x2

d) ŷ = 4.424638 + 2.851146 x1 - 7.63558 x2

e) ŷ = 1.7 + 0.04 x1 - 1.5666667 x2

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Easy

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

55. Alan Bissell, a market analyst for City Sound Online Mart, is analyzing sales from heavy metal song downloads. Alan’s dependent variable is annual heavy metal song download sales (in $1,000,000's), and his independent variables are website visitors (in 1,000's) and type of download format requested (0 = MP3, 1 = other). Regression analysis of the data yielded the following tables.

                        Coefficients   Standard Error   t Statistic   p-value
Intercept               1.7            0.384212         4.424638      0.00166
x1 (website visitors)   0.04           0.014029         2.851146      0.019054
x2 (download format)    -1.5666667     0.20518          -7.63558      3.21E-05

For MP3 sales with 10,000 website visitors, Alan's model predicts annual sales of heavy metal song downloads of ________________.

a) $2,100,000

b) $524,507

c) $533,333

d) $729,683

e) $21,000,000

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Easy

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.
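Predictions from a dummy-variable model like Alan's are obtained by plugging values into the estimated equation. A minimal Python sketch (the function name `predict_sales` is ours; the coefficients come from the table above, with sales in $1,000,000s, visitors in 1,000s, and the format dummy 0 = MP3, 1 = other):

```python
def predict_sales(visitors_thousands, other_format):
    # y-hat = 1.7 + 0.04*x1 - 1.5666667*x2, coefficients from the table above
    b0, b1, b2 = 1.7, 0.04, -1.5666667
    return b0 + b1 * visitors_thousands + b2 * other_format

mp3 = predict_sales(10, 0)     # 10,000 visitors, MP3 format
other = predict_sales(10, 1)   # 10,000 visitors, 'other' format
print(round(mp3, 4), round(other, 4))   # -> 2.1 0.5333 (in $millions)
```

This reproduces the answers to questions 55 and 56: $2,100,000 for MP3 and about $533,333 for the 'other' format.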

56. Alan Bissell, a market analyst for City Sound Online Mart, is analyzing sales from heavy metal song downloads. Alan’s dependent variable is annual heavy metal song download sales (in $1,000,000's), and his independent variables are website visitors (in 1,000's) and type of download format requested (0 = MP3, 1 = other). Regression analysis of the data yielded the following tables.

                        Coefficients   Standard Error   t Statistic   p-value
Intercept               1.7            0.384212         4.424638      0.00166
x1 (website visitors)   0.04           0.014029         2.851146      0.019054
x2 (download format)    -1.5666667     0.20518          -7.63558      3.21E-05

For ‘other’ download formats with 10,000 website visitors, Alan's model predicts annual sales of heavy metal song downloads of ________________.

a) $2,100,000

b) $524,507

c) $533,333

d) $729,683

e) $210,000

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Easy

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

57. Alan Bissell, a market analyst for City Sound Online Mart, is analyzing sales from heavy metal song downloads. Alan’s dependent variable is annual heavy metal song download sales (in $1,000,000's), and his independent variables are website visitors (in 1,000's) and type of download format requested (0 = MP3, 1 = other). Regression analysis of the data yielded the following tables.

                        Coefficients   Standard Error   t Statistic   p-value
Intercept               1.7            0.384212         4.424638      0.00166
x1 (website visitors)   0.04           0.014029         2.851146      0.019054
x2 (download format)    -1.5666667     0.20518          -7.63558      3.21E-05

For the same number of website visitors, what is the difference between the predicted sales for MP3 versus ‘other’ heavy metal song downloads?

a) $1,566,666 higher sales for ‘other’ formats

b) the same sales for both formats

c) $1,566,666 lower sales for the ‘other’ format

d) $1,700,000 higher sales for the MP3 format

e) $1,700,000 lower sales for the ‘other’ format

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

58. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is analyzing household budget data collected by her firm. Abby's dependent variable is weekly household expenditures on groceries (in $'s), and her independent variables are annual household income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis of the data yielded the following table.

                    Coefficients   Standard Error   t Statistic   p-value
Intercept           19.68247       10.01176         1.965934      0.077667
x1 (income)         1.735272       0.174564         9.940612      1.68E-06
x2 (neighborhood)   49.12456       7.655776         6.416667      7.67E-05

Abby's model is ________________.

a) ŷ = 19.68247 + 10.01176 x1 + 1.965934 x2

b) ŷ = 1.965934 + 9.940612 x1 + 6.416667 x2

c) ŷ = 10.01176 + 0.174564 x1 + 7.655776 x2

d) ŷ = 19.68247 - 1.735272 x1 + 49.12456 x2

e) ŷ = 19.68247 + 1.735272 x1 + 49.12456 x2

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Easy

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

59. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is analyzing household budget data collected by her firm. Abby's dependent variable is weekly household expenditures on groceries (in $'s), and her independent variables are annual household income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis of the data yielded the following table.

                    Coefficients   Standard Error   t Statistic   p-value
Intercept           19.68247       10.01176         1.965934      0.077667
x1 (income)         1.735272       0.174564         9.940612      1.68E-06
x2 (neighborhood)   49.12456       7.655776         6.416667      7.67E-05

For a rural household with $90,000 annual income, Abby's model predicts weekly grocery expenditure of ________________.

a) $156.19

b) $224.98

c) $444.62

d) $141.36

e) $175.86

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

60. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is analyzing household budget data collected by her firm. Abby's dependent variable is weekly household expenditures on groceries (in $'s), and her independent variables are annual household income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis of the data yielded the following table.

                    Coefficients   Standard Error   t Statistic   p-value
Intercept           19.68247       10.01176         1.965934      0.077667
x1 (income)         1.735272       0.174564         9.940612      1.68E-06
x2 (neighborhood)   49.12456       7.655776         6.416667      7.67E-05

For a suburban household with $90,000 annual income, Abby's model predicts weekly grocery expenditure of ________________.

a) $156.19

b) $224.98

c) $444.62

d) $141.36

e) $175.86

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

61. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is analyzing household budget data collected by her firm. Abby's dependent variable is weekly household expenditures on groceries (in $'s), and her independent variables are annual household income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis of the data yielded the following table.

                    Coefficients   Standard Error   t Statistic   p-value
Intercept           19.68247       10.01176         1.965934      0.077667
x1 (income)         1.735272       0.174564         9.940612      1.68E-06
x2 (neighborhood)   49.12456       7.655776         6.416667      7.67E-05

For two households, one suburban and one rural, Abby's model predicts ________.

a) equal weekly expenditures for groceries

b) the suburban household's weekly expenditures for groceries will be $49 more

c) the rural household's weekly expenditures for groceries will be $49 more

d) the suburban household's weekly expenditures for groceries will be $8 more

e) the rural household's weekly expenditures for groceries will be $49 less

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.
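The grocery-expenditure predictions in questions 59–61 can be verified the same way. A minimal sketch (the function name `weekly_groceries` is ours; coefficients are from the table above, with income in $1,000s and the neighborhood dummy 0 = suburban, 1 = rural):

```python
def weekly_groceries(income_thousands, rural):
    # y-hat = 19.68247 + 1.735272*x1 + 49.12456*x2, from the table above
    b0, b1, b2 = 19.68247, 1.735272, 49.12456
    return b0 + b1 * income_thousands + b2 * rural

print(round(weekly_groceries(90, 1), 2))   # rural, $90,000 income -> 224.98
print(round(weekly_groceries(90, 0), 2))   # suburban, $90,000 income -> 175.86
```

The two predictions differ by exactly the dummy coefficient, about $49, which is the point of question 61.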

62. Which of the following iterative search procedures for model-building in a multiple regression analysis reevaluates the contribution of variables previously included in the model after entering a new independent variable?

a) Backward elimination

b) Stepwise regression

c) Forward selection

d) All possible regressions

e) Backward selection

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Medium

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.

63. Which of the following iterative search procedures for model-building in a multiple regression analysis starts with all independent variables in the model and then drops non-significant independent variables in a step-by-step manner?

a) Backward elimination

b) Stepwise regression

c) Forward selection

d) All possible regressions

e) Backward selection

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Medium

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.

64. Which of the following iterative search procedures for model-building in a multiple regression analysis adds variables to the model as it proceeds, but does not reevaluate the contribution of previously entered variables?

a) Backward elimination

b) Stepwise regression

c) Forward selection

d) All possible regressions

e) Forward elimination

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Medium

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.

65. An "all possible regressions" search of a data set containing 7 independent variables will produce ______ regressions.

a) 13

b) 127

c) 48

d) 64

e) 97

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Hard

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.

66. An "all possible regressions" search of a data set containing 5 independent variables will produce ______ regressions.

a) 31

b) 10

c) 25

d) 32

e) 24

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Hard

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.

67. An "all possible regressions" search of a data set containing 8 independent variables will produce ______ regressions.

a) 8

b) 15

c) 256

d) 64

e) 255

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Hard

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.

68. An "all possible regressions" search of a data set containing "k" independent variables will produce __________ regressions.

a) 2^k - 1

b) 2k - 1

c) k^2 - 1

d) 2^(k-1)

e) 2^k

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Medium

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.
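The counts in questions 65–68 follow from the fact that each of the k independent variables is either included in or excluded from a model, minus the empty model with no predictors. A one-line check (the function name is ours):

```python
def num_possible_regressions(k):
    # each of k predictors is in or out (2^k subsets), minus the empty model
    return 2 ** k - 1

for k in (5, 7, 8):
    print(k, num_possible_regressions(k))   # -> 5 31, 7 127, 8 255
```

These match the answers for 5, 7, and 8 independent variables.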

69. Inspection of the following table of correlation coefficients for variables in a multiple regression analysis reveals that the first independent variable entered by the forward selection procedure will be ___________.

      y          x1         x2         x3         x4         x5
y     1
x1    -0.1661    1
x2    0.231849   -0.51728   1
x3    0.423522   -0.22264   -0.00734   1
x4    -0.33227   0.028957   -0.49869   0.260586   1
x5    0.199796   -0.20467   0.078916   0.207477   0.023839   1

a) x2

b) x3

c) x4

d) x5

e) x1

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Medium

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.

70. Inspection of the following table of correlation coefficients for variables in a multiple regression analysis reveals that the first independent variable entered by the forward selection procedure will be ___________.

      y          x1         x2         x3         x4         x5
y     1
x1    -0.44008   1
x2    0.566053   -0.51728   1
x3    0.064919   -0.22264   -0.00734   1
x4    -0.35711   0.028957   -0.49869   0.260586   1
x5    0.426363   -0.20467   0.078916   0.207477   0.023839   1

a) x1

b) x2

c) x3

d) x4

e) x5

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Easy

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.

71. Inspection of the following table of correlation coefficients for variables in a multiple regression analysis reveals that the first independent variable that will be entered into the regression model by the forward selection procedure will be ___________.

      y          x1         x2         x3         x4        x5
y     1
x1    -0.0857    1
x2    -0.20246   0.868358   1
x3    -0.22631   -0.10604   -0.14853   1
x4    -0.28175   -0.0685    0.41468    -0.14151   1
x5    0.271105   0.150796   0.129388   -0.15243   0.00821   1

a) x1

b) x2

c) x3

d) x4

e) x5

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Easy

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.

72. Inspection of the following table of correlation coefficients for variables in a multiple regression analysis reveals that the first independent variable that will be entered into the regression model by the forward selection procedure will be ___________.

      y          x1         x2         x3         x4         x5
y     1
x1    0.854168   1
x2    -0.11828   -0.00383   1
x3    -0.12003   -0.08499   -0.14523   1
x4    0.525901   0.118169   -0.14876   0.050042   1
x5    -0.18105   -0.07371   0.995886   -0.14151   -0.16934   1

a) x1

b) x2

c) x3

d) x4

e) x5

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Easy

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.
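In forward selection, the first variable entered is the one most strongly correlated with y in absolute value. A sketch using the y-column of the question 72 matrix (the dictionary is hand-copied from that table):

```python
# correlations of each predictor with y, from the question 72 matrix
corr_with_y = {"x1": 0.854168, "x2": -0.11828, "x3": -0.12003,
               "x4": 0.525901, "x5": -0.18105}

# forward selection enters the predictor with the largest |r| with y first
first_entered = max(corr_with_y, key=lambda v: abs(corr_with_y[v]))
print(first_entered)   # -> x1
```

Applying the same rule to questions 69–71 picks x3, x2, and x5 respectively.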

73. Carlos Cavazos, Director of Human Resources, is exploring employee absenteeism at the Plano Piano Plant. A multiple regression analysis was performed using the following variables. The results are presented below.

Variable   Description
y          number of days absent last fiscal year
x1         commuting distance (in miles)
x2         employee's age (in years)
x3         single-parent household (0 = no, 1 = yes)
x4         length of employment at PPP (in years)
x5         shift (0 = day, 1 = night)

            Coefficients   Standard Error   t Statistic   p-value
Intercept   6.594146       3.273005         2.014707      0.047671
x1          -0.18019       0.141949         -1.26939      0.208391
x2          0.268156       0.260643         1.028828      0.307005
x3          -2.31068       0.962056         -2.40182      0.018896
x4          -0.50579       0.270872         -1.86725      0.065937
x5          2.329513       0.940321         2.47736       0.015584

             df   SS         MS        F          p-value
Regression   5    279.358    55.8716   4.423755   0.001532
Residual     67   846.2036   12.6299
Total        72   1125.562

R = 0.498191

R2 = 0.248194

Adj R2 = 0.192089

se = 3.553858

n = 73

Which of the following conclusions can be drawn from the above results?

a) All the independent variables in the regression are significant at the 5% level.

b) Commuting distance is a highly significant (<1%) variable in explaining absenteeism.

c) Age of the employees tends to have a very significant (<1%) effect on absenteeism.

d) This model explains a little over 49% of the variability in absenteeism data.

e) An employee from a single-parent household is expected to be absent fewer days, all other variables held constant, than one who is not.

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Hard

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.

74. Large correlations between two or more independent variables in a multiple regression model could result in the problem of ________.

a) multicollinearity

b) autocorrelation

c) partial correlation

d) rank correlation

e) non-normality

Response: See section 14.4 Multicollinearity

Difficulty: Medium

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.

75. An appropriate method to identify multicollinearity in a regression model is to ____.

a) examine a residual plot

b) examine the ANOVA table

c) examine a correlation matrix

d) examine the partial regression coefficients

e) examine the R2 of the regression model

Response: See section 14.4 Multicollinearity

Difficulty: Medium

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.

76. An acceptable method of managing multicollinearity in a regression model is to ____.

a) use the forward selection procedure

b) use the backward elimination procedure

c) use the forward elimination procedure

d) use the stepwise regression procedure

e) use all possible regressions

Response: See section 14.4 Multicollinearity

Difficulty: Medium

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.

77. A useful technique in controlling multicollinearity involves the use of _________.

a) variance inflation factors

b) a backward elimination procedure

c) a forward elimination procedure

d) a forward selection procedure

e) all possible regressions

Response: See section 14.4 Multicollinearity

Difficulty: Medium

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.

78. Inspection of the following table of correlation coefficients for variables in a multiple regression analysis reveals potential multicollinearity with variables ___________.

      y          x1         x2         x3         x4        x5
y     1
x1    -0.0857    1
x2    -0.20246   0.868358   1
x3    -0.22631   -0.10604   -0.14853   1
x4    -0.28175   -0.0685    0.41468    -0.14151   1
x5    0.271105   0.150796   0.129388   -0.15243   0.00821   1

a) x1 and x2

b) x1 and x4

c) x4 and x5

d) x4 and x3

e) x5 and y

Response: See section 14.4 Multicollinearity

Difficulty: Medium

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.

79. Inspection of the following table of correlation coefficients for variables in a multiple regression analysis reveals potential multicollinearity with variables ___________.

      y          x1         x2         x3         x4         x5
y     1
x1    -0.08301   1
x2    0.236745   -0.51728   1
x3    0.155149   -0.22264   -0.00734   1
x4    0.022234   -0.58079   0.884216   0.131956   1
x5    0.4808     -0.20467   0.078916   0.207477   0.103831   1

a) x1 and x5

b) x2 and x3

c) x4 and x2

d) x4 and x3

e) x4 and y

Response: See section 14.4 Multicollinearity

Difficulty: Medium

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.

80. Inspection of the following table of correlation coefficients for variables in a multiple regression analysis reveals potential multicollinearity with variables ___________.

      y          x1         x2         x3         x4         x5
y     1
x1    0.854168   1
x2    -0.11828   -0.00383   1
x3    -0.12003   -0.08499   -0.14523   1
x4    0.525901   0.118169   -0.14876   0.050042   1
x5    -0.18105   -0.07371   0.995886   -0.14151   -0.16934   1

a) x1 and x2

b) x1 and x5

c) x3 and x4

d) x2 and x5

e) x3 and x5

Response: See section 14.4 Multicollinearity

Difficulty: Medium

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.
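Scanning a correlation matrix for multicollinearity amounts to finding predictor pairs with large absolute correlations. A sketch using the off-diagonal predictor entries of the question 80 matrix (the dictionary is hand-copied from that table):

```python
# pairwise correlations among the predictors, from the question 80 matrix
pairs = {("x1", "x2"): -0.00383, ("x1", "x3"): -0.08499, ("x1", "x4"): 0.118169,
         ("x1", "x5"): -0.07371, ("x2", "x3"): -0.14523, ("x2", "x4"): -0.14876,
         ("x2", "x5"): 0.995886, ("x3", "x4"): 0.050042, ("x3", "x5"): -0.14151,
         ("x4", "x5"): -0.16934}

# the pair with the largest |r| is the multicollinearity suspect
worst = max(pairs, key=lambda p: abs(pairs[p]))
print(worst, pairs[worst])   # -> ('x2', 'x5') 0.995886
```

Note that correlations between y and a predictor are not multicollinearity; only large correlations among the predictors themselves are.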



81. Suppose a company is interested in understanding the effect of age and sex on the likelihood a customer will purchase a new product. The data analyst intends to run a logistic regression on her data. Which of the following variable(s) will the analyst need to code as 0 or 1 prior to performing the logistic regression analysis?

a) age and sex

b) age and purchase status

c) age

d) purchase status

e) sex and purchase status

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Easy

Learning Objective: 14.5: Explain when to use logistic regression, and interpret its results.



82. Suppose a community-based political group is interested in determining if there is a relationship between the years that a candidate has lived in a community, the number of volunteer hours the candidate gives to the community, and the outcome for the candidate in the local city council election. Which of the following statements is not true about the experimental design?

a) The election outcome (win/ lose) is the response variable.

b) The number of hours the candidate gives in volunteering is the dependent variable.

c) A correlation analysis should evaluate possible multicollinearity between the years a candidate lives in a community and the number of volunteer hours.

d) The number of years a candidate lives in a community is an independent variable.

e) The only variable that must be recoded 0 or 1 is the response variable.

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Hard

Learning Objective: 14.5: Explain when to use logistic regression, and interpret its results.

83. A research project was conducted to study the effect of smoking and weight on resting pulse rate. The response variable is coded as 1 when the pulse rate is low and 0 when it is high. Smoking is coded as 1 for a smoker and 0 for a nonsmoker. Shown below is the Minitab output from a logistic regression.

Response Information

Variable Value Count

Rating Pulse 1 70 (Event)

0 22

Total 92

Logistic Regression Table

Odds 95% CI

Predictor Coef SE Coef Z P Ratio Lower Upper

Constant -1.98717 1.67930 -1.18 0.237

Weight 0.0250226 0.0122551 2.04 0.041 1.03 1.00 1.05

Smokes -1.19297 0.552980 -2.16 0.031 0.30 0.10 0.90

Log-Likelihood = -46.820

Test that all slopes are zero: G = 7.574, DF = 2, P-Value = 0.023

The log of the odds ratio or logit equation is:

a) ln(S)=-1.19297+0.0250226 Weight-1.98717 Smokes

b) S=-1.98717+0.025226 Weight-1.19297 Smokes

c) Rating Pulse=-1.98717+0.025226 Weight-1.19297 Smokes

d) ln(S) =-1.98717+0.025226 Weight-1.19297 Smokes

e) ln(p)=-1.98717+0.025226 Weight-1.19297 Smokes

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Easy

Learning Objective: 14.5: Explain when to use logistic regression, and interpret its results.



84. A research project was conducted to study the effect of smoking and weight on resting pulse rate. The response variable is coded as 1 when the pulse rate is low and 0 when it is high. Smoking is coded as 1 for a smoker and 0 for a nonsmoker. Shown below is the Minitab output from a logistic regression.

Response Information

Variable Value Count

Rating Pulse 1 70 (Event)

0 22

Total 92

Logistic Regression Table

Odds 95% CI

Predictor Coef SE Coef Z P Ratio Lower Upper

Constant -1.98717 1.67930 -1.18 0.237

Weight 0.0250226 0.0122551 2.04 0.041 1.03 1.00 1.05

Smokes -1.19297 0.552980 -2.16 0.031 0.30 0.10 0.90

Log-Likelihood = -46.820

Test that all slopes are zero: G = 7.574, DF = 2, P-Value = 0.023

The predicted probability that a 150-pound person who smokes has a low pulse rate is closest to __________.

a) 0.6395

b) 0.8540

c) 0.2145

d) 0.5

e) 0.9871

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Hard

Learning Objective: 14.5: Explain when to use logistic regression, and interpret its results.



85. A research project was conducted to study the effect of smoking and weight on resting pulse rate. The response variable is coded as 1 when the pulse rate is low and 0 when it is high. Smoking is coded as 1 for a smoker and 0 for a nonsmoker. Shown below is the Minitab output from a logistic regression.

Response Information

Variable Value Count

Rating Pulse 1 70 (Event)

0 22

Total 92

Logistic Regression Table

Odds 95% CI

Predictor Coef SE Coef Z P Ratio Lower Upper

Constant -1.98717 1.67930 -1.18 0.237

Weight 0.0250226 0.0122551 2.04 0.041 1.03 1.00 1.05

Smokes -1.19297 0.552980 -2.16 0.031 0.30 0.10 0.90

Log-Likelihood = -46.820

Test that all slopes are zero: G = 7.574, DF = 2, P-Value = 0.023

The predicted probability that a 150-pound person who does not smoke has a low pulse rate is closest to _______.

a) 0.6395

b) 0.8540

c) 0.2145

d) 0.5

e) 0.9871

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Hard

Learning Objective: 14.5: Explain when to use logistic regression, and interpret its results.

86. A research project was conducted to study the effect of smoking and weight on resting pulse rate. The response variable is coded as 1 when the pulse rate is low and 0 when it is high. Smoking is coded as 1 for a smoker and 0 for a nonsmoker. Shown below is the Minitab output from a logistic regression.

Response Information

Variable Value Count

Rating Pulse 1 70 (Event)

0 22

Total 92

Logistic Regression Table

Odds 95% CI

Predictor Coef SE Coef Z P Ratio Lower Upper

Constant -1.98717 1.67930 -1.18 0.237

Weight 0.0250226 0.0122551 2.04 0.041 1.03 1.00 1.05

Smokes -1.19297 0.552980 -2.16 0.031 0.30 0.10 0.90

Log-Likelihood = -46.820

Test that all slopes are zero: G = 7.574, DF = 2, P-Value = 0.023

The predicted probability that a 150-pound person who does not smoke has a high pulse rate is closest to _______.

a) 0.6395

b) 0.8540

c) 0.2145

d) 0.1459

e) 0.9871

Ans: d

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Hard

Learning Objective: 14.5: Explain when to use logistic regression, and interpret its results.
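The predicted probabilities in questions 84–86 come from inverting the logit: p = 1/(1 + e^(−z)), where z = −1.98717 + 0.0250226·Weight − 1.19297·Smokes from the Minitab table. A minimal sketch (the function name `prob_low_pulse` is ours):

```python
import math

def prob_low_pulse(weight, smokes):
    # logit from the Minitab coefficient table above
    z = -1.98717 + 0.0250226 * weight - 1.19297 * smokes
    return 1.0 / (1.0 + math.exp(-z))

smoker = prob_low_pulse(150, 1)      # -> about 0.6395
nonsmoker = prob_low_pulse(150, 0)   # -> about 0.8540
high_pulse = 1 - nonsmoker           # -> about 0.146 (complement event)
print(round(smoker, 4), round(nonsmoker, 4), round(high_pulse, 4))
```

Because the event modeled is "low pulse rate," the probability of a high pulse rate is one minus the fitted probability, which is how question 86 follows from question 85.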

87. A research project was conducted to study the effect of smoking and weight on resting pulse rate. The response variable is coded as 1 when the pulse rate is low and 0 when it is high. Smoking is coded as 1 for a smoker and 0 for a nonsmoker. Shown below is the Minitab output from a logistic regression.

Response Information

Variable Value Count

Rating Pulse 1 70 (Event)

0 22

Total 92

Logistic Regression Table

Odds 95% CI

Predictor Coef SE Coef Z P Ratio Lower Upper

Constant -1.98717 1.67930 -1.18 0.237

Weight 0.0250226 0.0122551 2.04 0.041 1.03 1.00 1.05

Smokes -1.19297 0.552980 -2.16 0.031 0.30 0.10 0.90

Log-Likelihood = -46.820

Test that all slopes are zero: G = 7.574, DF = 2, P-Value = 0.023

Based on the results, the null hypothesis that all of the predictor coefficients are zero should __________________.

a) not be rejected as the p-value is 7.574

b) be rejected as the p-value is -46.82

c) be rejected as the p-value is 0.023

d) not be rejected as the p-value is 0.023

e) not be rejected as there is insufficient information

Ans: c

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Hard

Learning Objective: 14.5: Explain when to use logistic regression, and interpret its results.

88. A research project was conducted to study the effect of smoking and weight on resting pulse rate. The response variable is coded as 1 when the pulse rate is low and 0 when it is high. Smoking is coded as 1 for a smoker and 0 for a nonsmoker. Shown below is the Minitab output from a logistic regression.

Response Information

Variable        Value   Count
Resting Pulse   1          70  (Event)
                0          22
Total                      92

Logistic Regression Table

                                                  Odds    95% CI
Predictor   Coef        SE Coef     Z      P      Ratio   Lower   Upper
Constant   -1.98717     1.67930    -1.18   0.237
Weight      0.0250226   0.0122551   2.04   0.041  1.03    1.00    1.05
Smokes     -1.19297     0.552980   -2.16   0.031  0.30    0.10    0.90

Log-Likelihood = -46.820

Test that all slopes are zero: G = 7.574, DF = 2, P-Value = 0.023

Which of the individual predictors are found to be significant predictors of resting pulse rate at an alpha of 0.05?

a) weight

b) constant and weight

c) constant

d) smokes and constant

e) smokes and weight

Ans: e

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Hard

Learning Objective: 14.5: Explain when to use logistic regression, and interpret its results.
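The individual z tests can be verified the same way; a minimal sketch using only the standard library (the two-sided normal p-value for a z statistic is erfc(|z|/√2)):

```python
import math

def two_sided_p(z):
    """Two-sided p-value for a standard-normal test statistic."""
    return math.erfc(abs(z) / math.sqrt(2))

# z values reported by Minitab for each predictor
p_weight = two_sided_p(2.04)   # Weight
p_smokes = two_sided_p(-2.16)  # Smokes
print(round(p_weight, 3), round(p_smokes, 3))
```

Both p-values fall below 0.05, so Weight and Smokes are both significant (option e), while the constant's p-value of 0.237 is not.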

89. A multiple regression analysis produced the following tables.

             Coefficients   Standard Error   t Statistic   p-value
Intercept    1411.876       7.621533
x1           35.18215       96.8433
x1²          −7.721648      3.007943

             df   SS         MS         F
Regression    2   58567032   29283516   57.34861
Residual     25   12765573   510622.9
Total        27   71332605

The minimum value of the predicted value of the dependent variable is reached when x1 = ______.

a) 2.15815

b) 3.18512

c) 3.37785

d) 3.40125

e) a value not listed here

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Hard

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.
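For a fitted quadratic ŷ = b0 + b1x + b2x², the extremum occurs at x = −b1/(2b2). A minimal sketch, taking the printed coefficients at face value (note that with b2 < 0 the parabola opens downward, so this vertex is strictly a maximum):

```python
# Vertex (extremum) of the fitted quadratic y-hat = b0 + b1*x + b2*x^2
b1, b2 = 35.18215, -7.721648
x_vertex = -b1 / (2 * b2)
print(round(x_vertex, 4))
```

The vertex lands near x1 = 2.28 with these coefficients.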

90. A multiple regression analysis produced the following tables.

             Coefficients   Standard Error   t Statistic   p-value
Intercept    1411.876       7.621533
x1           35.18215       96.8433
x1²          −7.721648      3.007943

             df   SS         MS         F
Regression    2   58567032   29283516   57.34861
Residual     25   12765573   510622.9
Total        27   71332605

If the predicted value of the dependent variable is 1000, then x1 = ______.

a) 4.50666

b) 8.158666

c) 9.928668

d) 10.15866

e) 11.928666

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Hard

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.
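Setting the fitted quadratic equal to 1000 and applying the quadratic formula gives the answer; a minimal sketch using the printed coefficients:

```python
import math

# Solve b0 + b1*x + b2*x^2 = 1000 for x
b0, b1, b2 = 1411.876, 35.18215, -7.721648
a, b, c = b2, b1, b0 - 1000  # rearranged to a*x^2 + b*x + c = 0
disc = math.sqrt(b * b - 4 * a * c)
roots = ((-b + disc) / (2 * a), (-b - disc) / (2 * a))
print([round(r, 5) for r in roots])
```

The positive root, about 9.9287, matches option c.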

91. A multiple regression analysis produced the following tables.

             Coefficients   Standard Error   t Statistic   p-value
Intercept    1411.876       7.621533
x1           35.18215       96.8433
x1²          −7.721648      3.007943

             df   SS         MS         F
Regression    2   58567032   29283516   57.34861
Residual     21   12765573   510622.9
Total        23   71332605

Using α = 0.01 to test the null hypothesis H0: β1 = β2 = 0, the critical F value is ______.

a) 8.09

b) 6.89

c) 6.09

d) 5.78

e) 5.09

Response: See section 14.1 Nonlinear Models: Mathematical Transformation

Difficulty: Medium

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.1: Generalize linear regression models as polynomial regression models using model transformation and Tukey’s ladder of transformation, accounting for possible interaction among the independent variables.

92. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is analyzing household budget data collected by her firm. Abby's dependent variable is weekly household expenditures on groceries (in $'s), and her independent variables are annual household income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis of the data yielded the following table.

                    Coefficients   Standard Error   t Statistic   p-value
Intercept           19.68247       10.01176         1.965934      0.077667
x1 (income)         1.735272       0.174564         9.940612      1.68E-06
x2 (neighborhood)   49.12456       7.655776         6.416667      7.67E-05

For a rural household with ______ annual income, Abby's model predicts weekly grocery expenditure of $235.

a) $90,753.44

b) $92,257.51

c) $94,734.77

d) $95,773.44

e) $96,737.71

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

93. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is analyzing household budget data collected by her firm. Abby's dependent variable is weekly household expenditures on groceries (in $'s), and her independent variables are annual household income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis of the data yielded the following table.

                    Coefficients   Standard Error   t Statistic   p-value
Intercept           19.68247       10.01176         1.965934      0.077667
x1 (income)         1.735272       0.174564         9.940612      1.68E-06
x2 (neighborhood)   49.12456       7.655776         6.416667      7.67E-05

For a suburban household with ______ annual income, Abby's model predicts weekly grocery expenditure of $235.

a) $120,082.9

b) $122,082.9

c) $124,082.9

d) $126,082.9

e) $128,082.9

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.
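Both of the preceding questions invert the same fitted equation, 235 = b0 + b1(income) + b2(neighborhood); a minimal sketch:

```python
# Income (in $1,000's) at which predicted weekly grocery spending is $235
b0, b1, b2 = 19.68247, 1.735272, 49.12456

rural_income = (235 - b0 - b2) / b1  # neighborhood = 1 (rural)
suburban_income = (235 - b0) / b1    # neighborhood = 0 (suburban)
print(round(rural_income * 1000, 2), round(suburban_income * 1000, 2))
```

This reproduces roughly $95,773.44 for the rural household (question 92) and $124,082.9 for the suburban household (question 93).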

94. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is analyzing household budget data collected by her firm. Abby's dependent variable is weekly household expenditures on groceries (in $'s), and her independent variables are annual household income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis of the data yielded the following table.

                    Coefficients   Standard Error   t Statistic   p-value
Intercept           19.68247       10.01176         1.965934      0.077667
x1 (income)         1.735272       0.174564         9.940612      1.68E-06
x2 (neighborhood)   49.12456       7.655776         6.416667      7.67E-05

For what annual income will a suburban household and a rural household have the same predicted weekly grocery spending?

a) $120,082.9

b) $122,082.9

c) $124,082.9

d) The predicted weekly grocery spending of a rural household will always be larger for a given annual income.

e) The predicted weekly grocery spending of a suburban household will always be larger for a given annual income.

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Medium

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

95. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is analyzing household budget data collected by her firm. Abby's dependent variable is weekly household expenditures on groceries (in $'s), and her independent variables are annual household income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis of the data yielded the following table.

                    Coefficients   Standard Error   t Statistic   p-value
Intercept           19.68247       10.01176         1.965934      0.077667
x1 (income)         1.735272       0.174564         9.940612      1.68E-06
x2 (neighborhood)   49.12456       7.655776         6.416667      7.67E-05

However, Abby has reason to believe that suburban households actually have a higher propensity to spend on groceries than rural households, as they tend to make more impulse purchasing decisions. In this new model, the income coefficient for suburban households is 1.735272α, for some α > 1. If new data confirm that suburban households with an annual income of $95,773.44 have the same weekly grocery spending as rural households, then α = ______.

a) 1.055587

b) 1.165587

c) 1.245587

d) 1.275587

e) 1.295587

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Hard

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.
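Setting the modified suburban prediction equal to the rural prediction at the same income and solving for α gives α = 1 + b2/(b1·income); a minimal sketch:

```python
# b0 + (b1*alpha)*inc = b0 + b1*inc + b2  =>  alpha = 1 + b2/(b1*inc)
b1, b2 = 1.735272, 49.12456
inc = 95.77344  # income in $1,000's
alpha = 1 + b2 / (b1 * inc)
print(round(alpha, 6))
```

The result, about 1.295587, matches option e.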

96. Abby Kratz, a market specialist at the market research firm of Saez, Sikes, and Spitz, is analyzing household budget data collected by her firm. Abby's dependent variable is weekly household expenditures on groceries (in $'s), and her independent variables are annual household income (in $1,000's) and household neighborhood (0 = suburban, 1 = rural). Regression analysis of the data yielded the following table.

                    Coefficients   Standard Error   t Statistic   p-value
Intercept           19.68247       10.01176         1.965934      0.077667
x1 (income)         1.735272       0.174564         9.940612      1.68E-06
x2 (neighborhood)   49.12456       7.655776         6.416667      7.67E-05

The marginal propensity to consume (MPC) is defined in economics as the proportion of an additional dollar of income that a household (or individual) consumes. Assume that grocery spending is the main expenditure of households. Then according to the regression analysis above, the MPC ______.

a) of rural households is equal to that of suburban households, and this MPC decreases as households increase their annual income

b) of rural households is larger than that of suburban households, and both MPCs decrease as households increase their annual income

c) of rural households is equal to that of suburban households, and this MPC is constant as households increase their annual income

d) of rural households is larger than that of suburban households, and both MPCs are constant as households increase their annual income

e) of rural households is larger than that of suburban households, and both MPCs increase as households increase their annual income

Response: See section 14.2 Indicator (Dummy) Variables

Difficulty: Hard

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.2: Examine the role of indicator, or dummy, variables as predictors or independent variables in multiple regression analysis.

97. If there are 6 independent variables, then there are ______ possible regressions with 2 predictors and ______ possible regressions with 3 predictors.

a) 12; 18

b) 15; 18

c) 12; 20

d) 18; 20

e) 15; 20

Response: See section 14.3 Model-Building: Search Procedures

Difficulty: Medium

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.3: Use all possible regressions, stepwise regression, forward selection, and backward elimination search procedures to develop regression models that account for the most variation in the dependent variable and are parsimonious.
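The counts of possible regressions are binomial coefficients, C(6, k); a one-line check with the standard library:

```python
from math import comb

# Number of distinct regressions with k predictors chosen from 6 candidates
print(comb(6, 2), comb(6, 3))  # -> 15 20
```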

98. A researcher wants to address multicollinearity using the guideline that a variance inflation factor (VIF) greater than 9 will indicate a severe multicollinearity problem. This means that the highest coefficient of determination allowed will be ______.

a) 0.900

b) 0.912

c) 0.928

d) 0.889

e) 0.939

Response: See section 14.4 Multicollinearity

Difficulty: Medium

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.

99. If a coefficient of determination for a given model is 0.85, then its variance inflation factor is ______.

a) 8.5

b) 7.7

c) 6.2

d) 4.9

e) 6.7

Response: See section 14.4 Multicollinearity

Difficulty: Medium

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.

100. A regression analysis conducted to predict one independent variable from the other independent variables ______.

a) is never done; predicted variables are dependent by definition

b) is done to control multicollinearity

c) is done to avoid multicollinearity

d) is done to detect multicollinearity

e) is done implicitly when the correlation matrix is computed

Response: See section 14.4 Multicollinearity

Difficulty: Medium

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.

101. Which of the following problems is not caused by multicollinearity?

a) It is difficult or even impossible to interpret the estimates of the regression coefficients.

b) Inordinately small t values for the regression coefficients may result.

c) The standard deviations of the regression coefficients are underestimated.

d) The algebraic sign of estimated regression coefficients may be the opposite of what would be expected for a predictor variable.

e) The variance of a given regression coefficient is larger than it would otherwise have been if the variable had been completely uncorrelated with all the other variables in the model.

Response: See section 14.4 Multicollinearity

Difficulty: Medium

AACSB: Reflective thinking

Bloom’s level: Application

Learning Objective: 14.4: Recognize when multicollinearity is present, understanding general techniques for preventing and controlling it.

102. A research project was conducted to study the effect of a chemical on undesired insects. The researcher uses 6 dose levels, and at each level exposes 250 insects to the chemical and proceeds to count the number of insects that die. The researcher uses a binary logistic regression model to estimate the probability of death as a function of dose.

Shown below is Minitab output from a logistic regression.

Coefficients

Term       Coef     SE Coef   95% CI              Z-Value   P-Value   VIF
Constant   -2.644   0.156     (-2.950, -2.338)    -16.94    0.000
Dose        0.6740  0.0391    ( 0.5973,  0.7506)   17.23    0.000     1.00

Odds Ratios for Continuous Predictors

       Odds Ratio   95% CI
Dose   1.9621       (1.8173, 2.1184)

The predicted probability that an insect will die when exposed to the second dose is ____.

a) 0.1246

b) 0.1446

c) 0.1857

d) 0.2148

e) 0.2945

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Hard

Learning Objective: 14.5: Understand when to use logistic regression and be able to interpret its results.
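The predicted probability comes from plugging the fitted coefficients into the logistic function, p = 1/(1 + e^(−(b0 + b1·dose))); a minimal sketch:

```python
import math

# Fitted binary logistic model for the probability of death at a given dose
b0, b1 = -2.644, 0.6740

def p_death(dose):
    return 1 / (1 + math.exp(-(b0 + b1 * dose)))

print(round(p_death(2), 4))  # second dose level
print(round(p_death(1), 4))  # first dose level
```

This gives about 0.2148 at the second dose and 0.1224 at the first.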

103. A research project was conducted to study the effect of a chemical on undesired insects. The researcher uses 6 dose levels, and at each level exposes 250 insects to the chemical and proceeds to count the number of insects that die. The researcher uses a binary logistic regression model to estimate the probability of death as a function of dose.

Shown below is Minitab output from a logistic regression.

Coefficients

Term       Coef     SE Coef   95% CI              Z-Value   P-Value   VIF
Constant   -2.644   0.156     (-2.950, -2.338)    -16.94    0.000
Dose        0.6740  0.0391    ( 0.5973,  0.7506)   17.23    0.000     1.00

Odds Ratios for Continuous Predictors

       Odds Ratio   95% CI
Dose   1.9621       (1.8173, 2.1184)

The predicted probability that an insect will die when exposed to the first dose is ____.

a) 0.0814

b) 0.1046

c) 0.1153

d) 0.1224

e) 0.1395

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Hard

Learning Objective: 14.5: Understand when to use logistic regression and be able to interpret its results.

104. A research project was conducted to study the effect of a chemical on undesired insects. The researcher uses 6 dose levels, and at each level exposes 250 insects to the chemical and proceeds to count the number of insects that die. The researcher uses a binary logistic regression model to estimate the probability of death as a function of dose.

Shown below is Minitab output from a logistic regression.

Coefficients

Term       Coef     SE Coef   95% CI              Z-Value   P-Value   VIF
Constant   -2.644   0.156     (-2.950, -2.338)    -16.94    0.000
Dose        0.6740  0.0391    ( 0.5973,  0.7506)   17.23    0.000     1.00

Odds Ratios for Continuous Predictors

       Odds Ratio   95% CI
Dose   1.9621       (1.8173, 2.1184)

The estimated odds of death at a given dose level divided by the estimated odds of death at the next level equals ______.

a) 1.9621

b) 0.6470

c) 0.5097

d) 0.3271

e) 0.3008

Response: See section 14.5, The Logistic Regression Model.

Difficulty: Hard

Learning Objective: 14.5: Understand when to use logistic regression and be able to interpret its results.
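In a logistic model the odds multiply by e^(b1) for each one-unit increase in dose, so the odds at a given level divided by the odds one level higher equal e^(−b1); a minimal check:

```python
import math

b1 = 0.6740
print(round(math.exp(b1), 4))   # odds ratio per unit increase in dose
print(round(math.exp(-b1), 4))  # odds at a level divided by odds one level up
```

The first value reproduces Minitab's odds ratio of 1.9621; the second, about 0.5097, matches option c.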
