

Two-way analysis of variance (two-way ANOVA) is an extension of one-way ANOVA. It can be used to compare means across the levels of two independent variables, or **factors**, from two or more populations. It can also be used to test for an interaction between the two factors.

We will not be doing the sum of squares calculations by hand. These numbers will be given to you in a partially completed ANOVA table, or an Excel output will be provided in the problem.

There are three sets of hypotheses for testing the equality of \(k\) population means from two independent variables, and to test for interaction between the two variables (two-way ANOVA):

Row Effect (Factor A):

\(H_{0}:\) The row variable has no effect on the average ___________________.

\(H_{1}:\) The row variable has an effect on the average ___________________.

Column Effect (Factor B):

\(H_{0}:\) The column variable has no effect on the average ___________________.

\(H_{1}:\) The column variable has an effect on the average ___________________.

Interaction Effect (A×B):

\(H_{0}:\) There is no interaction effect between the row variable and the column variable on the average ___________________.

\(H_{1}:\) There is an interaction effect between the row variable and the column variable on the average ___________________.

These ANOVA tests are always right-tailed F-tests.
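Because each test is right-tailed, the p-value for each F statistic is the area in the upper tail of the F distribution with the matching degrees of freedom. A minimal sketch in Python (the F value and degrees of freedom here are made-up illustrative numbers, not from any problem in this section):

```python
from scipy.stats import f

# Hypothetical test statistic and degrees of freedom, for illustration only.
F_stat = 4.5
df_num, df_den = 3, 32

# Right-tail area: P(F > F_stat) for an F(df_num, df_den) distribution.
p_value = f.sf(F_stat, df_num, df_den)
print(p_value)  # reject H0 when p_value <= alpha
```

The survival function `f.sf` gives the upper-tail area directly, which is equivalent to `1 - f.cdf(...)` but more numerically stable for small p-values.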

The F-test (for two-way ANOVA) is a statistical test for testing the equality of \(k\) independent quantitative population means from two nominal variables, called factors. The two-way ANOVA also tests for an interaction between the two factors.

Assumptions:

- The populations are normal.
- The observations are independent.
- The variances from each population are equal.
- The groups must have equal sample sizes.

The formulas for the F-test statistics are:

Factor A: \(F_{A} = \frac{MS_{A}}{MSE}\) with \(df_{A} = a - 1\) and \(df_{E} = ab(n - 1)\)

Factor B: \(F_{B} = \frac{MS_{B}}{MSE}\) with \(df_{B} = b - 1\) and \(df_{E} = ab(n - 1)\)

Interaction: \(F_{A \times B} = \frac{MS_{A \times B}}{MSE}\) with \(df_{A \times B} = (a - 1)(b - 1)\) and \(df_{E} = ab(n - 1)\)

Where:

\(SS_{\text{A}}\) = sum of squares for factor A, the row variable

\(SS_{\text{B}}\) = sum of squares for factor B, the column variable

\(SS_{\text{A} \times \text{B}}\) = sum of squares for interaction between factor A and B

\(SSE\) = sum of squares of error, also called sum of squares within groups

\(a\) = number of levels of factor A

\(b\) = number of levels of factor B

\(n\) = number of subjects in each group

Each mean square is the corresponding sum of squares divided by its degrees of freedom; for example, \(MS_{A} = SS_{A}/df_{A}\) and \(MSE = SSE/df_{E}\).

It will be helpful to make a table. Figure 11-5 is called a two-way ANOVA table.

Since the computations for the two-way ANOVA are tedious, this text will not cover performing the calculations by hand. Instead, we will concentrate on completing and interpreting the two-way ANOVA tables.
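To see how the pieces of such a table fit together, here is a minimal Python sketch (the function and variable names are my own, and the sums of squares in the usage example are made-up numbers) that fills in the degrees of freedom, mean squares, and F statistics from given sums of squares:

```python
def fill_two_way_anova(ss_a, ss_b, ss_ab, sse, a, b, n):
    """Complete a two-way ANOVA table from the sums of squares.

    a, b = number of levels of factors A and B; n = subjects per group.
    Returns a dict mapping each source to (df, MS, F); Error has no F.
    """
    df_a, df_b = a - 1, b - 1
    df_ab = (a - 1) * (b - 1)
    df_e = a * b * (n - 1)
    mse = sse / df_e  # mean square error = SSE / df_E
    return {
        "Factor A": (df_a, ss_a / df_a, (ss_a / df_a) / mse),
        "Factor B": (df_b, ss_b / df_b, (ss_b / df_b) / mse),
        "Interaction": (df_ab, ss_ab / df_ab, (ss_ab / df_ab) / mse),
        "Error": (df_e, mse, None),
    }

# Illustrative sums of squares with a = 2, b = 4, n = 5 (40 observations).
table = fill_two_way_anova(ss_a=8.0, ss_b=30.0, ss_ab=12.0, sse=64.0,
                           a=2, b=4, n=5)
for source, (df, ms, F) in table.items():
    print(source, df, ms, F)
```

With these made-up inputs, the error degrees of freedom are \(2 \cdot 4 \cdot (5-1) = 32\), so \(MSE = 64/32 = 2\), and each F statistic is the factor's mean square divided by 2.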

A farmer wants to see if there is a difference in the average height for two new strains of hemp plants. They believe there may also be some interaction with different soil types, so they plant 5 hemp plants of each strain in 4 types of soil: sandy, clay, loam, and silt. At \(\alpha = 0.01\), analyze the data shown, using a two-way ANOVA as started below in Figure 11-6. See below for raw data.

Rough drawings from memory were futile. He didn't even know how long it had been, beyond Ford Prefect's rough guess at the time that it was "a couple of million years" and he simply didn't have the maths. Still, in the end he worked out a method which would at least produce a result. He decided not to mind the fact that with the extraordinary jumble of rules of thumb, wild approximations and arcane guesswork he was using he would be lucky to hit the right galaxy, he just went ahead and got a result. He would call it the right result. Who would know?

As it happened, through the myriad and unfathomable chances of fate, he got it exactly right, though he of course would never know that. He just went up to London and knocked on the appropriate door. "Oh. I thought you were going to phone me first."

(Adams, 2002)