Study guide of topics covered in ST 705 (Monahan textbook, chapters 0-6):

definition of a vector space;
linearly dependent vectors and linearly independent vectors;
spanning set of vectors;
basis of a vector space;
subspaces;
linear transformations;
null space and range space of a linear transformation;
dimension of a vector space;
rank of a linear transformation;
relationship between linear transformations and matrix multiplication;
dimension theorem;
matrix multiplication;
properties of matrices;
trace of a matrix;
determinant of a matrix;
eigenvalues of a matrix;
eigenvectors of a matrix;
characteristic polynomial of a matrix;
spectral theorem (for finite-dimensional vector spaces);
diagonalizability;
simultaneous diagonalizability;
inner product and inner product spaces;
vector and matrix norms and induced norms;
triangle inequality;
Cauchy-Schwarz inequality;
definition of orthogonality;

general linear model;
vector and matrix derivatives;
sum of squared errors in the general linear model;
least squares solution (see the numerical sketch after this list);
the normal equations;
results about null spaces and column spaces of design matrices;
least squares predictions and properties;
geometry of least squares solutions;
generalized inverses;
Moore-Penrose conditions;
Moore-Penrose pseudo-inverse;
singular value decomposition;
projection matrices and properties;
orthogonal projection matrices and properties;
orthogonal decompositions;
orthogonal subspaces;
projection onto the column space of a design matrix;
various expressions for the set of least squares solutions;
reparameterizations;
confounding variables;
orthonormal basis;
Gram-Schmidt orthonormalization process;

definition of an unbiased estimator;
definition of a linear estimator;
definition of a linearly estimable function;
properties of linearly estimable functions;
subspace of linearly estimable functions;
methods to determine whether a function is estimable;
linear estimates of reparameterized general linear models;
identifiability of parameters;
further matrix decompositions;
imposing conditions for a unique solution to the normal equations;
linear constraints, estimability of constraints, and properties;
constrained parameter spaces;
properties of constrained estimation;
estimability in a restricted model;
restricted normal equations;

Gauss-Markov model/assumptions;
properties of expectation and covariance;
Gauss-Markov theorem and the best linear unbiased estimator (BLUE);
variance estimation;
misspecification: under-fitting and biases;
misspecification: over-fitting and biases;
mean squared error;
Aitken model and theorem;
generalized least squares;

multivariate generalization of a univariate distribution;
multivariate normal distribution and equivalent definitions;
multivariate moment generating function;
joint normality and independence;
definition of a chi-squared random variable;
properties of the chi-squared distribution;
noncentral chi-squared distribution and properties;
construction of the F distribution from a ratio of independent chi-squares;
construction of the Student's t distribution;
distributions of quadratic forms of multivariate Gaussian random vectors;
Cochran's theorem;

sufficient and minimal sufficient statistics;
complete sufficient statistics;
minimum variance unbiased estimator (MVUE) in the Gaussian linear model;
maximum likelihood estimator (MLE) in the Gaussian linear model;
definition of a general linear hypothesis;
construction of F-tests for general linear hypotheses;
likelihood ratio test (LRT);
equivalence between the LRT and the F-test for general linear hypotheses;
confidence intervals and multiple comparisons (Bonferroni, Scheffe, Tukey);
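As a quick illustration of several of the least squares topics above, here is a minimal numpy sketch (an assumed example, not material from the textbook or course notes): it builds a hypothetical design matrix X and response y, computes a least squares solution via the Moore-Penrose pseudo-inverse, forms the orthogonal projection onto the column space C(X), and checks the symmetry, idempotence, and orthogonal-decomposition properties listed above.

```python
# Minimal sketch tying together: normal equations, Moore-Penrose pseudo-inverse,
# and the orthogonal projection onto the column space of a design matrix.
# X, y, n, p are hypothetical illustration values, not from the course.
import numpy as np

rng = np.random.default_rng(0)

# General linear model y = X b + e with an n x p design matrix X.
n, p = 8, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = rng.normal(size=n)

# Normal equations: X'X b = X'y.  The pseudo-inverse gives a least squares
# solution (the minimum-norm one when X'X is singular).
b_hat = np.linalg.pinv(X) @ y

# Orthogonal projection onto C(X): P = X (X'X)^- X'.
P = X @ np.linalg.pinv(X.T @ X) @ X.T

# Properties of an orthogonal projection matrix: symmetric and idempotent.
assert np.allclose(P, P.T)
assert np.allclose(P @ P, P)

# Orthogonal decomposition y = P y + (I - P) y: fitted values and residuals
# are orthogonal.
y_hat = P @ y
resid = y - y_hat
assert np.isclose(y_hat @ resid, 0.0)

print("least squares estimate:", b_hat)
```

When X has full column rank, the pseudo-inverse solution coincides with the usual estimate (X'X)^{-1} X'y; otherwise np.linalg.pinv returns the minimum-norm member of the set of least squares solutions.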