Math 21b final exam guide
Regular sections
a) The exam
• The final exam is on Friday, May 21 from
• Here is an approximate facsimile of the first page of the exam booklet:

Spring 2004
Name: _____________________________________________
Instructions:
· This exam booklet is only for students in the Bio/statistics section.
· Print your name in the line above and circle the time of your section.
· Answer each of the questions below in the space provided. If more space is needed, use the back of the facing page or the extra blank pages at the end of this booklet. Please direct the grader to any extra pages used.
· Please give justification for answers unless you are told otherwise.
· Please write neatly. Answers deemed illegible by the grader will not receive credit.
· No calculators, computers or other electronic aids are allowed; nor may you refer to any written notes or source material; nor may you communicate with other students. Use only your brain and a pencil.
· Each of the problems counts for the same total number of points, so budget your time accordingly.
· Do not detach pages from this exam booklet.
In agreeing to take this exam, you are implicitly agreeing to act with fairness and honesty.
• There will probably be a mix of true/false problems, multiple choice problems, and problems of the sort that you worked on the homework assignments.
• The exam will cover the material in Chapters 1-3, 5, 6.1, 6.2, 7, 8.1, 9.1 and 9.2 in the textbook, Linear Algebra with Applications. The exam will also cover the material in Otto Bretscher’s handout on non-linear systems and the material on probability and statistics from the Bio/statistics handouts and from the Schaum outline book.
• Advice for studying: There are plenty of answered problems in the textbook, and I strongly suggest that you work as many of these as you think necessary. I have supplied, in a separate handout, some problems (with answers) to test your facility with the material on probability and statistics.
• Old exams: Old exams will not be terribly useful, for two reasons. First, the exams from previous semesters will either test material that we have not taught, or not test material that we have; in particular, this is the first year that there has been a Bio/statistics section for Math 21b. Second, the exam format used in previous years might not be the format we will use. In any event, some old exams are archived in Cabot Library.
• Of the topics covered, some are more important than others. Given below is a list of the linear algebra topics to guide your review towards the more central issues.
b) Central topics and skills in linear algebra.
• Be able to write the matrix that corresponds to a linear system of equations.
• Be able to find rref(A) given the matrix A.
• Be able to solve Ax = b by computing the rref of the augmented matrix, thus rref(A|b).
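If you want to check your row-reduction practice by machine, here is a quick sketch using sympy’s exact rref (the 2×2 system is a made-up example; no computer is allowed on the exam itself):

```python
# Solve Ax = b by row-reducing the augmented matrix (A|b).
from sympy import Matrix

A = Matrix([[1, 2], [3, 5]])
b = Matrix([5, 13])
aug, pivots = Matrix.hstack(A, b).rref()
# When A is invertible, the last column of rref(A|b) is the solution x.
x = aug[:, -1]
assert A * x == b
```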
• Be able to find the inverse of a square matrix A by computing rref(A|I), where I is the identity matrix.
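The same rref trick can be spot-checked by machine; a sketch with a made-up 2×2 matrix:

```python
# Invert A by row-reducing (A|I): the right block of the result is A^-1.
from sympy import Matrix, eye

A = Matrix([[2, 1], [5, 3]])
n = A.shape[0]
aug, _ = Matrix.hstack(A, eye(n)).rref()
A_inv = aug[:, n:]          # right block of rref(A|I)
assert A_inv * A == eye(n)
```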
• Be able to use rref(A) to determine whether A is invertible, or, if not, what its kernel is and what its image dimension is.
• Become comfortable with the notions that underlie the formal definitions of the following terms: linear transformation, linear subspace, the span of a set of vectors, linear dependence and linear independence, invertibility, orthogonality, kernel, image.
• Given a set, {v1, . . . , vk}, of vectors in R^n, be able to use the rref of the n-row/k-column matrix whose j’th column is vj to determine whether this set is linearly independent.
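For checking such a computation, the test is whether the rank (the number of pivots in the rref) equals k; a numpy sketch with made-up vectors:

```python
# Stack the vectors as columns; full column rank <=> linear independence.
import numpy as np

v1, v2, v3 = [1, 0, 2], [0, 1, 1], [1, 1, 3]   # note v3 = v1 + v2
M = np.column_stack([v1, v2, v3])
independent = np.linalg.matrix_rank(M) == M.shape[1]
assert not independent                          # the set is dependent
```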
• Be able to find a basis for the kernel of a linear transformation.
• Be able to find a basis for the image of a linear transformation.
• Know how to multiply matrices, and also matrices against vectors. Know how these operations relate, respectively, to the composition of two linear transformations and to the action of a linear transformation on a vector.
• Know how the kernel and image of the product, AB, of matrices A and B are related to those of A and B.
• Understand how rref(AB) relates to rref(A) and rref(B).
• Be able to find the coordinates of a vector with respect to any given basis of R^n.
• Be able to find the matrix of a linear transformation of R^n with respect to any given basis.
• Understand the relations between the triangle inequality (|x + y| ≤ |x| + |y|), the Cauchy-Schwarz inequality (|x · y| ≤ |x| |y|), and the Pythagorean equality (|x + y|^2 = |x|^2 + |y|^2 when x · y = 0).
• Be able to provide an orthonormal basis for a given linear subspace of R^n. Thus, understand how to use the Gram-Schmidt procedure.
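A minimal numpy sketch of the Gram-Schmidt procedure, assuming the input vectors are linearly independent (the vectors are a made-up example):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors in R^n."""
    basis = []
    for v in vectors:
        # Subtract the projections onto the basis vectors found so far...
        w = v - sum(np.dot(v, u) * u for u in basis)
        # ...then normalize what remains.
        basis.append(w / np.linalg.norm(w))
    return basis

u1, u2 = gram_schmidt([np.array([3.0, 4.0]), np.array([1.0, 0.0])])
assert abs(np.dot(u1, u2)) < 1e-12          # orthogonal
assert abs(np.linalg.norm(u1) - 1) < 1e-12  # unit length
```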
• Be able to give a matrix for the orthogonal projection of R^n onto any given linear subspace.
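One standard recipe: if the columns of A form a basis for the subspace, then P = A(A^T A)^{-1}A^T is the projection matrix. A quick numpy check on a made-up plane in R^3:

```python
import numpy as np

# Columns of A span the subspace; P projects orthogonally onto it.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T
assert np.allclose(P @ P, P)    # projections are idempotent
assert np.allclose(P.T, P)      # and symmetric
assert np.allclose(P @ A, A)    # vectors already in the subspace are fixed
```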
• Be able to work with the orthogonal complement of any given linear subspace of R^n.
• Recognize that rotations are orthogonal transformations.
• Be able to recognize an orthogonal transformation: it preserves lengths. Such is the case if and only if its matrix, A, has the property that |Ax| = |x| for all vectors x. Equivalent conditions: A^-1 = A^T; the columns of A form an orthonormal basis; the rows of A form an orthonormal basis.
• Remember that the transpose of an orthogonal matrix is orthogonal, as is the product of any two orthogonal matrices.
• Be able to recognize symmetric and skew-symmetric matrices.
• Recognize that the dot product is a matrix product, x · y = x^T y, where x^T and y on the right-hand side of the equality are respectively viewed as a 1 × n and an n × 1 matrix.
• Recognize that kernel(A) = kernel(A^T A).
• Be able to find the least squares solution of Ax = b; it is x* = (A^T A)^-1 A^T b.
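A numpy sketch checking this normal-equations formula against the library’s least squares solver (the data are made up):

```python
import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([0.0, 1.0, 3.0])
x_star = np.linalg.inv(A.T @ A) @ A.T @ b        # x* = (A^T A)^-1 A^T b
x_np, *_ = np.linalg.lstsq(A, b, rcond=None)     # library solution agrees
assert np.allclose(x_star, x_np)
```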
• Be able to use least squares for data fitting: know how to find the best degree n polynomial that fits a collection {(xk, yk)}, 1 ≤ k ≤ N, of data points.
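For polynomial fitting, the matrix in the least squares problem is a Vandermonde matrix; a sketch with made-up data points that happen to lie exactly on a parabola:

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([1.0, 2.0, 5.0, 10.0])       # exactly y = 1 + x^2
V = np.vander(xs, 3, increasing=True)      # columns: 1, x, x^2
coeffs, *_ = np.linalg.lstsq(V, ys, rcond=None)
assert np.allclose(coeffs, [1.0, 0.0, 1.0])
```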
• Know how to compute the angle between two vectors from their lengths and dot product: cos(θ) = x · y/(|x| |y|).
• Know how to compute the determinant of a square matrix.
• Know the properties of the determinant: det(AB) = det(A)det(B), det(A^T) = det(A), det(A^-1) = 1/det(A), det(SAS^-1) = det(A).
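These identities are easy to spot-check numerically on random matrices (a review aid only, not an exam technique):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))    # random matrices are invertible
B = rng.standard_normal((3, 3))    # with probability 1
det = np.linalg.det
assert np.isclose(det(A @ B), det(A) * det(B))
assert np.isclose(det(A.T), det(A))
assert np.isclose(det(np.linalg.inv(A)), 1 / det(A))
```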
• Know how the determinant is affected when rows are switched, or columns are switched, or when a multiple of one row is added to another, or a multiple of one column is added to another.
• Know that det(A) = 0 if and only if kernel(A) has dimension bigger than zero.
• Know that trace(A) = A11 + A22 + ··· + Ann.
• Know the characteristic polynomial, λ → p(λ) = det(λI − A), and understand its significance: if p(λ) = 0, there is a non-zero vector v such that Av = λv.
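A quick numerical check that computed eigenpairs really satisfy Av = λv (and that the product of the eigenvalues is the determinant), on a made-up symmetric matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)
for lam, v in zip(eigvals, eigvecs.T):     # eigenvectors are the columns
    assert np.allclose(A @ v, lam * v)     # A v = lambda v
assert np.isclose(np.prod(eigvals), np.linalg.det(A))
```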
• Realize that p(λ) factors completely if one allows complex roots.
• Be able to comfortably use complex numbers. Thus, multiply them, add them, and use the polar form a + ib = re^{iθ} = r cos θ + i r sin θ.
• Understand that the norm |a + ib| is (a^2 + b^2)^{1/2}, and be comfortable with the operation of complex conjugation that changes z = a + ib to z̄ = a − ib. In this regard, don’t forget that |z| = |z̄|.
• Be comfortable with the facts that |zw| = |z| |w| and that |z + w| ≤ |z| + |w|, and that these hold for any two complex numbers z and w.
• Understand that if λ is a root of p, then so is its complex conjugate.
• Know what an eigenvalue, an eigenvector and an eigenspace are.
• Understand the difference between the algebraic multiplicity of a root of the characteristic polynomial and its geometric multiplicity as an eigenvalue of A.
• Understand that the kernel of A − λI is the eigenspace for the eigenvalue λ.
• Understand that if A has an eigenvalue with non-zero imaginary part, then some of the entries of any corresponding non-zero eigenvector must have non-zero imaginary part as well.
• Recognize that a set of eigenvectors whose eigenvalues are distinct must be linearly independent.
• Be able to compute the powers of a diagonalizable matrix.
• Know the formula for the determinant of A as the product of its eigenvalues, and that for the trace of A as the sum of its eigenvalues.
• Know that a linear dynamical system has the form x(t+1) = Ax(t), where A is a square matrix. Know how to solve for x(t) in terms of x(0) in the case that A is diagonalizable.
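A sketch of the diagonalization recipe, x(t) = S D^t S^{-1} x(0), checked against direct matrix powers on a made-up 2×2 system:

```python
import numpy as np

A = np.array([[0.5, 0.25], [0.5, 0.75]])   # eigenvalues 1 and 0.25
x0 = np.array([1.0, 0.0])
lam, S = np.linalg.eig(A)                  # A = S diag(lam) S^-1
t = 20
x_t = S @ np.diag(lam**t) @ np.linalg.inv(S) @ x0
assert np.allclose(x_t, np.linalg.matrix_power(A, t) @ x0)
```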
• Recognize that the origin is a stable solution of x(t+1) = Ax(t) if and only if each of A’s eigenvalues has absolute value strictly less than 1.
• Be able to solve for the form of t → x(t) in terms of x(0) when dx/dt = Ax and A is diagonalizable.
• Know that the solution to dx/dt = Ax in which x(t) is zero for all time is stable if and only if the real part of each eigenvalue of A is strictly less than zero.
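Checking stability thus reduces to inspecting the real parts of the eigenvalues; a sketch with made-up stable and unstable matrices (triangular, so the eigenvalues sit on the diagonal):

```python
import numpy as np

A_stable = np.array([[-1.0, 2.0], [0.0, -3.0]])     # eigenvalues -1, -3
A_unstable = np.array([[0.5, 0.0], [1.0, -2.0]])    # eigenvalues 0.5, -2
assert np.all(np.linalg.eigvals(A_stable).real < 0)
assert not np.all(np.linalg.eigvals(A_unstable).real < 0)
```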
• Know the definition of an equilibrium point for a non-linear dynamical system and the criteria for its stability in terms of the matrix of partial derivatives.
• Know how to plot the null-clines and approximate trajectories for a non-linear dynamical system on R^2.
c) Central topics and skills in probability and statistics
• Know the basic definitions: sample space, probability function, conditional probabilities, random variables.
• Be able to compute the mean and standard deviation of a random variable.
• Be able to compute the probabilities for a random variable given the probability function on the sample space.
• Recognize independent and dependent events.
• Be able to recognize independence of random variables and to compute correlation matrices.
• Know how to use Bayes’ theorem.
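A worked Bayes’ theorem computation, with made-up screening-test numbers, just to exercise the formula P(D|+) = P(+|D)P(D)/P(+):

```python
p_d = 0.01                   # prior P(disease) -- made-up number
p_pos_d = 0.99               # sensitivity P(+|D)
p_pos_not_d = 0.05           # false-positive rate P(+|not D)

# Total probability of a positive test, then Bayes' theorem.
p_pos = p_pos_d * p_d + p_pos_not_d * (1 - p_d)
p_d_pos = p_pos_d * p_d / p_pos
assert 0.16 < p_d_pos < 0.17   # about 1/6, despite the 99% sensitivity
```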
• Be able to use conditional probabilities to write probabilities of random variables.
• Be familiar with the Bayesian approach to the inverse problem.
• Know the binomial probability function.
• Know the Poisson probability function.
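Both probability functions are easy to compute from their formulas; this sketch also illustrates how, for small p, the binomial is approximated by the Poisson with μ = np (the parameters are made-up):

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    """P(k successes in n independent trials with success probability p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, mu):
    """P(k events) for a Poisson random variable with mean mu."""
    return exp(-mu) * mu**k / factorial(k)

# The binomial probabilities sum to 1 ...
assert abs(sum(binomial_pmf(k, 10, 0.3) for k in range(11)) - 1) < 1e-12
# ... and for small p with n*p = mu fixed, binomial is close to Poisson.
assert abs(binomial_pmf(2, 1000, 0.002) - poisson_pmf(2, 2.0)) < 1e-3
```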
• Know the elementary counting formula involving ratios of factorials.
• Be familiar with the notion of a characteristic function and how its derivatives can be used to compute means and standard deviations.
• Know what P-values are and how they are used.
• Know the Chebyshev theorem and why it is the reason for the focus on means and standard deviations.
• Know how to find the most likely binomial probability function to predict the probability of a certain number of successes in many independent trials when one is told the average number of successes.
• Be comfortable with the uniform probability functions.
• Be comfortable with the exponential probability functions.
• Be comfortable with the Gaussian probability functions.
• Understand how and when to use the Central Limit Theorem.
• Be able to use the Central Limit Theorem to estimate P-values of averages.
• Be comfortable using the Central Limit Theorem to test theoretical proposals about the spread of data.
• Understand when to use the Central Limit Theorem to improve on the Chebyshev estimate.
• Know that e^{−r^2/(2s^2)} ≤ s^2/r^2 when s > 0 and r > √2 s.
• Know how to go from conditional probabilities to x(t+1) = Ax(t) and Markov matrices.
• Know the theorems about Markov matrices when the entries are positive:
a) 1 is an eigenvalue and all other eigenvalues have absolute value less than 1.
b) The eigenspace for the eigenvalue 1 is one dimensional and has a unique vector with purely positive entries that sum to 1.
c) The entries of any eigenvector for any other eigenvalue sum to zero.
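These theorems explain why iterating x(t+1) = Ax(t) converges to the positive eigenvector for the eigenvalue 1; a numpy sketch with a made-up 2×2 Markov matrix (columns sum to 1):

```python
import numpy as np

A = np.array([[0.9, 0.2], [0.1, 0.8]])   # columns sum to 1
x = np.array([1.0, 0.0])                 # entries of x(0) sum to 1
for _ in range(200):                     # iterate x(t+1) = A x(t)
    x = A @ x
assert np.allclose(x, [2/3, 1/3])        # positive entries summing to 1
assert np.allclose(A @ x, x)             # eigenvector for eigenvalue 1
```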
• Be able to compute the t → ∞ limit of x(t+1) = Ax(t) when A is a Markov matrix.
• Be able to use the least squares technique to fit data to lines and to polynomials.