To be distributed in class on Thursday 1/7.
Edwards and Penney, section 1.6) 31, 32, 52.
Chapter 1 review problems 1,2,3,4. (Try to do these without looking
at the answers in the back of the book.)
Section 2.1) 10, 19.
Section 2.2) 16.
Note: you might want to read sections 1.1 through 2.2 in the book,
since this roughly corresponds to what we have done in class so far. You can
read further in chapter 2 if you are interested but we will not do
much of this material in class. Next week we will start on chapter 3.
This assignment is a little longer and more theoretical than the first
two. Good luck! By the way, the answers to the exercises in the back
of Edwards and Penney aren't always right, so watch out.
1) Let L:V-->W and M:W-->X be linear operators. Prove that the composition
ML:V-->X
defined by
(ML)(v)=M(L(v))
is a linear operator. (Use the definition of linear operator I gave
in lecture.)
2) If L:V-->W is a linear operator, prove that Ker(L) is a subspace of
V, and Im(L) is a subspace of W. (Use the definition of subspace I
gave in lecture.)
3) Edwards and Penney section 3.2: 14,22,27,31,32.
4) Edwards and Penney section 3.3: 21,22,23,24,28.
5) Use Euler's formula e^{it}=cos(t)+i sin(t) to derive the angle
addition formulas expressing cos(a+b) and sin(a+b) in terms of cos(a),
sin(a), cos(b), and sin(b).
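Not part of the assignment, but once you have derived the identities you can sanity-check them numerically (a and b below are arbitrary test angles; the check is of course no substitute for the derivation):

```python
import cmath
import math

# Arbitrary test angles; any values would do.
a, b = 0.7, 1.9

# Euler's formula gives e^{ia} e^{ib} = e^{i(a+b)}.
lhs = cmath.exp(1j * a) * cmath.exp(1j * b)
rhs = cmath.exp(1j * (a + b))
assert abs(lhs - rhs) < 1e-12

# Matching real and imaginary parts recovers the addition formulas.
assert abs(lhs.real - (math.cos(a) * math.cos(b) - math.sin(a) * math.sin(b))) < 1e-12
assert abs(lhs.imag - (math.sin(a) * math.cos(b) + math.cos(a) * math.sin(b))) < 1e-12
```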
6) Compute (1+i)^{20} without using a calculator. Hint: write 1+i in the form re^{it}.
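If you want to cross-check your hand computation afterwards (not a substitute for it), Python handles complex arithmetic directly; complex literals are written with j:

```python
# Per the hint, 1+i = sqrt(2) e^{i pi/4}; raising that to the 20th power
# is easy by hand. The direct computation below is just a cross-check.
z = (1 + 1j) ** 20
print(z)
```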
7) Prove that d/dx(e^{rx})=re^{rx}. (Use the power series definition.)
8) How long did this assignment take? What was most difficult?
1)
Recall from the last homework that d/dx(e^{rx})=re^{rx}. Use this to
show that:
a) If x and y are real numbers, then e^{x+y}=e^x e^y. Hint: think of
x as a constant, and show that both sides of the equation, regarded as
functions of y, satisfy the differential equation df/dy=f. Then apply
the uniqueness theorem for solutions to differential equations (E&P,
page 143). (What else do you need to check to apply this theorem?)
b) If t is a real number, then e^{it}=cos(t)+i sin(t). Hint: show
that both sides of the equation, regarded as complex functions of t,
satisfy the differential equation df/dt=if. Then apply the uniqueness
theorem. (It works equally well for complex as for real functions.)
2) Are the functions 1+2x+3x^2, 2+3x+5x^2, and 11+14x+25x^2 linearly
independent? Justify your answer.
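A mechanical way to explore such questions (a sanity check, not the justification the problem asks for): identify each polynomial a + bx + cx^2 with its coefficient vector (a, b, c) and compute the rank of the matrix whose rows are those vectors; the polynomials are independent exactly when the rank equals their number. The polynomials below are a different, made-up set, used only to illustrate the method:

```python
import numpy as np

# Rows are coefficient vectors (constant, x, x^2) of 1+x, 1-x, and x^2.
M = np.array([[1,  1, 0],
              [1, -1, 0],
              [0,  0, 1]])

# Full rank (3 rows, rank 3) means these three polynomials are
# linearly independent.
print(np.linalg.matrix_rank(M))  # prints 3
```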
3) Prove that if r_1, r_2, ... , r_n are distinct real numbers, then
the functions e^{r_1 x}, e^{r_2 x}, ... , e^{r_n x} are linearly
independent. Hint: suppose they are not linearly independent,
i.e. suppose there are constants c_1, ... , c_n, not all equal to
zero, with
c_1 e^{r_1 x} + ... + c_n e^{r_n x} = 0
for all x. We will try to get a contradiction. By relabeling, you
can assume that r_1 < r_2 < ... < r_n. You can also assume all the
c_i's are nonzero. (Just throw away the terms where some c_i is
zero.) Now choose x sufficiently large that the term c_n e^{r_n x} is
bigger in absolute value than the sum of all the other terms in the
equation, so their sum cannot equal zero for this particular x, giving
a contradiction. For x > 0, it suffices to choose x large enough that
|c_n| e^{r_n x} > (|c_1|+...+|c_{n-1}|) e^{r_{n-1} x},
since e^{r_i x} <= e^{r_{n-1} x} for every i < n. (Why does such an x
exist, and how could you find one explicitly?)
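The key point of the hint, that the term with the largest exponent eventually dominates all the others, can be seen numerically. The coefficients and exponents below are made up purely for illustration:

```python
import math

# Made-up coefficients and distinct exponents with r_1 < r_2 < r_3.
# Note the last coefficient is tiny, yet its term still wins eventually.
c = [5.0, -7.0, 0.01]
r = [1.0, 2.0, 3.0]

for x in [1, 10, 20]:
    last = abs(c[-1]) * math.exp(r[-1] * x)
    rest = sum(abs(ci) * math.exp(ri * x) for ci, ri in zip(c[:-1], r[:-1]))
    print(x, last > rest)  # False at first, then True for large x
```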
4) E&P section 3.4: 14, 27
5) E&P section 3.5: 4, 34, 36, 48.
1) By the fundamental theorem of algebra, we can factor
det(A-tI) = (t_1-t)(t_2-t)...(t_n-t)
where t_1,...,t_n are the eigenvalues of A (some of which may be
repeated). Use this to prove that the determinant of a matrix equals
the product of the eigenvalues.
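A numerical illustration of the statement (evidence, not a proof), using numpy on an arbitrarily chosen matrix:

```python
import numpy as np

# An arbitrary example matrix; any square matrix would do.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

eigenvalues = np.linalg.eigvals(A)
# det(A) should agree with the product of the eigenvalues
# up to floating-point round-off.
print(np.linalg.det(A), np.prod(eigenvalues))
```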
2) Assume the following facts about the determinant:
(i) It is linear in each row separately (multilinearity).
(ii) The determinant of the identity matrix is 1.
(iii) Interchanging two rows changes the sign of the determinant.
Prove the following:
(a) A matrix with two equal rows has determinant zero. Hint: cleverly
use the last of the above facts.
(b) Subtracting a multiple of one row from another doesn't change the
determinant. Hint: use multilinearity and (a).
(c) The determinant of an upper triangular matrix is equal to the
product of the diagonal entries. Hint: use (b) repeatedly to reduce
to the easier case of a diagonal matrix. (A matrix (A_ij) is called
"upper triangular" if A_ij=0 whenever i>j.)
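Facts (b) and (c) can be spot-checked numerically with numpy before you prove them; the matrix below is an arbitrary example:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [5.0, 6.0, 0.0]])  # arbitrary example matrix

# (b): subtracting a multiple of one row from another leaves det unchanged.
B = A.copy()
B[2] -= 5 * B[0]
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))  # prints True

# (c): for an upper triangular matrix, det = product of diagonal entries.
U = np.triu(A)
print(np.isclose(np.linalg.det(U), U.diagonal().prod()))  # prints True
```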
3) E&P section 5.2, problems 6,10,12,26,36.
4) (a) Show that the sum of the eigenvalues of a 2 by 2 matrix is
equal to the sum of the diagonal entries of the matrix.
(b) Extra credit: Does this generalize to n by n matrices? Why or why
not? (The sum of the diagonal entries of a matrix is called the
"trace".)
(These facts can be proved using the permutation formula for the determinant.
They can also be taken as the "definition" of determinant: one can
abstractly prove that there exists a unique notion of determinant
satisfying these properties.)
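For the extra-credit question in problem 4, a numerical experiment can suggest the answer (this is evidence only, not a proof); the matrix below is random with a fixed seed for reproducibility:

```python
import numpy as np

rng = np.random.default_rng(0)      # seeded so the run is reproducible
A = rng.standard_normal((5, 5))     # a random 5 x 5 matrix

# Compare the trace with the sum of the eigenvalues (which may come in
# complex-conjugate pairs; their sum is real up to round-off).
print(np.isclose(np.trace(A), np.linalg.eigvals(A).sum().real))  # prints True
```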
E&P section 7.1 problem 1.
Section 7.2 problems 12, 22, 33. (For problem 33, use whatever method you
feel is most appropriate.)
Section 7.3 problems 16, 30. (For more partial fractions techniques,
see page 418.)
Section 7.4 problems 7, 20, 34.
Section 7.5 problem 7.
Section 7.6 problem 2. Also, draw a graph of the solution.
Last updated: Mar. 8, 1999.