8/10/2019 Linear Algebra Final Review
Chapter 1
1. Every matrix is row equivalent to a unique matrix in echelon form. False, because a matrix can be row reduced in many different ways; an echelon form is not unique. Only the reduced row echelon form (RREF) is unique.
2. Any system of n linear equations in n variables has at most n solutions. False, because the coefficient matrix can reduce to a form with a free variable (for example, a row of zeros), in which case the system has infinitely many solutions.
3. If a system of linear equations has two different solutions, it must have infinitely many solutions. True, because two distinct solutions force the system to have a free variable, and a consistent system with a free variable has infinitely many solutions.
4. If a system of linear equations has no free variables, then it has a unique solution. False; having no free variables does not guarantee a solution exists at all. The system may be inconsistent, e.g., its echelon form may contain a row of the form [0 ... 0 | b] with b nonzero.
5. If an augmented matrix [A b] is transformed into [C d] by elementary row operations, then the equations Ax=b and Cx=d have exactly the same solution set. True, because elementary row operations preserve row equivalence, and row equivalent augmented matrices represent systems with the same solution set; the solution that may be difficult to see from Ax=b is the same one found by fully row reducing [A b] into [C d].
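Item 5 can be checked numerically. The sketch below (using numpy; the matrices are made-up examples, not taken from these notes) applies one row replacement operation to [A | b] and verifies that the solution is unchanged.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 6.0])

# One elementary row operation on [A | b]: R2 <- R2 - 3*R1
C = np.array([[1.0, 2.0],
              [0.0, -2.0]])
d = np.array([5.0, -9.0])

# Row equivalent systems have the same solution set
x1 = np.linalg.solve(A, b)
x2 = np.linalg.solve(C, d)
print(np.allclose(x1, x2))  # True
```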
6. If a system Ax=b has more than one solution, then so does the system Ax=0. True, because if the inhomogeneous equation Ax=b has more than one solution, the consistent system has a free variable, and then the homogeneous equation Ax=0 has infinitely many solutions.
7. If A is an mxn matrix and the equation Ax=b is consistent for some b, then the columns of A span R^m. False; consistency for one particular b only shows that this b lies in the span of the columns of A. The equation must be consistent for every b in R^m for the columns of A to span R^m.
8. If an augmented matrix [A b] can be transformed by elementary row operations into reduced echelon form, then the equation Ax=b is consistent. False; any matrix can be row reduced into reduced echelon form, but not every system is consistent. The reduced form may contain a row that is all zeros on the coefficient side with a nonzero entry on the augmented side, making the system inconsistent.
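The inconsistency test in item 8 can be made concrete with the rank criterion: Ax=b is consistent exactly when rank(A) = rank([A | b]). A small numerical sketch (numpy; made-up matrices, not from these notes):

```python
import numpy as np

# x + y = 1 and x + y = 2 can never hold at once,
# so the RREF of [A | b] contains the row [0 0 | 1].
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0])

# Consistent iff rank(A) == rank([A | b])
rank_A = np.linalg.matrix_rank(A)
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
consistent = rank_A == rank_Ab
print(consistent)  # False
```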
9. If matrices A and B are row equivalent, they have the same reduced row echelon form. True; because A and B are row equivalent, they row reduce to the same final state. The only difference between A and B is their initial form.
10. The equation Ax=0 has the trivial solution if and only if there are no free variables. False; every homogeneous equation has the trivial solution x=0, regardless of whether there are free variables.
11. If A is an mxn matrix and the equation Ax=b is consistent for EVERY b in R^m, then A has m pivot columns. True; if Ax=b is consistent for every b in R^m, then A has a pivot position in every row, which gives m pivots and hence m pivot columns.
12. If an mxn matrix A has a pivot position in every row, then the equation Ax=b has a unique solution for each b in R^m. False; A must have a pivot position in every column for the solution to be unique. A pivot in every row only guarantees that a solution exists for each b in R^m.
13. If an nxn square matrix A has n pivot positions, then the reduced echelon form of A is the nxn identity matrix. True; an nxn matrix with n pivot positions has a pivot in every row and every column, so its RREF is the identity matrix I_n.
14. If 3x3 matrices A and B each have three pivot positions, then A can be transformed into B by elementary row operations. True; a 3x3 matrix with 3 pivot positions has RREF equal to the identity matrix I_3, so both A and B are row equivalent to I_3 and hence to each other.
15. If A is an mxn matrix, if the equation Ax=b has at least two different solutions, and if the equation Ax=c is consistent, then the equation Ax=c has many solutions. True; if Ax=b has at least two different solutions, the system has a free variable, so Ax=b has infinitely many solutions. Since Ax=c is consistent and shares the same coefficient matrix (hence the same free variable), it too has infinitely many solutions.
16. If A and B are row equivalent mxn matrices and if the columns of A span R^m, then so do the columns of B. True; row equivalent matrices have the same reduced echelon form, and the columns of an mxn matrix span R^m exactly when the matrix has a pivot position in every row. Since A has a pivot in every row, so does B.
17. If none of the vectors in the set S = {v1, v2, v3} in R^3 is a multiple of one of the other vectors, then S is linearly independent. False; being pairwise non-multiples only rules out dependence between two vectors at a time, but one vector may still be a linear combination of the other two. For example, v3 = v1 + v2 is not a multiple of v1 or of v2, yet {v1, v2, v3} is linearly dependent.
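The counterexample for item 17 can be verified with a rank computation (numpy; the vectors are a made-up illustration, not from these notes):

```python
import numpy as np

# Three vectors in R^3, no one of which is a multiple of another
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2  # not a multiple of v1 or v2, yet a combination of both

# Rank < 3 means the set is linearly dependent
S = np.column_stack([v1, v2, v3])
rank = np.linalg.matrix_rank(S)
print(rank)  # 2
```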
18. If {u, v, w} is linearly independent, then u, v, and w are not in R^2. True; if u, v, and w were in R^2, the set would be linearly dependent, since any set of more than 2 vectors in R^2 (more vectors than entries) is linearly dependent.
19. In some cases, it is possible for four vectors to span R^5. False; at least 5 vectors are required, since spanning R^5 takes a pivot position in each of the 5 rows, and 4 columns can produce at most 4 pivots.
20. If u and v are in R^m, then u is in Span{u, v}. True, because u is the linear combination 1*u + 0*v.
21. If u, v, and w are nonzero vectors in R^2, then w is a linear combination of u and v. False; if u and v are multiples of each other, then Span{u, v} is a line, and w need not lie on that line.
22. If w is a linear combination of u and v in R^n, then u is a linear combination of v and w. False; the weight on u in the combination may be zero. For example, if w = 0*u + 1*v = v, with u not a multiple of v, then u is not a linear combination of v and w.
23. Suppose v1, v2, v3 are in R^5, v2 is not a multiple of v1, and v3 is not a linear combination of v1 or v2. Then {v1, v2, v3} is linearly independent. False; v1 could be the zero vector, which satisfies all the hypotheses yet makes the set linearly dependent.
24. A linear transformation is a function. True by definition of a linear transformation; it assigns to each vector in its domain exactly one vector in its codomain (for every input there is exactly one output, which is the definition of a function).
25. If A is a 6x5 matrix, the linear transformation x |-> Ax cannot map R5 onto R6. True; for x |-> Ax to map R^5 onto R^6, A must have a pivot position in each of its 6 rows, but A has only 5 columns and therefore at most 5 pivots, which makes this impossible.
26. If A is an mxn matrix with m pivot columns, then the linear transformation x |-> Ax is a one-to-one mapping. False; one-to-one requires a pivot in each of the n columns, and m might be less than n.
Chapter 2
1. If A and B are mxn, then both AB^T and A^T B are defined. True, because in each product the number of columns of the left factor equals the number of rows of the right factor: AB^T is (mxn)(nxm) and A^T B is (nxm)(mxn).
2. If AB=C and C has 2 columns, then A has 2 columns. False; B must have 2 columns, while A can have any number of columns. However, the number of columns of A must equal the number of rows of B.
3. The transpose of an elementary matrix is an elementary matrix. True; the transpose of an elementary matrix is an elementary matrix of the same type.
4. If A is a 3x3 matrix with three pivot positions, there exist elementary matrices E1, ..., Ep such that Ep...E1A = I. True; if A is 3x3 with three pivot positions, then A is row equivalent to I_3, and each row operation is a left multiplication by an elementary matrix.
5. If AB = I, then A is invertible. False; this conclusion requires knowing that A is a square matrix.
6. If A and B are square and invertible, then (AB)^-1 = A^-1 B^-1. False; (AB)^-1 = B^-1 A^-1.
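The order reversal in item 6 is easy to confirm numerically (numpy; the matrices below are made-up examples, not from these notes):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.array([[1.0, 3.0],
              [0.0, 1.0]])

inv_AB = np.linalg.inv(A @ B)
wrong = np.linalg.inv(A) @ np.linalg.inv(B)  # A^-1 B^-1: wrong order
right = np.linalg.inv(B) @ np.linalg.inv(A)  # B^-1 A^-1: correct

print(np.allclose(inv_AB, right))  # True
print(np.allclose(inv_AB, wrong))  # False
```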
Chapter 3
1. If A is a 2x2 matrix with a zero determinant, then one column of A is a multiple of the other. True; a zero determinant means the columns are linearly dependent, and for two vectors dependence means one is a multiple of the other.
2. If two rows of a 3x3 matrix A are the same, then detA = 0. True; subtracting one of the equal rows from the other leaves the determinant unchanged and produces a row of zeros, and a cofactor expansion along that row gives detA = 0.
3. If A is a 3x3 matrix, then det5A = 5detA. False; det5A = 5^3 detA = 125 detA, since each of the 3 rows is scaled by 5.
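The scaling rule in item 3 checks out numerically (numpy; the matrix is a made-up example, not from these notes):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [2.0, 0.0, 1.0]])

# det(cA) = c^n det(A) for an nxn matrix: one factor of c per row
lhs = np.linalg.det(5 * A)
rhs = 125 * np.linalg.det(A)  # 5^3 * detA
print(np.isclose(lhs, rhs))  # True
```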
4. If A and B are nxn matrices, with detA = 2 and detB = 3, then det(A+B) = 5. False; the determinant is multiplicative (detAB = detA*detB) but not additive, so detA + detB = 5 says nothing about the determinant of the sum A+B.
5. If A is nxn and detA = 2, then detA^3 = 6. False; detA^3 = (detA)^3 = 2^3 = 8.
6. If B is produced by interchanging two rows of A, then detB = detA. False; detB = -detA.
7. If B is produced by multiplying row 3 of A by 5, then detB = 5detA. True, by the effect of row operations on the determinant.
8. If B is formed by adding to one row of A a linear combination of the other rows, then detB = detA. True; row replacement operations do not change the determinant.
9. detA^T = -detA. False; detA^T = detA.
10. det(-A) = -det(A). False; det(-A) = (-1)^n det(A) for an nxn matrix, so the statement fails whenever n is even. For example, for the 2x2 identity, det(-I) = 1 = det(I).
11. detA^T A >= 0. True; det(A^T A) = (detA^T)(detA) = (detA)^2, and a square is automatically nonnegative.
12. If u and v are in R^2 and det[u v] = 10, then the area of the triangle in the plane with vertices at 0, u, and v is 10. False; |det[u v]| = 10 is the area of the parallelogram determined by u and v, and the triangle is half of it, so its area is 5.
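The half-determinant area formula from item 12 in a quick sketch (numpy; the vectors are a made-up example chosen so det[u v] = 10, not from these notes):

```python
import numpy as np

u = np.array([5.0, 0.0])
v = np.array([0.0, 2.0])

# |det[u v]| is the parallelogram area; the triangle 0, u, v is half of it
det = np.linalg.det(np.column_stack([u, v]))
triangle_area = abs(det) / 2
print(det, triangle_area)
```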
13. If A^3 = 0, then detA = 0. True; (detA)^3 = det(A^3) = det0 = 0, so detA = 0.
14. If A is invertible, then detA^-1 = detA. False; detA^-1 = 1/detA.
15. If A is invertible, then (detA)(detA^-1) = det(AA^-1) = detI = 1. True.
Chapter 4
There exists a vector space V and a set S = {v1, ..., vp}.
1. The set of all linear combinations of v1, ..., vp is a vector space. True, because that set is Span{v1, ..., vp}, which is a subspace of V, and every subspace is itself a vector space.
2. If {v1, ..., vp-1} spans V, then S spans V. True; any linear combination of v1, ..., vp-1 is also a linear combination of v1, ..., vp in which the weight on vp is zero.
3. If {v1, ..., vp-1} is linearly independent, then so is S. False; vp could be a linear combination of v1, ..., vp-1 (for example, vp = v1), which would make S = {v1, ..., vp} linearly dependent.
4. If S is linearly independent, then S is a basis for V. False; for example, the first two vectors of the standard basis for R^3 form a linearly independent set that is not a basis for R^3. In other words, S might be linearly independent and yet fail to span the vector space.
5. If Span S = V, then some subset of S is a basis for V. True, by the Spanning Set Theorem.
6. If dim V = p and Span S = V, then S cannot be linearly dependent. True; S spans V and has exactly p = dim V elements, so by the Basis Theorem S is a basis for V and hence must be linearly independent.
7. A plane in R^3 is a two-dimensional subspace. False, because the plane must pass through the origin to be a subspace.
8. The nonpivot columns of a matrix are always linearly dependent. False; each nonpivot column is a linear combination of the pivot columns, but the nonpivot columns considered as a set can still be linearly independent (for example, a single nonzero nonpivot column).
9. Row operations on a matrix A can change the linear dependence relations among the rows of A. True; row operations can change the relations among the rows (they preserve the relations among the columns).
10. Row operations on a matrix can change the null space. False; the null space is the solution set of the equation Ax=0, which is not affected by row operations.
11. The rank of a matrix equals the number of nonzero rows. False; that count equals the rank only after the matrix has been reduced to echelon form. A matrix can have every row nonzero and still have small rank (for example, a 3x3 matrix of all 1s has rank 1).
12. If an mxn matrix A is row equivalent to an echelon matrix U and if U has k nonzero rows, then the dimension of the solution space of Ax=0 is m-k. False; the dimension of the solution space is n-k, since rankA = k and dim NulA = n - k by the Rank Theorem.
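The Rank Theorem bookkeeping in item 12 can be illustrated numerically (numpy; the 3x4 matrix is a made-up example, not from these notes):

```python
import numpy as np

# A 3x4 matrix: m = 3 rows, n = 4 columns, with row3 = row1 + row2
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 2.0, 1.0, 2.0]])

n = A.shape[1]
k = np.linalg.matrix_rank(A)  # = number of nonzero rows in an echelon form
dim_nul = n - k               # Rank Theorem: dim Nul A = n - rank A
print(k, dim_nul)             # 2 2
```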
13. If B is obtained from a matrix A by several elementary row operations, then rankB = rankA. True; row equivalent matrices have the same row space, so their ranks are equal. (Their column spaces may differ, but those column spaces have the same dimension.)
14. The nonzero rows of a matrix A form a basis for Row A. False; the nonzero rows of A span Row A, but they may not be linearly independent, so they need not form a basis. (The nonzero rows of an echelon form of A do form a basis.)
15. If matrices A and B have the same reduced echelon form, then Row A = Row B. True; the nonzero rows of the common RREF E form a basis for the row space of every matrix that is row equivalent to E, so A and B have the same row space.
16. If H is a subspace of R^3, then there is a 3x3 matrix A such that H = ColA. True; if H is the zero subspace, let A be the 3x3 zero matrix. If dimH = 1, let {v} be a basis for H and set A = [v v v]. If dimH = 2, let {u, v} be a basis for H and set A = [u v v]. If dimH = 3, let {u, v, w} be a basis for H and set A = [u v w].
17. If A is mxn and rank A = m, then the linear transformation x |-> Ax is one-to-one. False; rank A = m means a pivot in every row, which makes the mapping onto R^m. One-to-one requires a pivot in every column (rank A = n), and n may exceed m.
18. If A is mxn and the linear transformation x |-> Ax is onto, then rank A = m. True; if x |-> Ax is onto, then Col A = R^m, so rank A = dim Col A = m.
19. A change of coordinates matrix is always invertible. True; a change of coordinates matrix is square, and its columns (the coordinate vectors of one basis relative to another) are linearly independent, so it is invertible.
20. If B = {b1, ..., bn} and C = {c1, ..., cn} are bases for a vector space V, then the jth column of the change of coordinates matrix P(C<-B) is the coordinate vector [bj]_C. True, by the definition of the change of coordinates matrix.
Chapter 5
1. If A is invertible and 1 is an eigenvalue for A, then 1 is also an eigenvalue for A^-1. True; if A is invertible and Ax = 1*x for some nonzero x, then left multiplying by A^-1 gives x = A^-1 x. Since x is nonzero, this shows 1 is an eigenvalue of A^-1.
2. If A is row equivalent to the identity matrix I, then A is diagonalizable. False; being row equivalent to I means A is invertible, but invertibility does not imply diagonalizability. For example, the invertible matrix [[1, 1], [0, 1]] is not diagonalizable.
3. If A contains a row or column of zeros, then 0 is an eigenvalue of A. True; if A contains a row or column of zeros, then A is not row equivalent to the identity matrix and thus is not invertible. By the Invertible Matrix Theorem, 0 is an eigenvalue of A.
4. Each eigenvalue of A is also an eigenvalue of A^2. False; consider a diagonal matrix D whose eigenvalues (diagonal entries) are 1 and 3. Then D^2 is a diagonal matrix whose eigenvalues are 1 and 9, so 3 is an eigenvalue of D but not of D^2: squaring the matrix squares the eigenvalues.
5. Each eigenvector of A is also an eigenvector of A^2. True; suppose a nonzero vector x satisfies Ax = lambda*x. Then A^2 x = A(Ax) = A(lambda*x) = lambda*Ax = lambda^2 x. This shows that x is also an eigenvector of A^2, with eigenvalue lambda^2.
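The computation in item 5 in a quick numerical sketch (numpy; the matrix and eigenvector are a made-up example, not from these notes):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# x is an eigenvector of A for lambda = 2
x = np.array([1.0, 0.0])
lam = 2.0

print(np.allclose(A @ x, lam * x))         # True: Ax = lambda x
print(np.allclose(A @ A @ x, lam**2 * x))  # True: same x, eigenvalue lambda^2
```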
6. Each eigenvector of an invertible matrix A is also an eigenvector of A^-1. True; if Ax = lambda*x with x nonzero and A invertible, then lambda is nonzero and x = lambda*A^-1 x, so A^-1 x = lambda^-1 x, which shows that x is also an eigenvector of A^-1.
7. Eigenvalues must be nonzero scalars. False; zero is an eigenvalue of every singular square matrix.
8. Eigenvectors must be nonzero vectors. True; by definition, eigenvectors must be nonzero.
9. Two eigenvectors corresponding to the same eigenvalue are always linearly dependent. False; for A = I, the standard vectors e1 and e2 are both eigenvectors for the eigenvalue 1, yet they are linearly independent.
10. Similar matrices always have exactly the same eigenvalues. True; similar matrices have the same characteristic polynomial and hence the same eigenvalues, with the same multiplicities.
11. Similar matrices always have exactly the same eigenvectors. False; if A is diagonalizable, say A = PDP^-1 with D diagonal, then A is similar to D, whose eigenvectors are the columns of the identity matrix, while the eigenvectors of A are the columns of P, which are generally different.
12. The sum of two eigenvectors of a matrix A is also an eigenvector of A. False; if a1 and a2 are eigenvectors for different eigenvalues, then a1 + a2 is not an eigenvector. For example, for A = diag(1, 2), the standard vectors e1 and e2 are eigenvectors, but e1 + e2 is not.
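The counterexample for item 12, checked numerically (numpy; the diagonal matrix is a made-up test case, not from these notes):

```python
import numpy as np

A = np.diag([1.0, 2.0])
a1 = np.array([1.0, 0.0])  # eigenvector for eigenvalue 1
a2 = np.array([0.0, 1.0])  # eigenvector for eigenvalue 2
s = a1 + a2                # (1, 1)

# If s were an eigenvector, As = c*s would force c = (As)[0] / s[0] = 1,
# but As = (1, 2) is not equal to 1*s = (1, 1).
As = A @ s
is_eigenvector = np.allclose(As, As[0] * s)
print(is_eigenvector)  # False
```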
13. The eigenvalues of an upper triangular matrix A are exactly the nonzero entries on the diagonal of A. False; the eigenvalues are all of the diagonal entries of an upper triangular matrix, and a diagonal entry may be zero.
14. The matrices A and A^T have the same eigenvalues, counting multiplicities. True; A and A^T have the same characteristic polynomial, since det(A^T - lambda*I) = det((A - lambda*I)^T) = det(A - lambda*I).
15. If a 5x5 matrix A has fewer than 5 distinct eigenvalues, then A is not diagonalizable. False; let A be the 5x5 identity matrix. It is diagonal (hence diagonalizable), yet it has only one distinct eigenvalue, 1.
16. There exists a 2x2 matrix that has no eigenvectors in R^2. True; for example, if A is the matrix that rotates vectors through pi/2 radians about the origin, then Ax is never a real multiple of x when x is nonzero.
17. If A is diagonalizable, then the columns of A are linearly independent. False; if A is a diagonal matrix with a 0 on the diagonal, then A is diagonalizable, but its determinant is 0, so its columns are not linearly independent.
18. A nonzero vector cannot correspond to two different eigenvalues of A. True; if Ax = lambda1*x and Ax = lambda2*x, then (lambda1 - lambda2)x = 0, and since x is nonzero, lambda1 = lambda2.
19. A square matrix A is invertible if and only if there is a coordinate system in which the transformation x |-> Ax is represented by a diagonal matrix. False; let A be a singular matrix that is diagonalizable. Then the transformation x |-> Ax is represented by a diagonal matrix relative to a coordinate system determined by eigenvectors of A, even though A is not invertible.
20. If each vector ej in the standard basis for R^n is an eigenvector of A, then A is a diagonal matrix. True; Aej is the jth column of A, and if Aej = lambda_j*ej, then every off-diagonal entry of that column is zero, so A is diagonal.
21. If A is similar to a diagonalizable matrix B, then A is also diagonalizable. True; if B = PDP^-1 where D is a diagonal matrix, and if A = QBQ^-1, then A = Q(PDP^-1)Q^-1 = (QP)D(QP)^-1, which shows that A is diagonalizable.
22. If A and B are invertible nxn matrices, then AB is similar to BA. True; since B is invertible, B(AB)B^-1 = BA, so AB is similar to BA.
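Since similar matrices share eigenvalues (item 10), the similarity of AB and BA in item 22 predicts that AB and BA have the same spectrum. A quick numerical check (numpy; the matrices are made-up examples, not from these notes):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 1.0]])

# B(AB)B^-1 = BA, so AB and BA are similar and share eigenvalues
eig_AB = np.sort(np.linalg.eigvals(A @ B))
eig_BA = np.sort(np.linalg.eigvals(B @ A))
print(np.allclose(eig_AB, eig_BA))  # True
```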
23. An nxn matrix with n linearly independent eigenvectors is invertible. False; having n linearly independent eigenvectors makes an nxn matrix diagonalizable, but not necessarily invertible, since one of its eigenvalues could be zero.
24. If A is an nxn diagonalizable matrix, then each vector in R^n can be written as a linear combination of eigenvectors of A. True; if A is diagonalizable, then by the Diagonalization Theorem A has n linearly independent eigenvectors v1, ..., vn in R^n. By the Basis Theorem, {v1, ..., vn} is a basis for R^n, so each vector in R^n can be written as a linear combination of v1, ..., vn.