Linear Algebra
Problem 1. Suppose A, B ∈ R^{n×n} are symmetric and skew-symmetric matrices, respectively. Prove that tr(AB) = 0, where tr denotes the trace of a matrix. (Hint: Look at properties of the trace.)
Solution:
Since A is symmetric and B is skew-symmetric, it holds that A = A^T and B = −B^T. Taking the trace of their product, and using tr(M) = tr(M^T) together with the cyclic property tr(MN) = tr(NM), we have
tr(AB) = tr((AB)^T)
       = tr(B^T A^T)
       = tr(−BA)
       = −tr(BA)
       = −tr(AB)
∴ tr(AB) = 0
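This identity is easy to sanity-check numerically; the sketch below (assuming NumPy; the random 5 × 5 matrices are an illustrative choice) builds a symmetric/skew-symmetric pair and confirms that the trace of the product vanishes up to rounding error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric A and skew-symmetric B (sizes are an arbitrary choice).
X = rng.standard_normal((5, 5))
Y = rng.standard_normal((5, 5))
A = X + X.T   # A == A.T
B = Y - Y.T   # B == -B.T

t = np.trace(A @ B)   # should vanish up to floating-point rounding
```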
Problem 2. Prove that for all v, u ∈ R^n,
‖v + u‖^2 + ‖v − u‖^2 = 2‖v‖^2 + 2‖u‖^2.
Solution:
‖v + u‖^2 = (v + u)^T (v + u)
          = v^T v + u^T u + v^T u + u^T v
          = ‖v‖^2 + ‖u‖^2 + 2⟨v, u⟩
Similarly,
‖v − u‖^2 = ‖v‖^2 + ‖u‖^2 − 2⟨v, u⟩
Adding the two expansions gives
‖v + u‖^2 + ‖v − u‖^2 = 2‖v‖^2 + 2‖u‖^2
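The parallelogram law above can be checked on random vectors (an illustrative NumPy sketch; the dimension and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
v = rng.standard_normal(6)
u = rng.standard_normal(6)

# ||v + u||^2 + ||v - u||^2 should equal 2||v||^2 + 2||u||^2.
lhs = np.linalg.norm(v + u) ** 2 + np.linalg.norm(v - u) ** 2
rhs = 2 * np.linalg.norm(v) ** 2 + 2 * np.linalg.norm(u) ** 2
```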
Problem 3. Let A and B be n × n matrices satisfying A + B = AB. Show that AB = BA. (Hint: Try
adding the identity matrix to both sides.)
Solution:
A + B + I = AB + I
I = AB − A − B + I
Note that we can factor the right side above to get
I = (A − I)(B − I)
Thus B − I is the inverse of A − I. Since a square matrix commutes with its inverse, we also have
(A − I)(B − I) = (B − I)(A − I)
AB − A − B + I = BA − B − A + I
∴ AB = BA
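A quick numerical illustration (not part of the original solution): for any A with A − I invertible, solving (A − I)B = A for B produces a pair satisfying A + B = AB by construction, and the commutation AB = BA follows. The shift by 3I below is only a heuristic to keep A − I well-conditioned.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Pick A so that A - I is (very likely) invertible, then solve
# (A - I) B = A for B; this gives AB - B = A, i.e. A + B = AB.
A = rng.standard_normal((n, n)) + 3 * np.eye(n)
B = np.linalg.solve(A - np.eye(n), A)

err_constraint = np.linalg.norm(A + B - A @ B)   # should be ~0
err_commute = np.linalg.norm(A @ B - B @ A)      # should be ~0
```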
Problem 4. Let A, B ∈ R^{n×n} be symmetric matrices such that x^T Ax = x^T Bx for all x ∈ R^n. Show that A = B.
Solution:
Start by defining C = A − B, so that x^T Cx = 0 for all x ∈ R^n. Substituting x = y + z gives
(y + z)^T C (y + z) = 0
y^T Cy + z^T Cz + y^T Cz + z^T Cy = 0
Now, notice that y^T Cz = (y^T Cz)^T = z^T C^T y = z^T Cy (the first equality holds because the quantity is a scalar, the last because C^T = C). So we can write:
y^T Cy + z^T Cz + 2 y^T Cz = 0
Since x^T Cx = 0 for all x, the terms y^T Cy and z^T Cz vanish, leaving
2 y^T Cz = 0
y^T Cz = 0
Note that the above must hold for all y and z. So, for each i and j, take y = ei and z = ej, the standard basis vectors with a 1 in the i-th and j-th positions respectively (all other elements zero). We then see
ei^T C ej = cij = 0
where cij is the element in the i-th row, j-th column of C. Repeating this for all rows and columns of C shows that every element vanishes, and therefore C = 0, i.e. A = B.
Problem 5. Consider a matrix A ∈ R^{m×n} which represents a linear map. Prove that Rank(A) + Nullity(A) = n, where n denotes the number of columns of A.
Solution:
Let {y1, y2, ..., yr} be a basis for R(A) and {z1, z2, ..., zs} be a basis for N(A). Note that for each i ∈ {1, ..., r}, there exists xi ∈ R^n such that yi = Axi. We claim that x1, x2, ..., xr, z1, z2, ..., zs form a basis for R^n, which proves r + s = n as desired. First, we prove that x1, x2, ..., xr, z1, z2, ..., zs are linearly independent. Assume
α1 x1 + α2 x2 + ··· + αr xr + β1 z1 + β2 z2 + ··· + βs zs = 0    (1)
Apply A to both sides. Since zj ∈ N(A), we have Azj = 0, and Axi = yi, so (1) becomes
α1 y1 + α2 y2 + ··· + αr yr = 0
Since y1, ..., yr are linearly independent,
α1 = α2 = ··· = αr = 0
Substituting back into (1) leaves
β1 z1 + β2 z2 + ··· + βs zs = 0
and since z1, ..., zs are linearly independent,
β1 = β2 = ··· = βs = 0
Hence α1 = ··· = αr = β1 = ··· = βs = 0, so the vectors are linearly independent.
Next, we show that the vectors span R^n. Let x ∈ R^n. Then Ax ∈ R(A), so for some scalars γ1, ..., γr,
Ax = γ1 y1 + ··· + γr yr = γ1 Ax1 + ··· + γr Axr
⇒ A(x − γ1 x1 − ··· − γr xr) = 0
⇒ x − γ1 x1 − ··· − γr xr ∈ N(A)
⇒ x − γ1 x1 − ··· − γr xr = δ1 z1 + ··· + δs zs for some scalars δ1, ..., δs
⇒ x = γ1 x1 + ··· + γr xr + δ1 z1 + ··· + δs zs
⇒ x ∈ span(x1, x2, ..., xr, z1, z2, ..., zs)
Thus the vectors form a basis for R^n, and Rank(A) + Nullity(A) = r + s = n.
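The rank–nullity identity can be sanity-checked numerically; the sketch below (the rank-3 factorization and sizes are arbitrary illustrative choices) computes the rank and counts the null-space dimension from the singular values:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 4, 6

# A random m x n matrix of rank 3 (generically), built as a product.
A = rng.standard_normal((m, 3)) @ rng.standard_normal((3, n))

rank = np.linalg.matrix_rank(A)
s = np.linalg.svd(A, compute_uv=False)
nullity = n - int(np.sum(s > 1e-10))   # nullity = n - (# nonzero singular values)
```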
Problem 6 (Matrix differentiation). Let A(t) be an n × n matrix, which depends on t, i.e. its elements are
functions of t:
[A(t)]ij = aij (t), for i, j = 1, . . . , n
The derivative Ȧ(t) of A(t) with respect to t, is also an n × n matrix and is defined by taking element-wise
derivatives:
[Ȧ(t)]ij = ȧij (t), for i, j = 1, . . . , n
a. For n × n differentiable matrices A1 (t), A2 (t) prove:
d/dt [A1(t)A2(t)] = Ȧ1(t)A2(t) + A1(t)Ȧ2(t)
b. Using induction, prove that for n × n differentiable matrices A1 (t), A2 (t), ... , Ak (t), we have:
d/dt [A1(t)A2(t)···Ak(t)] = Ȧ1(t)A2(t)···Ak(t) + A1(t)Ȧ2(t)···Ak(t) + ··· + A1(t)A2(t)···Ȧk(t)
c. For an n × n differentiable matrix A(t) that is invertible for all t, show that:
d/dt A^{-1}(t) = −A^{-1}(t) Ȧ(t) A^{-1}(t)
Solution:
a. Let B = A1 A2, C = Ȧ1 A2, D = A1 Ȧ2. We want to show Ḃ = C + D.
Write a^{(1)}_ij = [A1]_ij, a^{(2)}_ij = [A2]_ij, b_ij = [B]_ij, c_ij = [C]_ij, d_ij = [D]_ij. Then, according to the definition of matrix multiplication, we have
b_ij = Σ_{k=1}^{n} a^{(1)}_ik a^{(2)}_kj
Differentiating element-wise using the scalar product rule gives
ḃ_ij = Σ_{k=1}^{n} ( ȧ^{(1)}_ik a^{(2)}_kj + a^{(1)}_ik ȧ^{(2)}_kj ) = c_ij + d_ij
so Ḃ = C + D, as claimed.
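The product rule just proved can be verified against a central finite difference on concrete matrix functions (the functions and the evaluation point below are arbitrary illustrative choices):

```python
import numpy as np

# Concrete differentiable matrix functions (arbitrary illustrative choices)
# and their exact elementwise derivatives.
def A1(t):  return np.array([[np.sin(t), t**2], [1.0, np.exp(t)]])
def dA1(t): return np.array([[np.cos(t), 2 * t], [0.0, np.exp(t)]])
def A2(t):  return np.array([[t, 1.0], [np.cos(t), t**3]])
def dA2(t): return np.array([[1.0, 0.0], [-np.sin(t), 3 * t**2]])

t, h = 0.7, 1e-6
# Central difference of d/dt [A1 A2] vs. the product rule.
numeric = (A1(t + h) @ A2(t + h) - A1(t - h) @ A2(t - h)) / (2 * h)
exact = dA1(t) @ A2(t) + A1(t) @ dA2(t)
err = float(np.abs(numeric - exact).max())
```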
Problem 7 (Function spaces). Consider C([0, 1]) the set of continuous real valued functions over the interval
[0, 1]. It is known that C([0, 1]) is a vector space over the field R and can be equipped with the following
inner product:
⟨f, g⟩ = ∫₀¹ f(x) g(x) dx
and it is also equipped with the norm ‖f‖ induced by the inner product.
a. Let f (x) = 2x and g(x) = x4 . Obviously, f, g ∈ C([0, 1]). Find the angle between f and g.
b. Is F, defined below, a valid norm on C([0, 1])? Prove your claim.
F(f) = max_{x ∈ [0,1]} |f(x)|
Note: F is well-defined because a continuous function over a closed interval attains a maximum value.
c. A countably infinite set of vectors in a vector space is called an independent set if every finite subset of it is an independent set. Prove that the set of functions {1, x, x^2, x^3, ..., x^n, ...} ⊂ C([0, 1]) is an independent set.
Solution:
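a. By direct computation, ⟨f, g⟩ = ∫₀¹ 2x · x⁴ dx = 1/3, ‖f‖² = ∫₀¹ 4x² dx = 4/3, and ‖g‖² = ∫₀¹ x⁸ dx = 1/9, so cos θ = (1/3)/((2/√3)(1/3)) = √3/2 and θ = π/6. This can be cross-checked numerically (a sketch using midpoint-rule quadrature; the grid size is an arbitrary choice):

```python
import numpy as np

# f(x) = 2x and g(x) = x^4 on [0, 1], with <f, g> = integral of f*g over [0, 1].
f = lambda x: 2 * x
g = lambda x: x**4

# Midpoint-rule quadrature on [0, 1].
N = 200_000
x = (np.arange(N) + 0.5) / N
inner = lambda p, q: np.mean(p(x) * q(x))

cos_theta = inner(f, g) / np.sqrt(inner(f, f) * inner(g, g))
theta = np.arccos(cos_theta)   # expect pi/6
```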
b. Yes, it is a valid norm. Obviously, F(f) ∈ R. Now, we want to show that the properties in the definition of a norm hold.
1. |f(x)| ≥ 0 ⇒ F(f) = max_{x ∈ [0,1]} |f(x)| ≥ 0. Moreover, F(f) = 0 forces |f(x)| = 0 for every x, i.e. f = 0.
2. Suppose a ∈ R. Then
F(af) = max_{x ∈ [0,1]} |a f(x)| = |a| max_{x ∈ [0,1]} |f(x)| = |a| F(f)
3. For every x ∈ [0, 1],
|f(x) + g(x)| ≤ |f(x)| + |g(x)| ≤ F(f) + F(g)
Taking the maximum of the left-hand side over x ∈ [0, 1] (note that the right-hand side is constant), we obtain F(f + g) ≤ F(f) + F(g).
c. Suppose A is a finite subset of {1, x, x^2, x^3, ..., x^n, ...}. Then there exists an n such that A ⊆ {1, x, x^2, ..., x^n}. Thus, it suffices to prove that for a given n, the set {1, x, x^2, ..., x^n} is independent: since any subset of an independent set is also independent, it then follows that A is independent. So we only need to show that for all n ∈ N, the set {1, x, x^2, ..., x^n} is independent. First, recall that:
d^k/dx^k x^i = { 0                            if k > i
                 k!                           if k = i        (6)
                 i(i−1)···(i−k+1) x^{i−k}     if k < i }
Now, for a given n suppose a0 + a1 x + a2 x^2 + ··· + an x^n = 0. We want to show that ak = 0 for all k ∈ {0, 1, 2, ..., n}. For a given k, taking the k-th derivative of both sides gives:
d^k/dx^k ( Σ_{i=0}^{n} ai x^i ) = 0
⇒ (by (6), the terms with i < k vanish)
k! ak + Σ_{i>k} i(i−1)···(i−k+1) ai x^{i−k} = 0
Evaluating at x = 0:
⇒ k! ak = 0
⇒ ak = 0
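The same independence can also be checked through a finite computation: a polynomial of degree at most n that vanishes identically must in particular vanish at n + 1 distinct points, and the corresponding Vandermonde matrix is nonsingular, forcing all coefficients to zero. A numerical sketch (the sample points are an arbitrary choice):

```python
import numpy as np

n = 6
pts = np.linspace(0.0, 1.0, n + 1)          # n + 1 distinct points in [0, 1]
V = np.vander(pts, n + 1, increasing=True)  # row i: [1, x_i, x_i^2, ..., x_i^n]

# Full rank means V a = 0 forces a = 0: a degree-n polynomial vanishing
# at n + 1 distinct points must have all coefficients zero.
rank = np.linalg.matrix_rank(V)             # expect n + 1
```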
Systems
Problem 8 (Linearization). Consider the ball and beam system depicted in Figure 1. The beam is made
to rotate in a vertical plane by applying a torque at the center of rotation and the ball is free to roll (with
one degree of freedom) along the beam. Let the moment of inertia of the beam be J, the mass and moment
of inertia of the ball be M and Jb , respectively, the radius of the ball be R, and the acceleration of gravity
be G. Choosing the beam angle θ and the ball position r as generalized position coordinates for the system,
the Lagrangian equations of motion are given by
0 = (Jb/R^2 + M) r̈ + M G sin θ − M r θ̇^2    (1)
τ = (M r^2 + J + Jb) θ̈ + 2 M r ṙ θ̇ + M G r cos θ    (2)
a. Assume state vector x = [x1 x2 x3 x4 ]T = [r ṙ θ θ̇]T and the input u = τ . Find the function F
described as the following:
F = [ F1(x, u); F2(x, u); F3(x, u); F4(x, u) ]^T
b. For an arbitrary reference trajectory (x*(t), u*(t)), linearize the system around (x*(t), u*(t)). Your answer should be in the form ẋ(t) = A(t)x(t) + B(t)u(t), where A(t) and B(t) are in terms of x1*, x2*, x3*, x4*, u*, Jb, R, M, J, G.
Solution:
a. x = [x1; x2; x3; x4] = [r; ṙ; θ; θ̇]  ⇒  ẋ = [ṙ; r̈; θ̇; θ̈] = [x2; r̈; x4; θ̈]
From (1) and (2), solving for r̈ and θ̈ respectively gives
r̈ = (M x1 x4^2 − M G sin x3) / (M + Jb/R^2)
θ̈ = (u − 2M x1 x2 x4 − M G x1 cos x3) / (M x1^2 + J + Jb)
Substituting into the previous expression, the final result is
ẋ = [ x2
      (M x1 x4^2 − M G sin x3) / (M + Jb/R^2)
      x4
      (u − 2M x1 x2 x4 − M G x1 cos x3) / (M x1^2 + J + Jb) ]
   = [ F1(x, u); F2(x, u); F3(x, u); F4(x, u) ]
b. Using the Taylor series expansion of each function Fi, i = 1, 2, 3, 4, the linearized system takes the form ẋ = Ax + Bu, where A is the 4 × 4 Jacobian of F with respect to x:
A = [ ∂Fi(x, u)/∂xj ]_{i,j = 1,...,4}  evaluated at x = x*, u = u*
A = [ 0                          1                                   0                                     0
      M x4*^2 / (M + Jb/R^2)     0                                   −M G cos x3* / (M + Jb/R^2)           2M x1* x4* / (M + Jb/R^2)
      0                          0                                   0                                     1
      c                          −2M x1* x4* / (M x1*^2 + J + Jb)    M G x1* sin x3* / (M x1*^2 + J + Jb)  −2M x1* x2* / (M x1*^2 + J + Jb) ]
where
c = [ (−2M x2* x4* − M G cos x3*)(M x1*^2 + J + Jb) − (u* − 2M x1* x2* x4* − M G x1* cos x3*)(2M x1*) ] / (M x1*^2 + J + Jb)^2
B = [ ∂F1(x, u)/∂u; ∂F2(x, u)/∂u; ∂F3(x, u)/∂u; ∂F4(x, u)/∂u ] evaluated at x = x*, u = u*
  = [ 0; 0; 0; 1/(M x1*^2 + J + Jb) ]
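The Jacobian entries above can be cross-checked against finite differences of F. In the sketch below, the parameter values and the sample point are arbitrary illustrative choices (not given in the problem); the entry ∂F2/∂x1 is compared with its closed form M x4² / (M + Jb/R²).

```python
import math

# Ball-and-beam dynamics F from part (a); parameter values are
# arbitrary illustrative choices, not from the problem.
M, J, Jb, R, G = 0.5, 0.1, 0.002, 0.05, 9.81

def F(x, u):
    x1, x2, x3, x4 = x
    d1 = M + Jb / R**2
    d2 = M * x1**2 + J + Jb
    return [
        x2,
        (M * x1 * x4**2 - M * G * math.sin(x3)) / d1,
        x4,
        (u - 2 * M * x1 * x2 * x4 - M * G * x1 * math.cos(x3)) / d2,
    ]

# Central finite difference of dF2/dx1 at a sample point, against the
# closed-form Jacobian entry M*x4^2 / (M + Jb/R^2).
xs, us, h = [0.2, 0.1, 0.3, 0.4], 1.0, 1e-6
xp = list(xs); xp[0] += h
xm = list(xs); xm[0] -= h
num = (F(xp, us)[1] - F(xm, us)[1]) / (2 * h)
exact = M * xs[3]**2 / (M + Jb / R**2)
```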
Problem 9 (State space representation). Consider a system with input u(t) and output y(t) which can be
described using the following set of differential equations:
z̈1(t) = 2 z1(t) + z2(t) + u̇(t)
ż2(t) = ż1(t) + z2(t) + u(t)
y(t) = 5 ż1(t)
a. Define the states of the system such that it can be represented as a 3-dimensional LTI system, i.e., in the form
ẋ(t) = Ax(t) + Bu(t)
y(t) = Cx(t) + Du(t)
b. Consider T defined below, as a new basis for the state space and let x̂(t) be the representation of x(t)
with respect to the basis T
T = { [1; 0; 1], [1; 1; 0], [1; 1; 1] }
Compute Â, B̂, Ĉ, D̂ in the new representation of the system with respect to T :
(d/dt) x̂(t) = Â x̂(t) + B̂ u(t)
y(t) = Ĉ x̂(t) + D̂u(t)
Solution:
x1 = ż1 − u
x2 = z1
x3 = z2
Thus,
ẋ1 = z̈1 − u̇ = 2 z1 + z2 = 2 x2 + x3
ẋ2 = ż1 = x1 + u
ẋ3 = ż2 = ż1 + z2 + u = x1 + x3 + 2u
Therefore,
ẋ = [ẋ1; ẋ2; ẋ3] = [0 2 1; 1 0 0; 1 0 1][x1; x2; x3] + [0; 1; 2]u = Ax + Bu
y = [5 0 0][x1; x2; x3] + 5u = Cx + Du
Let P = [1 1 1; 0 1 1; 1 0 1] be the matrix whose columns are the basis vectors of T. Thus x = P x̂, and it follows that ẋ = P (d/dt x̂). Substituting this into the matrix representation of the system gives:
P (d/dt x̂) = AP x̂ + Bu
y = CP x̂ + Du
⇒ (d/dt) x̂ = P^{-1}AP x̂ + P^{-1}Bu
y = CP x̂ + Du
Therefore,
Â = P^{-1}AP = [0 1 2; −1 1 1; 2 0 0]
B̂ = P^{-1}B = [−1; −2; 3]
Ĉ = CP = [5 5 5]
D̂ = D = 5
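The change-of-basis computation can be verified numerically (a sketch assuming NumPy; `solve` is used instead of explicitly forming P⁻¹):

```python
import numpy as np

# State-space matrices from part (a) and the basis matrix P whose
# columns are the vectors of T.
A = np.array([[0., 2., 1.], [1., 0., 0.], [1., 0., 1.]])
B = np.array([[0.], [1.], [2.]])
C = np.array([[5., 0., 0.]])
P = np.array([[1., 1., 1.], [0., 1., 1.], [1., 0., 1.]])

A_hat = np.linalg.solve(P, A @ P)  # P^{-1} A P without forming P^{-1}
B_hat = np.linalg.solve(P, B)      # P^{-1} B
C_hat = C @ P
```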
Problem 10 (Properties of systems). In the following cases, a description of a system is provided, where u(t) is the input and y(t) is the output of the system. In each case, determine whether the system is linear. If it is, prove additivity and homogeneity; if not, give a counterexample for additivity or homogeneity.
a.
ÿ + y = ü + u̇ + u, y(0) = 0, ẏ(0) = 0
b.
y(t) = sup_{0 ≤ t ≤ 1} u(t)
c.
y(t) = min{1, u(t)}
d.
y[n] = 2Re{u[n]}
where u[n] is a discrete complex signal, written as u[n] = μ[n] + jσ[n] with μ[n], σ[n] real signals. For the general case of a linear system, the homogeneity property needs to hold for any value α ∈ R or C.
Solution:
In all the following, let y1 be the output of u1 and y2 the output of u2 . Also assume y0 = y1 + y2 and
u0 = u1 + u2 .
a. Linear.
Proof for additivity. We know that:
(ÿ1 + ÿ2 ) + (y1 + y2 ) = (ü1 + ü2 ) + (u̇1 + u̇2 ) + (u1 + u2 ), y1 (0) + y2 (0) = 0, ẏ1 (0) + ẏ2 (0) = 0
Thus,
ÿ0 + y0 = ü0 + u̇0 + u0 , y0 (0) = 0, ẏ0 (0) = 0
Now, according to the uniqueness theorem for solution of a differential equation, y0 is the output of
u0 .
Proof for homogeneity. Assume a ∈ R is given and let y3 = a y1 and u3 = a u1. Multiplying the equation satisfied by y1 and its initial conditions by a, we know that:
a ÿ1 + a y1 = a ü1 + a u̇1 + a u1,  a y1(0) = 0,  a ẏ1(0) = 0
which means:
ÿ3 + y3 = ü3 + u̇3 + u3,  y3(0) = 0,  ẏ3(0) = 0
Note that according to the uniqueness theorem for the solution of a differential equation, y3 is the output of u3.
b. Not linear.
Suppose u1(t) = t and u2(t) = 1 − t; then u0(t) = 1. We have
y1 = sup_{0 ≤ t ≤ 1} t = 1,  y2 = sup_{0 ≤ t ≤ 1} (1 − t) = 1,  so y0 = y1 + y2 = 2
while
sup_{0 ≤ t ≤ 1} u0(t) = sup_{0 ≤ t ≤ 1} 1 = 1
Thus,
y0 = 2 ≠ 1 = sup_{0 ≤ t ≤ 1} u0(t)
so y0 is not the output of u0, and additivity fails.
c. Not linear.
Suppose u1(t) = u2(t) = 3/4, so that u0(t) = 3/2. Then
y1(t) = y2(t) = min{1, 3/4} = 3/4
y0(t) = y1(t) + y2(t) = 3/4 + 3/4 = 3/2
min{1, u0(t)} = min{1, 3/2} = 1
Thus,
y0(t) = 3/2 ≠ 1 = min{1, u0(t)}
Therefore, y0 is not the output of u0 .
d. Not linear.
Suppose α = j. The output of the scaled input αu[n] = ju[n] = jμ[n] − σ[n] is 2Re{ju[n]} = −2σ[n], whereas α y[n] = 2jμ[n]. For example, u[n] = 1 (so μ[n] = 1, σ[n] = 0) gives output 0 for the scaled input, while α y[n] = 2j ≠ 0. Hence homogeneity fails, and the system is not linear.