Engineering Analysis/Linear Independence and Basis


Linear Independence

A set of vectors V = {v_1, v_2, \cdots, v_n} is said to be linearly dependent if any vector v in the set can be constructed as a linear combination of the other vectors in the set. Consider the following linear equation:

a_1v_1 + a_2v_2 + \cdots + a_nv_n = 0

The set of vectors V is linearly independent if and only if the only solution to this equation is for all of the a coefficients to be zero. If we collect the v vectors as the columns of a single matrix:

\hat{V} = [v_1 v_2 \cdots v_n]

And we combine all the a coefficients into a single column vector:

\hat{a} = [a_1 a_2 \cdots a_n]^T

We have the following linear equation:

\hat{V}\hat{a} = 0

For this equation to be satisfied only by \hat{a} = 0, the matrix \hat{V} must be invertible:

\hat{V}^{-1}\hat{V}\hat{a} = \hat{V}^{-1}0
\hat{a} = 0

Remember that for the matrix to be invertible, its determinant must be non-zero.
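
As a concrete illustration, here is a minimal sketch in Python with NumPy (the example vectors are assumptions chosen for illustration) of this determinant test for a square \hat{V}:

import numpy as np

# Columns of V_hat are the vectors v_1, v_2, v_3.
V_hat = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, 3.0],
                  [0.0, 0.0, 1.0]])

# Non-zero determinant means V_hat is invertible, so V_hat a = 0
# forces a = 0 and the columns are linearly independent.
print(np.linalg.det(V_hat))          # 1.0 -> independent

# Replace v_3 with v_1 + 2*v_2 to make the set dependent.
V_dep = V_hat.copy()
V_dep[:, 2] = V_dep[:, 0] + 2.0 * V_dep[:, 1]
print(np.linalg.det(V_dep))          # 0.0 -> dependent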

Non-Square Matrix V

If the matrix \hat{V} is not square, then the determinant cannot be taken, and therefore the matrix is not invertible. To solve this problem, we can premultiply by the transpose matrix:

\hat{V}^T\hat{V}\hat{a} = 0

And then the square matrix \hat{V}^T\hat{V} must be invertible:

(\hat{V}^T\hat{V})^{-1}\hat{V}^T\hat{V}\hat{a} = 0
\hat{a} = 0
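
The same check for a non-square \hat{V}, again as a NumPy sketch with made-up vectors:

import numpy as np

# Two vectors in R^3: V_hat is 3x2, so det(V_hat) is undefined.
V_hat = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 2.0]])

# The 2x2 matrix V^T V is square; if it is invertible (non-zero
# determinant), the only solution of V_hat a = 0 is a = 0.
gram = V_hat.T @ V_hat
print(np.linalg.det(gram))   # non-zero -> columns are independent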

Rank

The rank of a matrix is the largest number of linearly independent rows or columns in the matrix.

To determine the rank, the matrix is typically reduced to row-echelon form. In the reduced form, the number of non-zero rows is the rank of the matrix.
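
In practice the rank can also be computed numerically; NumPy's matrix_rank uses a singular value decomposition rather than row reduction, but gives the same result. A small sketch with an example matrix:

import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],    # twice the first row
              [0.0, 1.0, 1.0]])

# Only two linearly independent rows (and columns), so the rank is 2.
print(np.linalg.matrix_rank(A))   # 2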

If we multiply two matrices A and B, and the result is C:

AB = C

Then the rank of C is at most the minimum of the ranks of A and B:

\operatorname{Rank}(C) \le \operatorname{min}[\operatorname{Rank}(A), \operatorname{Rank}(B)]
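
A quick NumPy illustration that this is only an upper bound (matrices chosen arbitrarily; here the product has strictly smaller rank than either factor):

import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])        # rank 1
B = np.array([[0.0, 0.0],
              [1.0, 0.0]])        # rank 1

C = A @ B                         # the zero matrix
print(np.linalg.matrix_rank(C))   # 0 <= min(1, 1): the bound can be strict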

Span

The span of a set of vectors V is the set of all vectors that can be created by linear combinations of the vectors in V.
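
One practical way to test whether a given vector lies in the span of a set is a least-squares solve; a minimal NumPy sketch (the vectors are assumptions for illustration):

import numpy as np

# Columns of V_hat span a plane in R^3.
V_hat = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [1.0, 1.0]])

def in_span(V, y, tol=1e-10):
    # y is in the span of the columns of V if the least-squares
    # residual of V a = y is (numerically) zero.
    a, *_ = np.linalg.lstsq(V, y, rcond=None)
    return np.linalg.norm(V @ a - y) < tol

print(in_span(V_hat, np.array([2.0, 3.0, 5.0])))   # True:  2*v1 + 3*v2
print(in_span(V_hat, np.array([0.0, 0.0, 1.0])))   # False: off the plane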

Basis

A basis is a set of linearly-independent vectors that span the entire vector space.

Basis Expansion

If we have a vector y \in V, and V has basis vectors {v_1, v_2, \cdots, v_n}, then by definition we can write y as a linear combination of the basis vectors:

a_1v_1 + a_2v_2 + \cdots + a_nv_n = y

or

\hat{V}\hat{a} = y

If \hat{V} is invertible, the answer is apparent, but if \hat{V} is not invertible (for example, when it is not square), then we can use the following technique:

\hat{V}^T\hat{V}\hat{a} = \hat{V}^Ty
\hat{a} = (\hat{V}^T\hat{V})^{-1}\hat{V}^Ty

And we call the quantity (\hat{V}^T\hat{V})^{-1}\hat{V}^T the left-pseudoinverse of \hat{V}.
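
A sketch of this expansion in NumPy, with an assumed basis for a plane in R^3 (np.linalg.pinv computes the same pseudoinverse, but the explicit formula is used here to match the text):

import numpy as np

# Two basis vectors for a plane in R^3, stored as columns.
V_hat = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [1.0, 1.0]])

# A vector that lies in that plane: y = 2*v1 - 1*v2.
y = np.array([2.0, -1.0, 1.0])

# Left pseudoinverse: (V^T V)^{-1} V^T
V_pinv = np.linalg.inv(V_hat.T @ V_hat) @ V_hat.T
a = V_pinv @ y
print(a)                          # [ 2. -1.]
print(np.allclose(V_hat @ a, y))  # True: y is reconstructed exactly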

Change of Basis

Frequently, it is useful to change the basis vectors to a different set of vectors that span the same space but have different properties. If we have a space V with basis vectors \hat{V}, and a vector in V called x, we can use a new set of basis vectors \hat{W} to represent x:

x = \sum_{i = 1}^na_iv_i = \sum_{j = 1}^n b_jw_j

or,

x = \hat{V}\hat{a} = \hat{W}\hat{b}

If \hat{W} is invertible, then the solution to this problem is simply \hat{b} = \hat{W}^{-1}\hat{V}\hat{a}.
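
A small numerical sketch of a change of basis in R^2 (both bases are invented for the example):

import numpy as np

# Old basis (columns of V_hat) and new basis (columns of W_hat).
V_hat = np.eye(2)                         # standard basis
W_hat = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

a = np.array([3.0, 2.0])                  # coordinates of x in the old basis
x = V_hat @ a

# Coordinates in the new basis: b = W^{-1} V a  (W_hat is invertible here).
b = np.linalg.solve(W_hat, x)
print(b)                                  # [1. 2.]
print(np.allclose(W_hat @ b, x))          # True: same vector x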

Gram-Schmidt Orthogonalization

If we have a set of basis vectors that are not orthogonal, we can use a process known as orthogonalization to produce a new set of basis vectors for the same space that are orthogonal:

Given: \hat{V} = {v_1, v_2, \cdots, v_n}
Find the new basis \hat{W} = {w_1, w_2, \cdots, w_n}
Such that \langle w_i, w_j\rangle = 0\quad\forall i \neq j

We can define the vectors as follows:

  1. w_1 = v_1
  2. w_m = v_m - \sum_{i = 1}^{m-1}\frac{\langle v_m, w_i\rangle }{\langle w_i, w_i\rangle }w_i

Notice that the vectors produced by this technique are orthogonal to each other, but they are not necessarily orthonormal. To make the w vectors orthonormal, you must divide each one by its norm:

\bar{w} = \frac{w}{\|w\|}
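
A minimal sketch of the procedure in NumPy (the input basis is arbitrary; the final step also normalizes each vector, as described above):

import numpy as np

def gram_schmidt(V):
    # Columns of V are the original basis; returns orthonormal columns.
    W = np.zeros_like(V, dtype=float)
    for m in range(V.shape[1]):
        w = V[:, m].astype(float)
        # Subtract the projection of v_m onto each previously found w_i.
        for i in range(m):
            w -= (np.dot(V[:, m], W[:, i]) / np.dot(W[:, i], W[:, i])) * W[:, i]
        W[:, m] = w
    # Divide each column by its norm to make the basis orthonormal.
    return W / np.linalg.norm(W, axis=0)

V_hat = np.array([[1.0, 1.0, 0.0],
                  [1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])
W_hat = gram_schmidt(V_hat)
print(np.round(W_hat.T @ W_hat, 10))   # identity matrix: orthonormal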

Reciprocal Basis

A reciprocal basis is a basis \hat{W} that is related to the original basis \hat{V} by the requirement \langle w_i, v_j\rangle = \delta_{ij}. The reciprocal basis can be computed as:

\hat{W} = [\hat{V}^T]^{-1}
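
A short NumPy check of this relationship, with an assumed 2x2 basis:

import numpy as np

V_hat = np.array([[2.0, 1.0],
                  [0.0, 1.0]])          # columns are the basis v_1, v_2

W_hat = np.linalg.inv(V_hat.T)          # reciprocal basis as columns

# Each reciprocal vector pairs to 1 with its own v and to 0 with the others:
# <w_i, v_j> = delta_ij, i.e. W^T V is the identity matrix.
print(np.round(W_hat.T @ V_hat, 10))    # [[1. 0.] [0. 1.]]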
This article is issued from Wikibooks. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.