MATH1231 - Linear Algebra Bases

A basis of a vector space is a set of linearly independent vectors whose span is the entire space.

Given a basis for a vector space, every element of the space can be expressed uniquely as a linear combination of the basis vectors.

Every vector space has a basis, and every basis of a given vector space has the same number of elements, n, where n is the dimension of the space.
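The unique-coordinates fact can be checked numerically: writing the basis vectors as the columns of a matrix $B$, the coordinates of a vector $v$ are the unique solution of $Bc = v$. A small sketch with an illustrative basis of $\mathbb{R}^2$ (the vectors below are my own example, not from the notes):

```python
import numpy as np

# An illustrative basis of R^2, stored as the columns of B.
B = np.array([[1.0,  1.0],
              [1.0, -1.0]])
v = np.array([3.0, 5.0])

# Solving B @ c = v gives the unique coordinates of v in this basis.
c = np.linalg.solve(B, v)
print(c)  # [ 4. -1.], since 4*(1,1) + (-1)*(1,-1) = (3,5)
```

Because the columns of $B$ are linearly independent, `np.linalg.solve` always finds exactly one solution, which is precisely the uniqueness claim above.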

## In Mathematical Language

In a vector space V, the set S = {s1, s2, …, sn} forms a basis for V if:

1. Span ({s1, s2, …, sn}) = V, and
2. s1, s2, …, sn are linearly independent.
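Both conditions can be tested at once for n vectors in $\mathbb{R}^n$ by checking the rank of the matrix whose columns are the candidate vectors: full rank means the columns are linearly independent and span the space. A minimal sketch with vectors of my own choosing:

```python
import numpy as np

# Candidate basis vectors for R^3, stored as columns (illustrative example).
S = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

# For n vectors in R^n: full rank <=> linearly independent <=> spanning.
rank = np.linalg.matrix_rank(S)
print(rank)  # 3, so these three vectors form a basis for R^3
```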

## Orthonormal Bases

If S is a basis for $V = \mathbb{R}^m$, then S is an Orthonormal Basis if its vectors are orthonormal (i.e. each has unit length and any two distinct vectors are perpendicular).

i.e. if $\|s_i\| = 1$ for each $i$, and $s_i \cdot s_j = 0$ for $i \neq j$.
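Collecting an orthonormal set as the columns of a matrix $Q$, both conditions are equivalent to $Q^T Q = I$: the diagonal entries of $Q^T Q$ are the squared lengths and the off-diagonal entries are the pairwise dot products. A quick check, using the standard basis of $\mathbb{R}^2$ rotated by 45° as an illustrative orthonormal basis:

```python
import numpy as np

# An orthonormal basis of R^2 (columns of Q): the standard basis rotated 45 degrees.
s = 1.0 / np.sqrt(2.0)
Q = np.array([[s,  s],
              [s, -s]])

# Orthonormal columns  <=>  Q^T Q = I.
G = Q.T @ Q
print(np.allclose(G, np.eye(2)))  # True
```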

## Example

The vectors (1, 0) and (0, 1) form a basis (the standard basis) for $\mathbb{R}^2$.

## Determining Bases

Suppose V has dimension m. If a set of m vectors spans V, then the vectors are automatically linearly independent, so the set is a basis for V.

Conversely, if a set of m vectors is linearly independent, then it automatically spans V, so again the set is a basis.
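For m vectors in $\mathbb{R}^m$ there is an even quicker test: place the vectors as the columns of a square matrix; a nonzero determinant is equivalent to linear independence, hence to being a basis. Two illustrative cases of my own:

```python
import numpy as np

# Columns (1,2) and (3,4): independent, so a basis for R^2.
A = np.array([[1.0, 3.0],
              [2.0, 4.0]])
# Columns (1,2) and (2,4): the second is twice the first, so not a basis.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.det(A))  # -2.0 (nonzero => basis)
print(np.linalg.det(B))  # 0.0  (zero => not a basis)
```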

## Constructing Bases

If we have a spanning set with too many vectors, we place them as the columns of a matrix and row reduce; the vectors corresponding to the leading columns are linearly independent and still span, so they form a basis.
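The row-reduction step can be sketched with SymPy, whose `rref()` returns both the reduced matrix and the indices of the leading (pivot) columns. The spanning set below is an illustrative one of my own, not from the notes:

```python
from sympy import Matrix

# A spanning set for R^2 with one redundant vector, placed as columns:
# (1,0), (2,0), (0,1). The second column is a multiple of the first.
M = Matrix([[1, 2, 0],
            [0, 0, 1]])

# rref() returns (reduced matrix, tuple of pivot column indices).
R, pivots = M.rref()
print(pivots)  # (0, 2): columns (1,0) and (0,1) form a basis
```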

If we have a set that is not necessarily spanning, then to use as many of its vectors as possible in a basis we place them as the columns of a matrix, append the identity matrix, and row reduce. The leading columns among the original vectors identify the linearly independent ones, and the leading columns from the identity part fill the set out to a full basis.

E.g.

$$\begin{pmatrix} 1 & 2 & 3 & 1 & 1 & 0 & 0 & 0 \\ 2 & 5 & 7 & 3 & 0 & 1 & 0 & 0 \\ 4 & 1 & 5 & -3 & 0 & 0 & 1 & 0 \\ -2 & 4 & 2 & 6 & 0 & 0 & 0 & 1 \end{pmatrix}$$
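Carrying out the row reduction for this example (here sketched with SymPy rather than by hand) shows which columns end up leading:

```python
from sympy import Matrix, eye

# The four candidate vectors from the example, as columns, with I4 appended.
V = Matrix([[ 1, 2, 3,  1],
            [ 2, 5, 7,  3],
            [ 4, 1, 5, -3],
            [-2, 4, 2,  6]])
M = V.row_join(eye(4))

R, pivots = M.rref()
print(pivots)  # (0, 1, 4, 5)
# Columns 0 and 1 (the first two original vectors) are independent; columns
# 2 and 3 depend on them (v3 = v1 + v2 and v4 = v2 - v1). Columns 4 and 5,
# i.e. e1 and e2 from the identity, complete them to a basis of R^4.
```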
page revision: 14, last edited: 12 Aug 2011 13:04