Vector Spaces

Last updated: December 2024

Disclaimer: These are my personal notes compiled for my own reference and learning. They may contain errors, incomplete information, or personal interpretations. While I strive for accuracy, these notes are not peer-reviewed and should not be considered authoritative sources. Please consult official textbooks, research papers, or other reliable sources for academic or professional purposes.

1. Definition of Vector Space

A vector space $V$ over a field $\mathbb{F}$ is a set with two operations:

- Vector addition: $\mathbf{u} + \mathbf{v} \in V$ for all $\mathbf{u}, \mathbf{v} \in V$
- Scalar multiplication: $c\mathbf{v} \in V$ for all $c \in \mathbb{F}$, $\mathbf{v} \in V$

These operations must satisfy 10 axioms (closure under both operations, associativity and commutativity of addition, an additive identity and additive inverses, the two distributive laws, compatibility of scalar multiplication, and the scalar identity $1\mathbf{v} = \mathbf{v}$).

2. Examples of Vector Spaces

- $\mathbb{R}^n$: $n$-tuples of real numbers, with componentwise operations
- $P_n$: polynomials of degree at most $n$
- $M_{m \times n}(\mathbb{F})$: $m \times n$ matrices over $\mathbb{F}$
- $C[a, b]$: continuous real-valued functions on $[a, b]$

3. Subspaces

A subset $W$ of vector space $V$ is a subspace if:

- $\mathbf{0} \in W$
- $W$ is closed under addition: $\mathbf{u} + \mathbf{v} \in W$ for all $\mathbf{u}, \mathbf{v} \in W$
- $W$ is closed under scalar multiplication: $c\mathbf{v} \in W$ for all $c \in \mathbb{F}$, $\mathbf{v} \in W$

4. Linear Independence

Vectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n$ are linearly independent if:

$$c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_n\mathbf{v}_n = \mathbf{0} \implies c_1 = c_2 = \cdots = c_n = 0$$

Otherwise, they are linearly dependent.

5. Spanning Sets

The span of vectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n$ is:

$$\text{span}\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n\} = \{c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_n\mathbf{v}_n : c_i \in \mathbb{F}\}$$
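
A quick numerical sanity check: $\mathbf{b}$ lies in the span iff appending it as a column does not raise the rank. This is a sketch using numpy; the helper name `in_span` is mine.

```python
import numpy as np

def in_span(vectors, b):
    """Check whether b lies in span(vectors) via a rank comparison."""
    A = np.array(vectors, dtype=float).T      # columns are the spanning vectors
    augmented = np.column_stack([A, b])       # append b as an extra column
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(augmented)

v1, v2 = np.array([1, 0, 0]), np.array([0, 1, 0])
print(in_span([v1, v2], np.array([3, -2, 0])))  # True: b = 3*v1 - 2*v2
print(in_span([v1, v2], np.array([0, 0, 1])))   # False: third component unreachable
```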

6. Basis

A basis for vector space $V$ is a linearly independent spanning set.

Properties:

- Every vector in $V$ has a unique representation as a linear combination of basis vectors
- All bases of a finite-dimensional space contain the same number of vectors

7. Dimension

The dimension of a vector space $V$, denoted $\dim(V)$, is the number of vectors in any basis; this is well defined because all bases have the same size. For example, $\dim(\mathbb{R}^n) = n$.

8. Coordinates

For basis $B = \{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n\}$, the coordinate vector of $\mathbf{v}$ is:

$$[\mathbf{v}]_B = \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{bmatrix}$$

where $\mathbf{v} = c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_n\mathbf{v}_n$.
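
Computing coordinates amounts to solving a linear system: with the basis vectors as the columns of a matrix $B$, solve $B\mathbf{c} = \mathbf{v}$. A minimal numpy sketch (the basis and vector here are my own toy example):

```python
import numpy as np

# Basis B = {v1, v2} as the columns of a matrix; coordinates solve B_mat @ c = v
v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1.0])
B_mat = np.column_stack([v1, v2])

v = np.array([3.0, 5.0])
coords = np.linalg.solve(B_mat, v)  # c1*v1 + c2*v2 = v
print(coords)                       # [-2.  5.], since v = -2*v1 + 5*v2
```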

9. Linear Transformations

A function $T: V \to W$ is linear if:

- $T(\mathbf{u} + \mathbf{v}) = T(\mathbf{u}) + T(\mathbf{v})$ for all $\mathbf{u}, \mathbf{v} \in V$
- $T(c\mathbf{v}) = cT(\mathbf{v})$ for all $c \in \mathbb{F}$, $\mathbf{v} \in V$

For finite-dimensional spaces, $T$ can be represented by a matrix.
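
The matrix is built column by column: the $j$-th column is $T(\mathbf{e}_j)$ for the $j$-th basis vector. A sketch with the standard basis of $\mathbb{R}^n$ (the helper `matrix_of` and the sample map are mine):

```python
import numpy as np

def matrix_of(T, n):
    """Build the matrix of a linear map T on R^n: column j is T(e_j)."""
    eye = np.eye(n)
    return np.column_stack([T(eye[:, j]) for j in range(n)])

T = lambda v: np.array([v[0] + v[1], 2 * v[1]])  # a sample linear map on R^2
A = matrix_of(T, 2)

x = np.array([3.0, 4.0])
print(np.allclose(A @ x, T(x)))  # True: the matrix reproduces T
```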

10. Kernel and Image

For linear transformation $T: V \to W$:

- Kernel: $\ker(T) = \{\mathbf{v} \in V : T(\mathbf{v}) = \mathbf{0}\}$, a subspace of $V$
- Image: $\text{im}(T) = \{T(\mathbf{v}) : \mathbf{v} \in V\}$, a subspace of $W$

Rank-Nullity Theorem:

$$\dim(\ker(T)) + \dim(\text{im}(T)) = \dim(V)$$
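
For a matrix $A$ representing $T$, $\dim(\text{im})$ is the rank and $\dim(\ker)$ is the nullity, so the theorem can be checked numerically. A sketch using scipy's `null_space` (the example matrix is mine; its third row is the sum of the first two, so its rank is 2):

```python
import numpy as np
from scipy.linalg import null_space

# A 3x4 matrix: domain dimension is 4, third row = row1 + row2, so rank 2
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])

rank = np.linalg.matrix_rank(A)       # dim im(T)
nullity = null_space(A).shape[1]      # dim ker(T), via an orthonormal null-space basis
print(rank, nullity, rank + nullity)  # 2 2 4: matches dim of the domain
```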

11. Inner Product Spaces

An inner product on $V$ is a function $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{F}$ satisfying:

- Conjugate symmetry: $\langle \mathbf{u}, \mathbf{v} \rangle = \overline{\langle \mathbf{v}, \mathbf{u} \rangle}$ (plain symmetry when $\mathbb{F} = \mathbb{R}$)
- Linearity in the first argument: $\langle a\mathbf{u} + b\mathbf{v}, \mathbf{w} \rangle = a\langle \mathbf{u}, \mathbf{w} \rangle + b\langle \mathbf{v}, \mathbf{w} \rangle$
- Positive-definiteness: $\langle \mathbf{v}, \mathbf{v} \rangle > 0$ for all $\mathbf{v} \neq \mathbf{0}$

12. Orthogonality

Vectors $\mathbf{u}$ and $\mathbf{v}$ are orthogonal if $\langle \mathbf{u}, \mathbf{v} \rangle = 0$.

Orthogonal Projection:

$$\text{proj}_{\mathbf{v}}(\mathbf{u}) = \frac{\langle \mathbf{u}, \mathbf{v} \rangle}{\langle \mathbf{v}, \mathbf{v} \rangle}\mathbf{v}$$
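
The defining property is that the residual $\mathbf{u} - \text{proj}_{\mathbf{v}}(\mathbf{u})$ is orthogonal to $\mathbf{v}$. A direct translation of the formula into numpy (the helper `proj` and the example vectors are mine):

```python
import numpy as np

def proj(u, v):
    """Orthogonal projection of u onto v under the standard dot product."""
    return (np.dot(u, v) / np.dot(v, v)) * v

u = np.array([2.0, 3.0])
v = np.array([1.0, 0.0])
p = proj(u, v)
print(p)                 # [2. 0.]
print(np.dot(u - p, v))  # 0.0: the residual is orthogonal to v
```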

13. Gram-Schmidt Process

To orthogonalize vectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n$:

$$\mathbf{u}_1 = \mathbf{v}_1$$

$$\mathbf{u}_2 = \mathbf{v}_2 - \text{proj}_{\mathbf{u}_1}(\mathbf{v}_2)$$

$$\mathbf{u}_3 = \mathbf{v}_3 - \text{proj}_{\mathbf{u}_1}(\mathbf{v}_3) - \text{proj}_{\mathbf{u}_2}(\mathbf{v}_3)$$

$$\mathbf{u}_k = \mathbf{v}_k - \sum_{j=1}^{k-1} \text{proj}_{\mathbf{u}_j}(\mathbf{v}_k)$$

14. Code Example

```python
# Python code for vector spaces
import numpy as np
from scipy import linalg

# Check linear independence: vectors are independent iff the matrix whose
# columns are the vectors has full column rank
def is_linearly_independent(vectors):
    matrix = np.array(vectors, dtype=float).T
    return np.linalg.matrix_rank(matrix) == len(vectors)

# Find a basis for the span: column-pivoted QR identifies which input
# vectors are independent (just taking the first `rank` columns is wrong
# when an early column depends on the others)
def find_basis(vectors):
    matrix = np.array(vectors, dtype=float).T
    rank = np.linalg.matrix_rank(matrix)
    _, _, pivots = linalg.qr(matrix, pivoting=True)
    return matrix[:, sorted(pivots[:rank])]

# Gram-Schmidt orthonormalization (modified variant: project the running
# remainder w rather than v, which is more numerically stable)
def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float).copy()  # float copy: in-place ops fail on int arrays
        for u in basis:
            w -= np.dot(w, u) * u  # u is unit length, so no division needed
        if np.linalg.norm(w) > 1e-10:  # drop vectors dependent on earlier ones
            basis.append(w / np.linalg.norm(w))
    return basis

# Example
v1 = np.array([1, 1, 0])
v2 = np.array([1, 0, 1])
v3 = np.array([0, 1, 1])

vectors = [v1, v2, v3]
print(f"Linearly independent: {is_linearly_independent(vectors)}")

orthonormal_basis = gram_schmidt(vectors)  # normalized, so orthonormal
print("Orthonormal basis:")
for i, v in enumerate(orthonormal_basis):
    print(f"u{i+1} = {v}")
```
