A visual introduction to differential forms and calculus on manifolds

Chapter 1 Background material

1.1 Review of vector spaces

vector spaces

$u,v,w\in V$, $c,d \in \mathbb R$

  1. $v+w=w+v$
  2. $(u+v)+w=u+(v+w)$
  3. $v+0=v$
  4. $v+(-v)=0$
  5. $1\cdot v=v$
  6. $c\cdot (d\cdot v)=(c\cdot d)\cdot v$
  7. $c\cdot(v+w)=c\cdot v+c\cdot w$
  8. $(c+d)\cdot v=c\cdot v+d\cdot v$
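The eight axioms can be spot-checked numerically; the sketch below uses NumPy arrays as vectors in $\mathbb{R}^3$, with arbitrary sample values (a sanity check on examples, not a proof).

```python
import numpy as np

# Hypothetical sample vectors and scalars in R^3 to spot-check the axioms.
u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 4.0])
w = np.array([2.0, -3.0, 1.0])
c, d = 2.0, -0.5

# Axioms 1-8, checked numerically (allclose guards float rounding).
checks = [
    np.allclose(v + w, w + v),                # 1. commutativity
    np.allclose((u + v) + w, u + (v + w)),    # 2. associativity
    np.allclose(v + 0, v),                    # 3. additive identity
    np.allclose(v + (-v), 0),                 # 4. additive inverse
    np.allclose(1 * v, v),                    # 5. scalar identity
    np.allclose(c * (d * v), (c * d) * v),    # 6. scalar associativity
    np.allclose(c * (v + w), c * v + c * w),  # 7. distributivity over vector sums
    np.allclose((c + d) * v, c * v + d * v),  # 8. distributivity over scalar sums
]
all_hold = all(checks)
```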

Linear independence
vectors $v_1,v_2,\dots,v_n$ are linearly independent if $a_1v_1+a_2v_2+\cdots+a_nv_n=0$ for scalars $a_1,a_2,\dots,a_n$ implies $a_1=a_2=\cdots=a_n=0$
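In $\mathbb{R}^m$ this condition can be tested by putting the vectors as columns of a matrix: they are independent exactly when the column rank equals the number of vectors. A sketch with NumPy (the sample vectors are arbitrary):

```python
import numpy as np

def is_linearly_independent(M):
    # Columns of M are independent iff M a = 0 forces a = 0,
    # i.e. the rank equals the number of columns.
    return np.linalg.matrix_rank(M) == M.shape[1]

independent = np.array([[1.0, 0.0],
                        [0.0, 1.0],
                        [1.0, 1.0]])   # two independent vectors in R^3
dependent = np.array([[1.0, 2.0],
                      [2.0, 4.0],
                      [3.0, 6.0]])     # second column = 2 * first column

r1 = is_linearly_independent(independent)
r2 = is_linearly_independent(dependent)
```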

Linear transformation/Linear map/Linear Operator
Transformation between vector spaces $T:\mathbb{R}^n\to\mathbb{R}^m$

  1. $T(v+w)=T(v)+T(w)$
  2. $T(c\cdot v)=c\cdot T(v)$
    $v,w\in \mathbb{R}^n,c\in\mathbb{R}$
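Any matrix $A$ gives a map $T(v)=Av$, and the two conditions above can be verified on sample inputs; a minimal numerical sketch (the specific matrix and vectors are arbitrary):

```python
import numpy as np

# A defines T: R^2 -> R^3 via T(v) = A @ v.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
v = np.array([1.0, -2.0])
w = np.array([0.5, 3.0])
c = -1.5

additive = np.allclose(A @ (v + w), A @ v + A @ w)      # condition 1
homogeneous = np.allclose(A @ (c * v), c * (A @ v))     # condition 2
```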

Matrix
a rectangular array of numbers

Matrix multiplication

$$ c_{ij}= \sum_{k=1}^m a_{ik}b_{kj} $$
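The index formula translates directly into three nested loops; the sketch below implements it and checks the result against NumPy's built-in product:

```python
import numpy as np

def matmul(A, B):
    """c_ij = sum_k a_ik * b_kj, written out with explicit loops."""
    n, m = A.shape      # A is n x m
    m2, p = B.shape     # B is m x p
    assert m == m2, "inner dimensions must agree"
    C = np.zeros((n, p))
    for i in range(n):
        for j in range(p):
            for k in range(m):
                C[i, j] += A[i, k] * B[k, j]
    return C

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
C = matmul(A, B)
matches_numpy = np.allclose(C, A @ B)
```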

Matrix representation of a linear transformation
Suppose $T:\mathbb{R}^n\to\mathbb{R}^m$ is linear, let $e_1,e_2,\dots,e_n$ be the standard basis of $\mathbb{R}^n$, and let \(\tilde{e}_1,\tilde{e}_2,\dots,\tilde{e}_m\) be the standard basis of $\mathbb{R}^m$. Then for $1\le j\le n$, there are unique numbers $a_{ij}$ such that:

$$ T(e_j)=\sum_{i=1}^m a_{ij}\tilde{e}_i $$

Then the matrix representation of $T$ is given by the $m\times n$ matrix with entries $a_{ij}$
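Concretely, column $j$ of the matrix is just $T(e_j)$. The sketch below builds the matrix this way for a hypothetical linear map $T:\mathbb{R}^2\to\mathbb{R}^3$ and checks it reproduces $T$:

```python
import numpy as np

def matrix_of(T, n, m):
    """Column j of the representing matrix is T applied to e_j."""
    A = np.zeros((m, n))
    for j in range(n):
        e_j = np.zeros(n)
        e_j[j] = 1.0
        A[:, j] = T(e_j)
    return A

# Hypothetical linear map T: R^2 -> R^3.
def T(v):
    x, y = v
    return np.array([x + y, 2.0 * x, 3.0 * y])

A = matrix_of(T, n=2, m=3)
v = np.array([1.0, 2.0])
agrees = np.allclose(A @ v, T(v))
```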

Linear functional
a special case of a linear transformation, where the codomain is $\mathbb R$; the set of all linear functionals on $V$ is called the dual space of $V$, denoted $V^*$. $V^*$ is itself a vector space, which can be verified from first principles, e.g.

$S,T\in V^*$
$(S+T)(v)=S(v)+T(v)=(T+S)(v)$

Bases in dual space
if $T_i(e_j)=\delta_{ij}$, where $T_i\in V^*$ and $e_j\in V$, then $T_i$ is said to be dual to the vector $e_i$, and is usually denoted by $e^i$
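In coordinates, a covector on $\mathbb{R}^n$ can be represented as a row vector, and the dual basis $e^i$ is then the $i$-th standard row vector, so that $e^i(v)$ just picks out the $i$-th coordinate of $v$. A minimal sketch of the defining relation $e^i(e_j)=\delta_{ij}$:

```python
import numpy as np

n = 3
E = np.eye(n)        # columns are the basis vectors e_j
E_dual = np.eye(n)   # rows are the dual basis covectors e^i

# Defining relation: e^i(e_j) = delta_ij, i.e. E_dual @ E = I.
delta_ok = np.allclose(E_dual @ E, np.eye(n))

# Applying a dual basis covector extracts one coordinate.
v = np.array([7.0, -2.0, 5.0])
coord = E_dual[1] @ v   # e^2(v) = second coordinate of v
```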

visual form of dual space vectors

1.2 Volume and determinants

$$ \text{determinant}\Leftrightarrow \text{signed volume} $$

$D:\mathbb{R}^{n\times n}\to \mathbb{R}$

$M=[v_1,v_2,\dots,v_n]$, we denote the determinant by $D(M)$ or $D(v_1,v_2,\cdots,v_n)$

property 1 : $D(I)=1\quad I=[e_1,\dots,e_n]$
property 2 : $D(v_1,v_2,\dots,v_n)=0$ if $v_i=v_j$ for any $i\ne j$
property 3 : for any $j=1,\dots,n$, $c\in \mathbb{R}$

$$ \begin{aligned} D(v_1,\dots,v_{j-1},v+cw,v_{j+1},\dots,v_n)=&D(v_1,\dots,v_{j-1},v,v_{j+1}\dots,v_n)\\ &+cD(v_1,\dots,v_{j-1},w,v_{j+1},\dots,v_n) \end{aligned} $$
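Properties 1–3 can be spot-checked numerically using `np.linalg.det` on a matrix assembled column by column (the sample vectors below are arbitrary):

```python
import numpy as np

def D(*cols):
    # Determinant of the matrix whose columns are the given vectors.
    return np.linalg.det(np.column_stack(cols))

n = 3
I_cols = [np.eye(n)[:, j] for j in range(n)]  # e_1, e_2, e_3

# Property 1: D(I) = 1 (the unit cube has signed volume 1).
prop_1 = np.isclose(D(*I_cols), 1.0)

# Property 2: a repeated column gives determinant 0.
v = np.array([1.0, 2.0, 3.0])
w = np.array([0.0, 1.0, -1.0])
prop_2 = np.isclose(D(v, v, w), 0.0)

# Property 3: linearity in a single column (here the first slot).
c = 2.5
lhs = D(v + c * w, I_cols[1], I_cols[2])
rhs = D(v, I_cols[1], I_cols[2]) + c * D(w, I_cols[1], I_cols[2])
prop_3 = np.isclose(lhs, rhs)
```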

property A : $D$ is alternating

$$D(v_1,\dots,v_i,\dots,v_j,\dots,v_n)=-D(v_1,\dots,v_j,\dots,v_i,\dots,v_n)$$

proof skipped

property B : If the vectors are linearly dependent, then $D(v_1,v_2,\dots,v_n)=0$

proof skipped

property C : Adding a multiple of one vector to another does not change the determinant

proof skipped
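Though the proofs are skipped, properties A–C are easy to check numerically on random vectors (a spot check with NumPy, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
v1, v2, v3 = rng.standard_normal((3, 3))  # three random vectors in R^3

def D(*cols):
    # Determinant of the matrix with the given vectors as columns.
    return np.linalg.det(np.column_stack(cols))

# Property A: swapping two columns flips the sign.
prop_A = np.isclose(D(v1, v2, v3), -D(v2, v1, v3))

# Property B: linearly dependent columns give determinant 0.
prop_B = np.isclose(D(v1, v2, 2.0 * v1 + 3.0 * v2), 0.0)

# Property C: adding a multiple of one column to another changes nothing.
prop_C = np.isclose(D(v1, v2, v3), D(v1 + 5.0 * v2, v2, v3))
```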

permutation
$\sigma\in S_n$, where $\sigma:\{1,\dots,n\}\to \{1,\dots,n\}$ is a bijection; there are $n!$ elements in $S_n$

transposition: a permutation in which only two elements are exchanged, denoted $\tau_{ij}$

parity : evenness or oddness of the number of transpositions needed to transform a permutation into the identity

sign of permutation : for permutation with even parity, $\operatorname{sgn}(\sigma)=1$, for permutation with odd parity $\operatorname{sgn}(\sigma)=-1$
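One way to compute the sign without listing transpositions is to count inversions (pairs appearing out of order), since each transposition of adjacent elements changes the inversion count by one and hence flips the parity. A sketch, with permutations written 0-based as tuples of images:

```python
def sgn(sigma):
    """Sign of a permutation given as a tuple of 0-based images.

    Parity of the inversion count equals the parity of the number of
    transpositions needed to reach the identity.
    """
    inversions = sum(
        1
        for i in range(len(sigma))
        for j in range(i + 1, len(sigma))
        if sigma[i] > sigma[j]
    )
    return 1 if inversions % 2 == 0 else -1

s_id = sgn((0, 1, 2))     # identity: even parity
s_swap = sgn((1, 0, 2))   # one transposition: odd parity
s_cycle = sgn((1, 2, 0))  # 3-cycle = two transpositions: even parity
```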

property D : $E_{\sigma}=[e_{\sigma(1)},e_{\sigma(2)},\dots,e_{\sigma(n)}]$, $D(E_{\sigma})=\operatorname{sgn}(\sigma)$

proof: by property A, each transposition of two columns multiplies the determinant by $(-1)$; thus $D(E_{\sigma})=\operatorname{sgn}(\sigma)D(I)=\operatorname{sgn}(\sigma)$

$$ D\left(\begin{bmatrix} a_{11} & a_{12} &\cdots& a_{1n}\\ a_{21} & a_{22} &\cdots& a_{2n}\\ \vdots & \vdots &\ddots& \vdots\\ a_{n1} & a_{n2} &\cdots& a_{nn} \end{bmatrix} \right)=\sum_{\sigma\in S_n}\operatorname{sgn}(\sigma)\prod_{i=1}^n a_{\sigma(i)i} $$
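This Leibniz sum can be implemented directly by iterating over all $n!$ permutations; the sketch below (using the inversion-count sign from above) compares it against `np.linalg.det` on a small matrix. It is $O(n!\cdot n)$, so it is a pedagogical check, not a practical algorithm.

```python
import numpy as np
from itertools import permutations

def sgn(sigma):
    # Sign via inversion count (0-based permutation as a tuple).
    inv = sum(1 for i in range(len(sigma))
              for j in range(i + 1, len(sigma))
              if sigma[i] > sigma[j])
    return 1 if inv % 2 == 0 else -1

def det_leibniz(A):
    """Leibniz formula: sum over sigma of sgn(sigma) * prod_i a_{sigma(i), i}."""
    n = A.shape[0]
    total = 0.0
    for sigma in permutations(range(n)):
        prod = 1.0
        for i in range(n):
            prod *= A[sigma[i], i]
        total += sgn(sigma) * prod
    return total

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
d = det_leibniz(A)
agrees = np.isclose(d, np.linalg.det(A))
```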