# Calculus II/Vector operations

## Vector and its opposite vector

If there is a vector ${\displaystyle {\vec {B}}}$ , then its opposite vector is ${\displaystyle -{\vec {B}}}$ . The opposite vector has the same magnitude as the original vector but points in the opposite direction.

## Sum and difference of 2 vectors

The sum of two vectors is commutative, and the difference of two vectors is the sum of the first vector and the opposite of the second:

${\displaystyle {\vec {A}}+{\vec {B}}={\vec {B}}+{\vec {A}}}$
${\displaystyle {\vec {A}}-{\vec {B}}={\vec {A}}+(-{\vec {B}})}$
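A minimal sketch (not from the original article) of these component-wise operations, representing vectors as Python tuples:

```python
def vec_add(a, b):
    """Return a + b, component by component."""
    return tuple(x + y for x, y in zip(a, b))

def vec_sub(a, b):
    """Return a - b, i.e. a + (-b)."""
    return tuple(x - y for x, y in zip(a, b))

a = (1.0, 2.0, 3.0)
b = (4.0, -1.0, 0.5)
print(vec_add(a, b))  # (5.0, 1.0, 3.5)
print(vec_sub(a, b))  # (-3.0, 3.0, 2.5)
```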

## Dot product of two vectors

### Definition

The scalar product or inner product or dot product of two vectors is defined as

${\displaystyle \mathbf {a} \cdot \mathbf {b} =|\mathbf {a} ||\mathbf {b} |\cos(\theta )}$

where ${\displaystyle \theta \,}$  is the angle between the two vectors (see Figure 2(b)).

If ${\displaystyle \mathbf {a} \,}$  and ${\displaystyle \mathbf {b} \,}$  are perpendicular to each other, ${\displaystyle \theta =\pi /2\,}$  and ${\displaystyle \cos(\theta )=0\,}$ . Therefore, ${\displaystyle {\mathbf {a} }\cdot {\mathbf {b} }=0}$ .

The dot product therefore has a geometric interpretation: ${\displaystyle {\mathbf {a} }\cdot {\hat {\mathbf {b} }}=|\mathbf {a} |\cos(\theta )}$  is the (signed) length of the projection of ${\displaystyle \mathbf {a} \,}$  onto the unit vector ${\displaystyle {\hat {\mathbf {b} }}\,}$  when the two vectors are placed so that they start from the same point.
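A minimal sketch (not from the original article) of the projection interpretation, using tuples for vectors. The function name `projection_length` is illustrative, not a standard API:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

def projection_length(a, b):
    """Signed length of the projection of a onto the unit vector b/|b|."""
    return dot(a, b) / norm(b)

# Projecting (3, 4, 0) onto the x-axis recovers its x-component.
print(projection_length((3.0, 4.0, 0.0), (1.0, 0.0, 0.0)))  # 3.0
```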

The scalar product leads to a scalar quantity and can also be written in component form (with respect to a given basis) as

${\displaystyle {\mathbf {a} }\cdot {\mathbf {b} }=a_{1}b_{1}+a_{2}b_{2}+a_{3}b_{3}=\sum _{i=1}^{3}a_{i}b_{i}~.}$

If the vectors are ${\displaystyle n}$ -dimensional, the dot product is written as

${\displaystyle {\mathbf {a} }\cdot {\mathbf {b} }=\sum _{i=1}^{n}a_{i}b_{i}~.}$

Using the Einstein summation convention, we can also write the scalar product as

${\displaystyle {\mathbf {a} }\cdot {\mathbf {b} }=a_{i}b_{i}~.}$
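A minimal sketch (not from the original article) checking that the component form agrees with the ${\displaystyle |\mathbf {a} ||\mathbf {b} |\cos(\theta )}$  definition for a pair of vectors whose angle is known to be ${\displaystyle \pi /4}$ :

```python
import math

def dot(a, b):
    """Component form: sum over a_i * b_i."""
    return sum(x * y for x, y in zip(a, b))

def norm(v):
    return math.sqrt(dot(v, v))

a = (1.0, 0.0, 0.0)
b = (1.0, 1.0, 0.0)   # makes a 45-degree angle with a
lhs = dot(a, b)                                   # component definition
rhs = norm(a) * norm(b) * math.cos(math.pi / 4)   # |a||b|cos(theta)
print(abs(lhs - rhs) < 1e-12)  # True
```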

### Dot product identities

1. ${\displaystyle {\mathbf {a} }\cdot {\mathbf {b} }={\mathbf {b} }\cdot {\mathbf {a} }}$  (commutative law).
2. ${\displaystyle {\mathbf {a} }\cdot {(\mathbf {b} +\mathbf {c} )}={\mathbf {a} }\cdot {\mathbf {b} }+{\mathbf {a} }\cdot {\mathbf {c} }}$  (distributive law).
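Both identities can be spot-checked numerically; a minimal sketch (not from the original article) with arbitrarily chosen vectors:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

a = (1.0, 2.0, 3.0)
b = (-2.0, 0.5, 4.0)
c = (0.0, 1.0, -1.0)

assert dot(a, b) == dot(b, a)                      # commutative law
assert dot(a, add(b, c)) == dot(a, b) + dot(a, c)  # distributive law
print("both identities hold for these vectors")
```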

## Cross product of two vectors

### Definition

The vector product (or cross product) of two vectors ${\displaystyle \mathbf {a} \,}$  and ${\displaystyle \mathbf {b} \,}$  is another vector ${\displaystyle \mathbf {c} \,}$  defined as

${\displaystyle \mathbf {c} ={\mathbf {a} }\times {\mathbf {b} }=|\mathbf {a} ||\mathbf {b} |\sin(\theta ){\hat {\mathbf {c} }}}$

where ${\displaystyle \theta \,}$  is the angle between ${\displaystyle \mathbf {a} \,}$  and ${\displaystyle \mathbf {b} \,}$ , and ${\displaystyle {\hat {\mathbf {c} }}\,}$  is a unit vector perpendicular to the plane containing ${\displaystyle \mathbf {a} \,}$  and ${\displaystyle \mathbf {b} \,}$  in the right-handed sense (see Figure 3 for a geometric interpretation).

In terms of the orthonormal basis ${\displaystyle (\mathbf {e} _{1},\mathbf {e} _{2},\mathbf {e} _{3})\,}$ , the cross product can be written in the form of a determinant

${\displaystyle {\mathbf {a} }\times {\mathbf {b} }={\begin{vmatrix}\mathbf {e} _{1}&\mathbf {e} _{2}&\mathbf {e} _{3}\\a_{1}&a_{2}&a_{3}\\b_{1}&b_{2}&b_{3}\end{vmatrix}}~.}$

In index notation, the cross product can be written as

${\displaystyle {\mathbf {a} }\times {\mathbf {b} }\equiv e_{ijk}a_{j}b_{k}~.}$

where ${\displaystyle e_{ijk}}$  is the Levi-Civita symbol (also called the permutation symbol or alternating tensor). This latter expression is easy to remember if you recognize that the even permutations xyz, yzx, zxy are "positive" and the odd permutations xzy, yxz, zyx are negative.

If ${\displaystyle \mathbf {c} ={\mathbf {a} }\times {\mathbf {b} }}$ , then

${\displaystyle c_{x}=a_{y}b_{z}-a_{z}b_{y}}$
${\displaystyle c_{y}=a_{z}b_{x}-a_{x}b_{z}}$
${\displaystyle c_{z}=a_{x}b_{y}-a_{y}b_{x}}$
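A minimal sketch (not from the original article) of these component formulas, checked against the standard basis relation ${\displaystyle \mathbf {e} _{1}\times \mathbf {e} _{2}=\mathbf {e} _{3}}$ :

```python
def cross(a, b):
    """Cross product via the component formulas above."""
    ax, ay, az = a
    bx, by, bz = b
    return (ay * bz - az * by,
            az * bx - ax * bz,
            ax * by - ay * bx)

print(cross((1.0, 0.0, 0.0), (0.0, 1.0, 0.0)))  # (0.0, 0.0, 1.0), i.e. e1 x e2 = e3
```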

### Cross product identities

1. ${\displaystyle {\mathbf {a} }\times {\mathbf {b} }=-{\mathbf {b} }\times {\mathbf {a} }}$
2. ${\displaystyle {\mathbf {a} }\times {(\mathbf {b} +\mathbf {c} )}={\mathbf {a} }\times {\mathbf {b} }+{\mathbf {a} }\times {\mathbf {c} }}$
3. ${\displaystyle {\mathbf {a} }\times {({\mathbf {b} }\times {\mathbf {c} })}=\mathbf {b} ({\mathbf {a} }\cdot {\mathbf {c} })-\mathbf {c} ({\mathbf {a} }\cdot {\mathbf {b} })}$
4. ${\displaystyle {({\mathbf {a} }\times {\mathbf {b} })}\times {\mathbf {c} }=\mathbf {b} ({\mathbf {a} }\cdot {\mathbf {c} })-\mathbf {a} ({\mathbf {b} }\cdot {\mathbf {c} })~}$
5. ${\displaystyle {\mathbf {a} }\times {\mathbf {a} }=\mathbf {0} ~}$
6. ${\displaystyle {\mathbf {a} }\cdot {({\mathbf {a} }\times {\mathbf {b} })}={\mathbf {b} }\cdot {({\mathbf {a} }\times {\mathbf {b} })}=0~}$
7. ${\displaystyle {({\mathbf {a} }\times {\mathbf {b} })}\cdot {\mathbf {c} }={\mathbf {a} }\cdot {({\mathbf {b} }\times {\mathbf {c} })}~}$
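Several of these identities can be spot-checked numerically; a minimal sketch (not from the original article) verifying identities 1, 5, 6, and 7 for arbitrarily chosen vectors:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    ax, ay, az = a
    bx, by, bz = b
    return (ay * bz - az * by, az * bx - ax * bz, ax * by - ay * bx)

def neg(a):
    return tuple(-x for x in a)

a = (1.0, 2.0, 3.0)
b = (4.0, 5.0, 6.0)
c = (-1.0, 0.0, 2.0)

assert cross(a, b) == neg(cross(b, a))             # identity 1: anticommutativity
assert cross(a, a) == (0.0, 0.0, 0.0)              # identity 5
assert dot(a, cross(a, b)) == 0.0                  # identity 6: c is perpendicular to a
assert dot(cross(a, b), c) == dot(a, cross(b, c))  # identity 7: scalar triple product
print("identities 1, 5, 6, 7 hold for these vectors")
```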

The rest of this resource has been moved to Vector calculus.