Math S21a: Multivariable calculus                    Oliver Knill, Summer 2011

Lecture 2: Vectors and Dot Product

Two points P = (a, b, c) and Q = (x, y, z) in space define a vector v = ⟨x − a, y − b, z − c⟩. It points from P to Q and we also write v = PQ. The real numbers p, q, r in a vector v = ⟨p, q, r⟩ are called the components of v.

Vectors can be drawn anywhere in space, but two vectors with the same components are considered equal: vectors can be translated into each other if and only if their components are the same. If a vector starts at the origin O = (0, 0, 0), then the vector v = ⟨p, q, r⟩ points to the point (p, q, r). One can therefore identify points P = (a, b, c) with vectors v = ⟨a, b, c⟩ attached to the origin. To make clear which objects are vectors, we sometimes draw an arrow on top, and if v = PQ, then P is the "tail" and Q is the "head" of the vector. To distinguish vectors from points, it is customary to use different brackets and write ⟨2, 3, 4⟩ for vectors and (2, 3, 4) for points.

The sum of two vectors is u + v = ⟨u1, u2⟩ + ⟨v1, v2⟩ = ⟨u1 + v1, u2 + v2⟩. The scalar multiple is λu = λ⟨u1, u2⟩ = ⟨λu1, λu2⟩. The difference u − v is best seen as the addition of u and (−1)·v.

The vectors i = ⟨1, 0⟩ and j = ⟨0, 1⟩ are called the standard basis vectors in the plane. In space, one has the basis vectors i = ⟨1, 0, 0⟩, j = ⟨0, 1, 0⟩, k = ⟨0, 0, 1⟩.

Every vector v = ⟨p, q⟩ in the plane can be written as a combination v = p·i + q·j of standard basis vectors, and every vector v = ⟨p, q, r⟩ in space can be written as v = p·i + q·j + r·k.

Vectors are abundant in applications. They appear in mechanics: if r(t) = ⟨f(t), g(t)⟩ is a point in the plane which depends on time t, then v = ⟨f′(t), g′(t)⟩ is called the velocity vector at r(t). Here f′(t), g′(t) are the derivatives. In physics, we often want to determine forces acting on objects. Forces are represented as vectors.
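The componentwise operations above are easy to make concrete. Here is a minimal sketch in Python, representing vectors as plain tuples; the helper names (add, scale, from_points) are our own, chosen for illustration:

```python
def add(u, v):
    """Componentwise sum u + v."""
    return tuple(a + b for a, b in zip(u, v))

def scale(lam, v):
    """Scalar multiple lam * v."""
    return tuple(lam * a for a in v)

def from_points(P, Q):
    """The vector PQ = <x - a, y - b, z - c> pointing from P to Q."""
    return tuple(q - p for p, q in zip(P, Q))

# Standard basis vectors in space:
i, j, k = (1, 0, 0), (0, 1, 0), (0, 0, 1)

# Every vector <p, q, r> is a combination p*i + q*j + r*k:
v = add(add(scale(2, i), scale(3, j)), scale(4, k))
print(v)                                  # (2, 3, 4)
print(from_points((1, 1, 1), (3, 4, 5)))  # (2, 3, 4)
```

The difference u − v is then simply add(u, scale(-1, v)), matching the definition in the text.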
In particular, electromagnetic or gravitational fields, or velocity fields in fluids, are described by vectors. Vectors also appear in computer science: scalable vector graphics is a web standard for describing two-dimensional graphics. In quantum computation, rather than working with bits, one deals with qubits, which are vectors. Finally, a color can be written as a vector v = ⟨r, g, b⟩, where r is the red, g the green and b the blue component of the color vector. Another coordinate system for color is v = ⟨c, m, y⟩ = ⟨1 − r, 1 − g, 1 − b⟩, where c is cyan, m is magenta and y is yellow. Vectors appear in probability theory and statistics: on a finite probability space, a random variable is a vector.

The addition and scalar multiplication of vectors satisfy the laws you know from arithmetic: commutativity u + v = v + u, associativity u + (v + w) = (u + v) + w and r∗(s∗v) = (r∗s)∗v, as well as distributivity (r + s)v = rv + sv and r(v + w) = rv + rw, where ∗ denotes multiplication with a scalar.

The length |v| of a vector v = PQ is defined as the distance d(P, Q) from P to Q. A vector of length 1 is called a unit vector. If v ≠ 0, then v/|v| is a unit vector.

1   |⟨3, 4⟩| = 5 and |⟨3, 4, 12⟩| = 13. Examples of unit vectors are i, j, k with |i| = |j| = |k| = 1, as well as ⟨3/5, 4/5⟩ and ⟨3/13, 4/13, 12/13⟩. The only vector of length 0 is the zero vector, |0| = 0.

The dot product of two vectors v = ⟨a, b, c⟩ and w = ⟨p, q, r⟩ is defined as v · w = ap + bq + cr.

Remarks.
a) Different notations for the dot product are used in different mathematical fields. While pure mathematicians write v · w = (v, w), one can see ⟨v|w⟩ in quantum mechanics, or vi wi, or more generally gij vi wj, in general relativity. The dot product is also called the scalar product or inner product.
b) Any product g(v, w) which is linear in v and w, satisfies the symmetry g(v, w) = g(w, v), and has g(v, v) ≥ 0 with g(v, v) = 0 if and only if v = 0 can be used as a dot product.
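The length, unit-vector and dot-product definitions above can be checked numerically. A small sketch (the function names are ours, not from the notes):

```python
import math

def dot(v, w):
    """Dot product: sum of componentwise products."""
    return sum(a * b for a, b in zip(v, w))

def length(v):
    """|v| = sqrt(v . v)."""
    return math.sqrt(dot(v, v))

def unit(v):
    """v/|v|, a unit vector whenever v is nonzero."""
    l = length(v)
    return tuple(a / l for a in v)

print(length((3, 4)))              # 5.0
print(length((3, 4, 12)))          # 13.0
print(unit((3, 4)))                # (0.6, 0.8)
print(dot((1, 2, 3), (4, 5, 6)))   # 1*4 + 2*5 + 3*6 = 32

# The color-complement coordinates from the text:
r, g, b = 1.0, 0.5, 0.0
c, m, y = 1 - r, 1 - g, 1 - b
print((c, m, y))                   # (0.0, 0.5, 1.0)
```

Note that unit(⟨3, 4⟩) = ⟨3/5, 4/5⟩ recovers exactly the example unit vector given in the text.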
An example is g(v, w) = 3 v1 w1 + 2 v2 w2 + v3 w3.

The dot product determines distance, and distance determines the dot product.

Proof: Let us write v for the vector in this proof. Using the dot product one can express the length of v as |v| = √(v · v). On the other hand, (v + w)·(v + w) = v·v + w·w + 2(v · w) can be solved for v · w:

v · w = (|v + w|² − |v|² − |w|²)/2 .

The Cauchy-Schwarz inequality tells us |v · w| ≤ |v| |w|.

Proof: We can assume |w| = 1 after scaling the equation. Now plug a = v · w into the inequality 0 ≤ (v − aw)·(v − aw) to get 0 ≤ (v − (v·w)w)·(v − (v·w)w) = |v|² + (v·w)² − 2(v·w)² = |v|² − (v·w)², which means (v · w)² ≤ |v|².

Having established this, we have a clean definition of what an angle is:

The angle between two nonzero vectors v, w is defined as the unique α ∈ [0, π] which satisfies v · w = |v| · |w| cos(α).

Al Kashi's theorem: If a, b, c are the side lengths of a triangle ABC and α is the angle opposite to c, then c² = a² + b² − 2ab cos(α).

Proof: Define v = AB, w = AC. Because c² = |v − w|² = (v − w)·(v − w) = |v|² + |w|² − 2 v · w, and we know v · w = |v| · |w| cos(α), we get c² = |v|² + |w|² − 2|v| · |w| cos(α) = a² + b² − 2ab cos(α).

The angle definition works in any space with a dot product. In statistics one works with vectors of n components. They are called data or random variables, and cos(α) is called the correlation between two random variables v, w of zero expectation E[v] = (v1 + ··· + vn)/n = 0. The dot product v1 w1 + ... + vn wn is then the covariance, and the length |v| is the standard deviation, denoted σ(v). The formula Corr[v, w] = Cov[v, w]/(σ(v)σ(w)) for the correlation is the familiar angle formula we have seen. It is geometry in n dimensions. We mention this only to convince you that the geometry we do here can be applied to much more.
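The angle formula and its statistical reading can be sketched as follows. This is our own illustration: the correlation helper subtracts the mean first, so that the zero-expectation assumption from the text holds for any input data:

```python
import math

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def angle(v, w):
    """The unique alpha in [0, pi] with v.w = |v||w| cos(alpha)."""
    return math.acos(dot(v, w) / (math.sqrt(dot(v, v)) * math.sqrt(dot(w, w))))

# The angle between <1, 0> and <1, 1> is pi/4:
print(angle((1, 0), (1, 1)))   # 0.7853981...

def correlation(v, w):
    """cos(angle) between the centered (zero-expectation) data vectors."""
    n = len(v)
    vc = [a - sum(v) / n for a in v]   # center so that E[v] = 0
    wc = [b - sum(w) / n for b in w]
    return dot(vc, wc) / (math.sqrt(dot(vc, vc)) * math.sqrt(dot(wc, wc)))

print(correlation([1, 2, 3], [2, 4, 6]))   # 1.0: the data are perfectly correlated
```

By Cauchy-Schwarz, the argument of acos always lies in [−1, 1], so the angle is well defined for nonzero vectors.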
All the computations we have done go through verbatim.

The triangle inequality tells us |u + v| ≤ |u| + |v|.

Proof: |u + v|² = (u + v)·(u + v) = |u|² + |v|² + 2 u·v ≤ |u|² + |v|² + 2|u·v| ≤ |u|² + |v|² + 2|u|·|v| = (|u| + |v|)².

Two vectors are called orthogonal or perpendicular if v · w = 0. The zero vector 0 is orthogonal to any vector. For example, v = ⟨2, 3⟩ is orthogonal to w = ⟨−3, 2⟩.

Having given precise definitions of all objects, we can now prove the Pythagoras theorem:

Pythagoras theorem: if v and w are orthogonal, then |v − w|² = |v|² + |w|².

Proof: (v − w) · (v − w) = |v|² + |w|² − 2 v · w = |v|² + |w|², since v · w = 0.
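The orthogonality example and the Pythagoras identity can be verified directly, reusing a componentwise dot product as above:

```python
def dot(v, w):
    """Dot product: sum of componentwise products."""
    return sum(a * b for a, b in zip(v, w))

# <2, 3> and <-3, 2> are orthogonal:
v, w = (2, 3), (-3, 2)
print(dot(v, w))   # 0

# Pythagoras: |v - w|^2 equals |v|^2 + |w|^2 when v . w = 0.
diff = tuple(a - b for a, b in zip(v, w))
print(dot(diff, diff))            # 26
print(dot(v, v) + dot(w, w))      # 13 + 13 = 26
```

For a non-orthogonal pair, the two quantities would differ by the cross term 2 v · w, exactly as in the proof.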