Introduction to Tensors
Contravariant and covariant vectors
Rotation in 2-space through angle θ:
x' = x cos θ + y sin θ
y' = −x sin θ + y cos θ
To facilitate generalization, replace (x, y) with (x^1, x^2).
Prototype contravariant vector: dr = (dx^1, dx^2)
dx'^1 = cos θ dx^1 + sin θ dx^2
Similarly for dx'^2: dx'^2 = −sin θ dx^1 + cos θ dx^2
The same holds for r itself, since the transformation is linear.
Compact notation: dx'^i = Σ_j (∂x'^i/∂x^j) dx^j
(generalizes to any transformation in a space of any dimension)
Contravariant vector: any set of components A^i that transforms as A'^i = Σ_j (∂x'^i/∂x^j) A^j
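The contravariant law above can be checked numerically. The sketch below (not part of the notes; the angle and components are arbitrary choices) transforms the components of dr with the rotation matrix and confirms that the length of dr is invariant under the rotation.

```python
import numpy as np

# Illustrative check: dr = (dx^1, dx^2) transforms with the matrix
# R[i, j] = dx'^i/dx^j of the 2D rotation. Values are arbitrary.
theta = 0.3
R = np.array([[np.cos(theta),  np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])  # dx'^i/dx^j

dx = np.array([0.5, -1.2])   # components in the unprimed system
dx_prime = R @ dx            # contravariant transformation law

# Rotations preserve lengths, so |dr| is a scalar (invariant):
print(np.linalg.norm(dx), np.linalg.norm(dx_prime))
```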
Now consider a scalar field φ(r): how does ∇φ transform under rotations?
By the chain rule, ∂φ/∂x'^i = Σ_j (∂x^j/∂x'^i) ∂φ/∂x^j
i.e., ∂x^j/∂x'^i appears rather than ∂x'^i/∂x^j.
For rotations in Euclidean n-space:
∂x^j/∂x'^i = ∂x'^i/∂x^j = cos θ_ij, where θ_ij = angle btwn the x'^i and x^j axes
It is not the case for all spaces and transformations that ∂x^j/∂x'^i = ∂x'^i/∂x^j,
so we define a new type of vector that transforms like the gradient:
Covariant vector: any set of components A_i that transforms as A'_i = Σ_j (∂x^j/∂x'^i) A_j
Explicit demonstration for rotations in Euclidean 2-space: inverting the rotation gives x = x' cos θ − y' sin θ and y = x' sin θ + y' cos θ, so ∂x^j/∂x'^i = ∂x'^i/∂x^j and the covariant and contravariant laws coincide.
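The demonstration can also be done numerically. The sketch below (an illustration, not from the notes; the field x² + 3y and the evaluation point are arbitrary choices) computes the gradient of a scalar field in rotated coordinates by finite differences and compares it to the transformed unprimed gradient.

```python
import numpy as np

# Illustrative check: under a rotation the gradient transforms with the
# same matrix as a contravariant vector, since dx^j/dx'^i = dx'^i/dx^j.
theta = 0.3
R = np.array([[np.cos(theta),  np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

f = lambda x, y: x**2 + 3.0 * y      # example scalar field
p = np.array([1.0, 2.0])             # point in unprimed coords
grad = np.array([2.0 * p[0], 3.0])   # analytic gradient at p

# Same field expressed in primed coords: f'(x') = f(x(x')), with x = R^T x'
def f_primed(xp):
    x = R.T @ xp                     # inverse rotation back to unprimed coords
    return f(x[0], x[1])

p_prime = R @ p
h = 1e-6
grad_prime = np.array([
    (f_primed(p_prime + h * e) - f_primed(p_prime - h * e)) / (2 * h)
    for e in np.eye(2)])             # central finite differences

print(grad_prime, R @ grad)          # the two should agree
```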
What about vectors in Minkowski space?
For a boost, ∂x'^0/∂x^1 = −γβ but ∂x^1/∂x'^0 = +γβ, so ∂x^j/∂x'^i ≠ ∂x'^i/∂x^j => contravariant and covariant vectors are different!
Recap (for arbitrary space and transformation)
Contravariant vector: A'^i = Σ_j (∂x'^i/∂x^j) A^j
Covariant vector: A'_i = Σ_j (∂x^j/∂x'^i) A_j
For future convenience, define new notation for partial derivatives:
∂_i ≡ ∂/∂x^i ; ∂'_i ≡ ∂/∂x'^i
Note: ∂x^i/∂x^j = δ^i_j
δ^i_j = Kronecker delta = 1 if i=j , 0 if i≠j
Tensors
Consider an N-dimensional space (with arbitrary geometry) and an object with components T^{i...k}_{l...n} in the x coord system and T'^{i...k}_{l...n} in the x' coord system.
This object is a mixed tensor, contravariant in i...k and covariant in l...n, under the coord transformation if
T'^{i...k}_{l...n} = Σ (∂x'^i/∂x^p)···(∂x'^k/∂x^r)(∂x^s/∂x'^l)···(∂x^u/∂x'^n) T^{p...r}_{s...u}
(summed over p...r and s...u)
Rank of tensor, M = number of indices
Total number of components = N^M
Vectors are first-rank tensors and scalars are zero-rank tensors.
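The transformation law can be exercised on a concrete second-rank tensor. The sketch below (illustrative, not from the notes; vectors and angle are arbitrary choices) builds T^{ij} as an outer product of two vectors, transforms the vectors directly, and compares with applying the rank-2 law.

```python
import numpy as np

# Illustrative check: a second-rank contravariant tensor obeys
# T'^{ij} = (dx'^i/dx^k)(dx'^j/dx^l) T^{kl}.
theta = 0.3
R = np.array([[np.cos(theta),  np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])   # dx'^i/dx^j

A = np.array([1.0, 2.0])
B = np.array([-0.5, 4.0])
T = np.outer(A, B)                                # T^{ij} = A^i B^j

T_prime_direct = np.outer(R @ A, R @ B)           # transform the vectors first
T_prime_law = np.einsum('ik,jl,kl->ij', R, R, T)  # apply the tensor law

print(np.allclose(T_prime_direct, T_prime_law))   # True
print(T.size)                                     # N^M = 2^2 = 4 components
```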
If the space is Euclidean N-space and the transformation is a rotation of Cartesian coords, then the tensor is called a “Cartesian tensor”.
In Minkowski space and under Poincaré transformations, tensors are “Lorentz tensors”, or “4-tensors”.
Zero tensor 0 has all its components zero in all coord systems.
Main theorem of tensor analysis:
If two tensors of the same type have all their components equal in one coord system, then their components are equal in all coord systems.
Einstein's summation convention: repeated upper and lower indices => summation
e.g.: A^i B_i ≡ Σ_i A^i B_i
A^i B_i could also be written A^j B_j ; the index is a “dummy index”
Another example: A^i_{jk} B^{jk}
j and k are dummy indices; i is a “free index”
Summation convention is also employed with derivatives, e.g., ∂_i A^i, etc.
Example of a second-rank tensor: the Kronecker delta δ^i_j
δ'^i_j = (∂x'^i/∂x^k)(∂x^l/∂x'^j) δ^k_l = (∂x'^i/∂x^k)(∂x^k/∂x'^j) = ∂x'^i/∂x'^j = δ^i_j
i.e., it has the same components in every coord system.
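The summation convention maps directly onto `np.einsum`, whose subscript strings sum repeated indices and keep the free ones. A short sketch (illustrative values, not from the notes):

```python
import numpy as np

# Illustrative: einsum mirrors the summation convention.
A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, -1.0, 0.5])

s = np.einsum('i,i->', A, B)       # A^i B_i : i is a dummy index -> scalar
print(s)                           # same as np.dot(A, B)

M = np.arange(9.0).reshape(3, 3)
v = np.einsum('ij,j->i', M, A)     # M^i_j A^j : j dummy, i free -> vector
print(v)

delta = np.eye(3)                  # Kronecker delta as a matrix
print(np.allclose(np.einsum('ij,j->i', delta, A), A))  # delta just relabels
```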
SP 5.12
Tensor Algebra (operations for making new tensors from old tensors)
1. Sum of two tensors: add components: C^i_j = A^i_j + B^i_j
Proof that the sum is a tensor (for one case):
C'^i_j = A'^i_j + B'^i_j = (∂x'^i/∂x^k)(∂x^l/∂x'^j)(A^k_l + B^k_l) = (∂x'^i/∂x^k)(∂x^l/∂x'^j) C^k_l
2. Outer product: multiply components: e.g., C^{ij}_k = A^i B^j_k
3. Contraction: replace one superscript and one subscript by a dummy index pair
e.g., A^{ij}_j , a contraction of A^{ij}_k
Result is a scalar if no free indices remain, e.g., A^i_i
4. Inner product: contraction in conjunction with an outer product
e.g.: C^i = A^{ij} B_j
Again, the result is a scalar if no free indices remain, e.g., A^i B_i
5. Index permutation: e.g., B^{ij} = A^{ji}
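Operations 2 through 5 can be sketched with `np.einsum` as well (illustrative values, not from the notes):

```python
import numpy as np

# Illustrative: outer product, contraction, inner product, permutation.
A = np.array([1.0, 2.0])
B = np.array([[3.0, -1.0], [0.5, 2.0]])

outer = np.einsum('i,jk->ijk', A, B)     # outer product A^i B^j_k -> rank 3
inner = np.einsum('i,ik->k', A, B)       # inner product A^i B_{ik} (i dummy)
trace = np.einsum('ii->', B)             # full contraction B^i_i -> scalar
perm = np.einsum('ijk->jik', outer)      # index permutation

print(outer.shape, inner, trace)
```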
Differentiation of Tensors
Notation: ∂_j A^i ≡ A^i_{,j} ; ∂_k ∂_j A^i ≡ A^i_{,jk} , etc.
A'^i_{,j} = ∂'_j [(∂x'^i/∂x^k) A^k] = (∂x'^i/∂x^k)(∂x^l/∂x'^j) A^k_{,l} + (∂²x'^i/∂x^l ∂x^k)(∂x^l/∂x'^j) A^k
IF the transformation is linear (so that the coefficients p^i_k ≡ ∂x'^i/∂x^k are all constant), the second term vanishes
=> the derivative of a tensor wrt a coordinate is a tensor only for linear transformations (like rotations and LTs)
Similarly, differentiation wrt a scalar (e.g., proper time τ) yields a tensor for linear transformations.
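For the linear case the claim can be verified directly. The sketch below (illustrative, not from the notes; the vector field and point are arbitrary choices) differentiates a vector field by finite differences in both coordinate systems and checks the rank-2 transformation law under a rotation.

```python
import numpy as np

# Illustrative check: for a LINEAR transformation (a rotation),
# A'^i_{,j} = (dx'^i/dx^k)(dx^l/dx'^j) A^k_{,l}, i.e. D' = R D R^T.
theta = 0.3
R = np.array([[np.cos(theta),  np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

field = lambda x: np.array([x[0]**2, x[0] * x[1]])   # example vector field A^i(x)
p = np.array([1.0, 2.0])

def jacobian(f, x, h=1e-6):
    # finite-difference matrix of derivatives D[i, j] = dA^i/dx^j
    return np.column_stack([(f(x + h * e) - f(x - h * e)) / (2 * h)
                            for e in np.eye(len(x))])

# Field components in the primed (rotated) system: A'(x') = R A(R^T x')
field_primed = lambda xp: R @ field(R.T @ xp)
D_prime = jacobian(field_primed, R @ p)

# Tensor transformation law applied to the unprimed derivatives
D_law = R @ jacobian(field, p) @ R.T

print(np.allclose(D_prime, D_law, atol=1e-5))   # True for linear transformations
```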
Now specialize to Riemannian spaces,
characterized by a metric g_ij with ds² = g_ij dx^i dx^j
Assume g_ij is symmetric: g_ij = g_ji (no loss of generality, since the components only appear in pairs)
If ds² > 0 for all nonzero dr, then the space is “strictly Riemannian”
(e.g., Euclidean N-space)
Otherwise, the space is “pseudo-Riemannian” (e.g., Minkowski space)
g_ij is called the “metric tensor”.
Note that the metric tensor may be a function of position in the space.
Proof that g_ij is a tensor:
ds² = g'_{k'l'} dx'^{k'} dx'^{l'} = g_ij dx^i dx^j = g_ij (∂x^i/∂x'^{k'})(∂x^j/∂x'^{l'}) dx'^{k'} dx'^{l'}
(since dx^i is a vector; note the 2 sets of dummy indices)
It's tempting to divide by dx'^{k'} dx'^{l'} and conclude g'_{k'l'} = (∂x^i/∂x'^{k'})(∂x^j/∂x'^{l'}) g_ij.
But there's a double sum over k' and l', so this isn't possible.
Instead, suppose dx'^{i'} = 1 if i' = 1, = 0 otherwise
=> g'_{11} = (∂x^i/∂x'^1)(∂x^j/∂x'^1) g_ij . Similarly for g'_{22}, etc.
Now suppose dx'^{i'} = 1 if i' = 1 or 2, = 0 otherwise
The only contributing terms are: k'=1, l'=1 ; k'=1, l'=2 ; k'=2, l'=1 ; k'=2, l'=2
Subtracting the diagonal results already obtained leaves
g'_{12} + g'_{21} = [(∂x^i/∂x'^1)(∂x^j/∂x'^2) + (∂x^i/∂x'^2)(∂x^j/∂x'^1)] g_ij
= 2 (∂x^i/∂x'^1)(∂x^j/∂x'^2) g_ij , since g_ij is symmetric and i and j are dummy indices.
g' is also symmetric, so g'_{12} = g'_{21} = (∂x^i/∂x'^1)(∂x^j/∂x'^2) g_ij . Similarly for all g'_{k'l'}
=> g'_{k'l'} = (∂x^i/∂x'^{k'})(∂x^j/∂x'^{l'}) g_ij , i.e., g_ij is a second-rank covariant tensor.
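The covariant law for the metric can be checked on a familiar coordinate change. The sketch below (illustrative, not from the notes; r and θ are arbitrary choices) transforms the Euclidean metric g_ij = δ_ij to polar coordinates and recovers the expected diag(1, r²).

```python
import numpy as np

# Illustrative check: g'_{kl} = (dx^i/dx'^k)(dx^j/dx'^l) g_ij, with
# x' = (r, theta) polar coordinates and x = (r cos(theta), r sin(theta)).
r, theta = 2.0, 0.7
# Jacobian J[i, k] = dx^i/dx'^k
J = np.array([[np.cos(theta), -r * np.sin(theta)],
              [np.sin(theta),  r * np.cos(theta)]])

g = np.eye(2)                                  # Cartesian (Euclidean) metric
g_polar = np.einsum('ik,jl,ij->kl', J, J, g)   # covariant transformation law

print(g_polar)   # ~ [[1, 0], [0, r^2]], the polar-coordinate metric
```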
General definition of the scalar product: A · B = g_ij A^i B^j
Define g^{ij} as the inverse matrix of g_ij : g^{ik} g_{kj} = δ^i_j
g^{ij} is also a tensor, since applying the tensor transformation law to it yields an object
G'^{ij} = (∂x'^i/∂x^p)(∂x'^j/∂x^q) g^{pq} satisfying G'^{ij} g'_{jk} = δ^i_k
, which defines G'^{ij} as the inverse of g'_{jk} , i.e., G'^{ij} = g'^{ij}
Raising and lowering of indices: another tensor-algebraic operation, defined for Riemannian spaces = inner product of a tensor with the metric tensor
e.g.: A^i = g^{ij} A_j ; A_i = g_{ij} A^j
Note: covariant and contravariant indices must be staggered when raising and lowering is anticipated.
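Raising and lowering is just matrix multiplication by the metric, as in the sketch below (illustrative, not from the notes; it assumes the (+,−,−,−) Minkowski signature and an arbitrary 4-vector):

```python
import numpy as np

# Illustrative: raising/lowering a 4-vector index with the Minkowski
# metric, assuming the (+,-,-,-) signature convention.
eta = np.diag([1.0, -1.0, -1.0, -1.0])      # g_{mu nu} = eta_{mu nu}
eta_inv = np.linalg.inv(eta)                # g^{mu nu}; equals eta itself

A_up = np.array([2.0, 1.0, -3.0, 0.5])      # contravariant components A^mu
A_down = eta @ A_up                         # A_mu = g_{mu nu} A^nu

print(A_down)                               # time component unchanged, spatial flipped
print(np.allclose(eta_inv @ A_down, A_up))  # raising recovers A^mu
print(np.einsum('i,i->', A_up, A_down))     # invariant A^mu A_mu
```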
4-tensors
In all coord systems in Minkowski space (in the (+,−,−,−) convention):
g_{μν} = η_{μν} = diag(1, −1, −1, −1)
=> raising or lowering the time index leaves a component unchanged; raising or lowering a spatial index flips its sign
e.g.: A_0 = A^0 ; A_1 = −A^1
Under standard Lorentz transformations:
p^0_0 = p^1_1 = γ ; p^0_1 = p^1_0 = −γβ ; p^2_2 = p^3_3 = 1
All the other p's are zero.
SP 5.37