Quantum One: Lecture 17
Ket-Bra Expansions and Integral Representations of Operators

In the last lecture, we defined what we mean by Hermitian operators, anti-Hermitian operators, and unitary operators, and saw how any operator can be expressed in terms of its Hermitian and anti-Hermitian parts.

We then used the completeness relation for a discrete ONB to develop ket-bra expansions and matrix representations of general linear operators, and saw how these matrix representations can be used to directly compute quantities related to the operators they represent.

Finally, we saw how to construct the matrix representing the adjoint of an operator, and how Hermitian operators are represented by Hermitian matrices.

In this lecture, we extend some of these ideas to continuously indexed basis sets, and develop integral representations of linear operators.

Continuous Ket-Bra Expansion of Operators

Let the states |α⟩, labeled by a continuous index α, form a continuous ONB for the space, and let A be an operator acting in the space. Then, from the trivial identity A = 1·A·1 and the completeness relation for this basis, we can write A as a double integral of ket-bra terms taken over the basis states, as sketched below.

This gives what we call a ket-bra expansion for the operator in this representation, and it completely specifies the linear operator A in terms of its matrix elements ⟨α|A|α′⟩ taken between the basis states of this representation.
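Written out, the expansion takes the following standard form (the basis label α and the kernel notation A(α,α′) are generic choices of symbol, used in all of the sketches below):

1 = ∫dα |α⟩⟨α|        (completeness)

A = 1·A·1 = ∫dα ∫dα′ |α⟩⟨α|A|α′⟩⟨α′| = ∫dα ∫dα′ A(α,α′) |α⟩⟨α′|,   where A(α,α′) ≡ ⟨α|A|α′⟩.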

Integral Representation of Operators

Thus, in the wave function representation induced by any continuous ONB, an operator A is naturally represented by an integral kernel, which is a function of two continuous indices, or arguments, whose values are just the matrix elements of A connecting the different members of the basis defining that continuous representation.

Like the matrices associated with discrete representations, knowledge of the kernel facilitates computing quantities related to A itself.

Thus, suppose that |χ⟩ = A|ψ⟩ for some states |χ⟩ and |ψ⟩. The expansion coefficients for the states |χ⟩ and |ψ⟩ in this representation are then clearly related: the wave function of |χ⟩ is obtained from the wave function of |ψ⟩ by integrating against the kernel, which can be written rather like a continuous matrix operation, as sketched below.
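A sketch of this relation, writing ψ(α) = ⟨α|ψ⟩ and χ(α) = ⟨α|χ⟩ for the wave functions of the two states:

χ(α) = ⟨α|A|ψ⟩ = ∫dα′ ⟨α|A|α′⟩⟨α′|ψ⟩ = ∫dα′ A(α,α′) ψ(α′),

the continuous analog of a square matrix acting on a column vector.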

Consider the matrix element of A between two arbitrary states |χ⟩ and |ψ⟩. Inserting our expansion for A, and identifying the wave functions for the two states involved, this matrix element becomes a double integral of the kernel against those wave functions, which is the continuous version of the product of a row vector, a square matrix, and a column vector.
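In the same notation, this is:

⟨χ|A|ψ⟩ = ∫dα ∫dα′ ⟨χ|α⟩⟨α|A|α′⟩⟨α′|ψ⟩ = ∫dα ∫dα′ χ*(α) A(α,α′) ψ(α′).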

As another example, consider the operator product C = AB of two operators A and B, each with its own ket-bra expansion in this representation. The operator product has a similar expansion, in which the kernel representing C is obtained from the kernels representing A and B by an integral that gives the continuous analog of a matrix multiplication.

So if we know the kernels A(α,α′) and B(α,α′) representing A and B, we can compute the kernel representing C = AB through the integral relation sketched below.
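That relation, in the same notation:

C(α,α″) = ⟨α|AB|α″⟩ = ∫dα′ ⟨α|A|α′⟩⟨α′|B|α″⟩ = ∫dα′ A(α,α′) B(α′,α″),

so that C = ∫dα ∫dα″ C(α,α″) |α⟩⟨α″|, the continuous analog of the matrix product [C] = [A][B].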

As a final example, consider the integral kernel representing the adjoint of an operator. If we write the ket-bra expansion of A, then by the two-part rule we developed for taking the adjoint, we obtain a corresponding expansion for A⁺. We can then switch the primes on the integration variables, and reorder, to find the kernel representing A⁺ in terms of the kernel representing A, from which we deduce the relation sketched below.
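A sketch of the steps (the two-part rule: reverse each ket-bra term and complex-conjugate its coefficient):

A = ∫dα ∫dα′ A(α,α′) |α⟩⟨α′|   ⟹   A⁺ = ∫dα ∫dα′ A*(α,α′) |α′⟩⟨α|.

Relabeling the integration variables α ↔ α′ and reordering,

A⁺ = ∫dα ∫dα′ A*(α′,α) |α⟩⟨α′|,   from which we deduce that   A⁺(α,α′) = A*(α′,α).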

This, obviously, is just the continuous analog of the complex-conjugate transpose of a matrix.

A Hermitian operator is equal to its adjoint, so the integral kernels representing Hermitian operators obey the relation sketched below.
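That is, in the same notation:

A = A⁺   ⟺   A(α,α′) = A*(α′,α),

the continuous analog of a Hermitian matrix satisfying ⟨n|A|n′⟩ = ⟨n′|A|n⟩*.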

Examples: In 3D, the operator X has as its matrix elements in the position representation a delta function multiplied by the corresponding position eigenvalue. This allows us to construct the ket-bra expansion for this operator, in which the double integral is reduced to a single integral because of the delta function, as sketched below.

The operator X is said to be diagonal in the position representation, because it has no nonzero matrix elements connecting different basis states.
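A sketch, with |r⟩ the position eigenstates and taking X here to denote the (vector) position operator, as the surrounding discussion suggests:

⟨r|X|r′⟩ = r δ³(r − r′),

X = ∫d³r ∫d³r′ |r⟩ ⟨r|X|r′⟩ ⟨r′| = ∫d³r ∫d³r′ |r⟩ r δ³(r − r′) ⟨r′| = ∫d³r r |r⟩⟨r|.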

This concept of diagonality extends to arbitrary representations.

An operator A is said to be diagonal in a discrete representation {|n⟩} if its only nonzero matrix elements are the diagonal ones, so that its ket-bra expansion has only one summation index, in contrast to the general form, which requires two.
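In symbols, using the discrete-basis notation of the last lecture:

A is diagonal in {|n⟩} if   ⟨n|A|n′⟩ = a_n δ_{n n′},

so that   A = Σ_n Σ_{n′} ⟨n|A|n′⟩ |n⟩⟨n′| = Σ_n a_n |n⟩⟨n|,

in contrast to the general form, which requires a double sum.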

In a discrete representation, an operator that is diagonal in that representation is represented by a diagonal matrix, as sketched below.
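That is, continuing the same notation:

if   A = Σ_n a_n |n⟩⟨n|,   then

        ( a₁  0   0   ⋯ )
  [A] = ( 0   a₂  0   ⋯ )
        ( 0   0   a₃  ⋯ )
        ( ⋮   ⋮   ⋮   ⋱ )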

Similarly, in a continuous representation an operator A is diagonal if its kernel is proportional to a delta function of the two continuous indices, so that its ket-bra expansion ends up with only one integration variable, in contrast to the general form, which requires two.
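In the same generic notation:

A is diagonal in the |α⟩ representation if   A(α,α′) = a(α) δ(α − α′),

so that   A = ∫dα ∫dα′ a(α) δ(α − α′) |α⟩⟨α′| = ∫dα a(α) |α⟩⟨α|.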

It is easy to show that in any basis in which an operator is diagonal, it acts as what we referred to earlier as a multiplicative operator in that representation.

That is, if G is diagonal in the |α⟩ representation, and if |χ⟩ = G|ψ⟩, then χ(α) is just ψ(α) multiplied by the diagonal function of G, which shows that a diagonal operator G acts in that representation simply to multiply the wave function, as sketched below.
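A sketch, writing the diagonal function of G as g(α):

G = ∫dα g(α) |α⟩⟨α|,   |χ⟩ = G|ψ⟩

⟹   χ(α) = ⟨α|G|ψ⟩ = ∫dα′ g(α′) ⟨α|α′⟩ ⟨α′|ψ⟩ = ∫dα′ g(α′) δ(α − α′) ψ(α′) = g(α) ψ(α).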

We list below ket-bra expansions and matrix elements of important operators (their standard diagonal forms are sketched after the list):

The position operator
The potential energy operator
The wavevector operator
The momentum operator
The kinetic energy operator
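A sketch of the standard diagonal expansions consistent with this list (here P and T denote the momentum and kinetic energy operators, m the particle's mass, and |k⟩ the wavevector eigenstates):

The position operator:   X = ∫d³r r |r⟩⟨r|,   with   ⟨r|X|r′⟩ = r δ³(r − r′)

The potential energy operator:   V = ∫d³r V(r) |r⟩⟨r|,   with   ⟨r|V|r′⟩ = V(r) δ³(r − r′)

The wavevector operator:   K = ∫d³k k |k⟩⟨k|,   with   ⟨k|K|k′⟩ = k δ³(k − k′)

The momentum operator:   P = ħK = ∫d³k ħk |k⟩⟨k|,   with   ⟨k|P|k′⟩ = ħk δ³(k − k′)

The kinetic energy operator:   T = P²/2m = ∫d³k (ħ²k²/2m) |k⟩⟨k|,   with   ⟨k|T|k′⟩ = (ħ²k²/2m) δ³(k − k′)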

Another nice thing about ket-bra expansions of this sort, particularly in a representation in which the operator is diagonal, is that it is very easy to determine whether or not the operator is Hermitian.

For example, we can take the Hermitian adjoint of the position operator by replacing each term in this continuous summation with its adjoint. Doing so, we easily see that X⁺ = X, so the position operator (and each of its components) is clearly Hermitian.
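The computation, in the notation above (each position eigenvalue r is real, and each ket-bra term |r⟩⟨r| is its own adjoint):

X⁺ = ( ∫d³r r |r⟩⟨r| )⁺ = ∫d³r r* (|r⟩⟨r|)⁺ = ∫d³r r |r⟩⟨r| = X.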

You should verify for yourself that each of the basic operators whose diagonal ket-bra expansion we previously displayed is also Hermitian.

It follows, in particular, that the wavevector operator K is Hermitian, and so the operator D = iK satisfies the relation

D⁺ = (iK)⁺ = −iK⁺ = −iK = −D.

Thus we see that the operator D, which takes the gradient in the position representation, is actually an anti-Hermitian operator. That's why we traded it in for the wavevector operator.

As an additional example, we work out below the matrix elements of the wavevector operator K in the position representation.

Recall that for any state |ψ⟩, the state K|ψ⟩ has a position wave function given by −i times the gradient of ψ(r). But we can also write this same quantity by inserting the completeness relation for the position states, where the object that then appears is the kernel ⟨r|K|r′⟩. The right-hand side of that last expression is then just another way of writing the wave function, in the position representation, of the state K|ψ⟩, as sketched below.
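A sketch of the two expressions being compared, with ψ(r) = ⟨r|ψ⟩ (and recalling that K = −iD, where D takes the gradient in the position representation):

⟨r|K|ψ⟩ = −i∇ψ(r),

⟨r|K|ψ⟩ = ∫d³r′ ⟨r|K|r′⟩ ⟨r′|ψ⟩ = ∫d³r′ K(r,r′) ψ(r′),   where K(r,r′) ≡ ⟨r|K|r′⟩.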

Reminding ourselves of the form of the position eigenfunctions in the wavevector representation (plane waves), we see that, evidently, the kernel K(r,r′) is −i times the gradient of the delta function.

The properties of this not-so-frequently encountered object are reviewed in the appendix at the end of the first chapter, the most important of which is that, for any function f, integrating the gradient of the delta function against f gives back the gradient of f, as sketched below.
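A sketch of the computation, assuming the symmetric plane-wave normalization ⟨r|k⟩ = e^{ik·r}/(2π)^{3/2}, so that ⟨k|r′⟩ = e^{−ik·r′}/(2π)^{3/2}:

K(r,r′) = ⟨r|K|r′⟩ = ∫d³k k ⟨r|k⟩⟨k|r′⟩ = ∫d³k k e^{ik·(r−r′)}/(2π)³ = −i∇_r ∫d³k e^{ik·(r−r′)}/(2π)³ = −i∇_r δ³(r − r′),

and the key property of the gradient of the delta function is that, for any suitably smooth function f,

∫d³r′ [∇_r δ³(r − r′)] f(r′) = ∇f(r).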

We deduce, therefore, that K can be expanded in the position representation in the form sketched below, so that when we apply this expansion to any state |ψ⟩, we obtain −i times the gradient of its wave function, consistent with our previous definition.
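Continuing the sketch:

K = ∫d³r ∫d³r′ |r⟩ K(r,r′) ⟨r′| = ∫d³r ∫d³r′ |r⟩ [−i∇_r δ³(r − r′)] ⟨r′|,

so that for any state |ψ⟩,

⟨r|K|ψ⟩ = ∫d³r′ [−i∇_r δ³(r − r′)] ψ(r′) = −i∇ψ(r).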

In a similar fashion, one finds corresponding results for the other differential operators of interest, such as the momentum and kinetic energy operators, as sketched below.
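Assuming the two omitted relations refer to the momentum and kinetic energy operators (a natural reading, given the list of operators above), the standard position-representation kernels are:

⟨r|P|r′⟩ = −iħ∇_r δ³(r − r′),   so that   ⟨r|P|ψ⟩ = −iħ∇ψ(r),

⟨r|T|r′⟩ = −(ħ²/2m) ∇²_r δ³(r − r′),   so that   ⟨r|T|ψ⟩ = −(ħ²/2m) ∇²ψ(r).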

In this lecture, we used the completeness relation for continuous ONBs to develop ket-bra expansions and integral representations of linear operators.

We then saw how the integral kernel associated with these representations can be used to directly compute quantities related to the operators they represent.

We also introduced the notion of diagonality of an operator in a given representation, and developed expansions for the basic operators of a single particle in representations in which they are diagonal.

Finally, we saw how differential operators can be expressed as ket-bra expansions with integral kernels that involve derivatives of the delta function, so that we could understand how a linear operator acting on kets can somehow end up replacing the wave function with its derivative or gradient.

In the next lecture we consider still other, representation-independent, properties of linear operators.