With sufficient time spent on quantum mechanics, one invariably comes across the formula for the exponential of a linear combination of the Pauli matrices

$$\sigma_1 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \qquad \sigma_2 = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix} \qquad \sigma_3 = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$$

If these are supplemented by an identity matrix, they can be used to represent a general 2×2 Hermitian matrix as

$$M = \begin{bmatrix} a_0 + a_3 & a_1 - i a_2 \\ a_1 + i a_2 & a_0 - a_3 \end{bmatrix} = a_0 + \sigma \cdot a$$

where the quantities ak are all real. Hermitian matrices are unchanged by simultaneous transposition and complex conjugation of their elements. They are important in quantum mechanics because their eigenvalues are always real.

The exponentiation formula for this matrix can be written

$$e^M = e^{a_0} \left[ \cosh\sqrt{a \cdot a} \,+\, (\sigma \cdot a)\, \frac{\sinh\sqrt{a \cdot a}}{\sqrt{a \cdot a}} \right]$$

The linear combination inside the brackets is easy to verify using properties of the Pauli matrices,

$$\sigma_1 \sigma_2 = -\sigma_2 \sigma_1 \qquad \sigma_2 \sigma_3 = -\sigma_3 \sigma_2 \qquad \sigma_3 \sigma_1 = -\sigma_1 \sigma_3 \qquad \sigma_1^2 = \sigma_2^2 = \sigma_3^2 = 1$$

so that, with the cross terms cancelling in pairs by anticommutation, the square of the traceless part of the Hermitian matrix has the simple form

$$(\sigma \cdot a)^2 = \sum_{k,m=1}^{3} a_k a_m \sigma_k \sigma_m = \sum_{k=1}^{3} a_k^2 = a \cdot a$$
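This identity is easy to confirm numerically. The following sketch (using numpy, with an arbitrarily chosen real vector) squares $\sigma \cdot a$ and compares it to $a \cdot a$ times the identity:

```python
import numpy as np

# Pauli matrices
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)

a = np.array([0.3, -1.2, 0.7])           # arbitrary real vector
sig_a = a[0]*s1 + a[1]*s2 + a[2]*s3      # sigma . a

# (sigma . a)^2 equals (a . a) times the 2x2 identity
assert np.allclose(sig_a @ sig_a, np.dot(a, a) * np.eye(2))
```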

Since the identity part of the Hermitian matrix commutes with everything, for an exponential it separates out as the factor $e^{a_0}$. The even powers of $\sigma \cdot a$ in the remaining series then form the hyperbolic cosine, and the odd powers the hyperbolic sine divided by the norm of the real vector.

For functions other than the exponential, determining the form of f(M) with matrix multiplication is not as immediately straightforward. A more convenient approach is to diagonalize the matrix, apply the function to its eigenvalues and then invert the diagonalization. The final result is pleasingly simple and applies to all analytic functions.

The process of diagonalizing a matrix is described by a similarity transform

$$M_\text{diagonal} = S^{-1} M S$$

where the columns of the invertible matrix S are the normalized eigenvectors of the diagonalizable matrix. This statement can be rearranged to

$$M = S M_\text{diagonal} S^{-1}$$

An arbitrary power of the matrix is built up from products of the entire right-hand side, with adjacent products of the matrix S and its inverse cancelling to unity. The simple final result is

$$M^k = S M_\text{diagonal}^k S^{-1}$$

where the powers of a diagonal matrix are evaluated as powers of the eigenvalues along the diagonal. For any function expressible as a power series one then has

$$f(M) = \sum_{k=0}^{\infty} c_k M^k = \sum_{k=0}^{\infty} c_k S M_\text{diagonal}^k S^{-1} = S f(M_\text{diagonal}) S^{-1}$$
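This prescription can be sketched directly in numpy, whose `eigh` routine returns the eigenvalues of a Hermitian matrix along with a unitary matrix of normalized eigenvectors (the example matrix is an arbitrary choice):

```python
import numpy as np

M = np.array([[1.2, 0.3 - 0.4j],
              [0.3 + 0.4j, 0.5]])        # arbitrary Hermitian matrix

# columns of S are normalized eigenvectors; for Hermitian M, S is unitary
w, S = np.linalg.eigh(M)

def f_of_M(f):
    # f(M) = S f(M_diagonal) S^{-1}; for unitary S the inverse is S^dagger
    return S @ np.diag(f(w)) @ S.conj().T

# check against a power evaluated by repeated multiplication: M^3
assert np.allclose(f_of_M(lambda x: x**3), M @ M @ M)
```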

Apply this to the 2×2 Hermitian matrix. The eigenvalues are $a_0 \pm \sqrt{a \cdot a}$ and the corresponding normalized eigenvectors are

$$\frac{1}{n_\pm} \begin{bmatrix} a_1 - i a_2 \\ -a_3 \pm \sqrt{a \cdot a} \end{bmatrix} \qquad n_\pm = \sqrt{a_1^2 + a_2^2 + \left( a_3 \mp \sqrt{a \cdot a} \right)^2}$$

The diagonalizing matrix and its inverse are

$$S = \begin{bmatrix} \dfrac{a_1 - i a_2}{n_+} & \dfrac{a_1 - i a_2}{n_-} \\[1.5ex] \dfrac{-a_3 + \sqrt{a \cdot a}}{n_+} & \dfrac{-a_3 - \sqrt{a \cdot a}}{n_-} \end{bmatrix} \qquad S^{-1} = \begin{bmatrix} \dfrac{a_1 + i a_2}{n_+} & \dfrac{-a_3 + \sqrt{a \cdot a}}{n_+} \\[1.5ex] \dfrac{a_1 + i a_2}{n_-} & \dfrac{-a_3 - \sqrt{a \cdot a}}{n_-} \end{bmatrix}$$

The denominators appearing in $S f(M_\text{diagonal}) S^{-1}$ will all be squared normalizing factors, which can be written

$$\frac{1}{n_\pm^2} = \frac{1}{2(a \cdot a) \mp 2 a_3 \sqrt{a \cdot a}} = \frac{\sqrt{a \cdot a} \pm a_3}{2 \sqrt{a \cdot a}\, (a_1^2 + a_2^2)}$$

and it now becomes straightforward to evaluate matrix elements in

$$f(M) = S \begin{bmatrix} f(a_0 + \sqrt{a \cdot a}) & 0 \\ 0 & f(a_0 - \sqrt{a \cdot a}) \end{bmatrix} S^{-1}$$

The individual results are

$$f(M)_{11} = (a_1^2 + a_2^2) \left[ \frac{f(a_0 + \sqrt{a \cdot a})}{n_+^2} + \frac{f(a_0 - \sqrt{a \cdot a})}{n_-^2} \right]$$

$$f(M)_{11} = \frac{1}{2} \left[ f(a_0 + \sqrt{a \cdot a}) + f(a_0 - \sqrt{a \cdot a}) \right] + \frac{a_3}{2 \sqrt{a \cdot a}} \left[ f(a_0 + \sqrt{a \cdot a}) - f(a_0 - \sqrt{a \cdot a}) \right]$$

$$f(M)_{12} = (a_1 - i a_2) \left[ \frac{f(a_0 + \sqrt{a \cdot a}) \left( -a_3 + \sqrt{a \cdot a} \right)}{n_+^2} + \frac{f(a_0 - \sqrt{a \cdot a}) \left( -a_3 - \sqrt{a \cdot a} \right)}{n_-^2} \right]$$

$$f(M)_{12} = \frac{a_1 - i a_2}{2 \sqrt{a \cdot a}} \left[ f(a_0 + \sqrt{a \cdot a}) - f(a_0 - \sqrt{a \cdot a}) \right]$$

$$f(M)_{21} = (a_1 + i a_2) \left[ \frac{f(a_0 + \sqrt{a \cdot a}) \left( -a_3 + \sqrt{a \cdot a} \right)}{n_+^2} + \frac{f(a_0 - \sqrt{a \cdot a}) \left( -a_3 - \sqrt{a \cdot a} \right)}{n_-^2} \right]$$

$$f(M)_{21} = \frac{a_1 + i a_2}{2 \sqrt{a \cdot a}} \left[ f(a_0 + \sqrt{a \cdot a}) - f(a_0 - \sqrt{a \cdot a}) \right]$$

$$f(M)_{22} = \frac{f(a_0 + \sqrt{a \cdot a}) \left( \sqrt{a \cdot a} - a_3 \right)^2}{n_+^2} + \frac{f(a_0 - \sqrt{a \cdot a}) \left( \sqrt{a \cdot a} + a_3 \right)^2}{n_-^2}$$

$$f(M)_{22} = \frac{1}{2} \left[ f(a_0 + \sqrt{a \cdot a}) + f(a_0 - \sqrt{a \cdot a}) \right] - \frac{a_3}{2 \sqrt{a \cdot a}} \left[ f(a_0 + \sqrt{a \cdot a}) - f(a_0 - \sqrt{a \cdot a}) \right]$$

which can be collected in Pauli matrix notation as

$$f(M) = \frac{1}{2} \left[ f(a_0 + \sqrt{a \cdot a}) + f(a_0 - \sqrt{a \cdot a}) \right] + \frac{\sigma \cdot a}{2 \sqrt{a \cdot a}} \left[ f(a_0 + \sqrt{a \cdot a}) - f(a_0 - \sqrt{a \cdot a}) \right]$$

This is the pleasingly simple result promised. The formula for the exponential of the matrix follows immediately from recognizing that the hyperbolic cosine is half the sum of an exponential and its inverse, while the hyperbolic sine is half their difference.
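The closed form is simple enough to test numerically against explicit diagonalization for several analytic functions at once (a numpy sketch; the coefficients and choice of test functions are arbitrary):

```python
import numpy as np

s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)

a0, a = 0.4, np.array([0.6, -0.2, 1.1])   # arbitrary real coefficients
sig_a = a[0]*s1 + a[1]*s2 + a[2]*s3
M = a0*np.eye(2) + sig_a
r = np.sqrt(a @ a)

def f_closed(f):
    # f(M) = (f+ + f-)/2 + (sigma . a)(f+ - f-)/(2r), with f± = f(a0 ± r)
    fp, fm = f(a0 + r), f(a0 - r)
    return 0.5*(fp + fm)*np.eye(2) + sig_a*(fp - fm)/(2*r)

def f_diag(f):
    # reference evaluation by diagonalization
    w, S = np.linalg.eigh(M)
    return S @ np.diag(f(w)) @ S.conj().T

for f in (np.exp, np.cos, np.sin):
    assert np.allclose(f_closed(f), f_diag(f))
```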

As another check of the simple result, directly evaluate an arbitrary power of the Hermitian matrix using the binomial theorem:

$$M^k = (a_0 + \sigma \cdot a)^k = \sum_{m=0}^{k} \binom{k}{m} a_0^m (\sigma \cdot a)^{k-m}$$

$$M^k = \sum_{m=0}^{\lfloor k/2 \rfloor} \binom{k}{2m} a_0^{k-2m} (a \cdot a)^m + \sum_{m=0}^{\lfloor (k-1)/2 \rfloor} \binom{k}{2m+1} a_0^{k-2m-1} (a \cdot a)^m (\sigma \cdot a)$$

The corresponding statement from the pleasingly simple result is

$$M^k = \frac{1}{2} \left[ \left( a_0 + \sqrt{a \cdot a} \right)^k + \left( a_0 - \sqrt{a \cdot a} \right)^k \right] + \frac{\sigma \cdot a}{2 \sqrt{a \cdot a}} \left[ \left( a_0 + \sqrt{a \cdot a} \right)^k - \left( a_0 - \sqrt{a \cdot a} \right)^k \right]$$

When the powers on the right-hand side are expanded, subtractive cancellation will leave behind only even powers of $\sqrt{a \cdot a}$ in the first bracket and only odd powers in the second; dividing the latter by $2\sqrt{a \cdot a}$ again produces integral powers of $a \cdot a$. The final summations will be exactly the same as those in the direct evaluation.
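The agreement of the two expressions for an arbitrary power can also be confirmed numerically (again a numpy sketch with arbitrarily chosen coefficients):

```python
import numpy as np

s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)

a0, a = 0.9, np.array([0.5, 0.4, -0.8])   # arbitrary real coefficients
sig_a = a[0]*s1 + a[1]*s2 + a[2]*s3
M = a0*np.eye(2) + sig_a
r = np.sqrt(a @ a)

for k in range(6):
    plus, minus = (a0 + r)**k, (a0 - r)**k
    closed = 0.5*(plus + minus)*np.eye(2) + sig_a*(plus - minus)/(2*r)
    # reference: repeated matrix multiplication
    assert np.allclose(closed, np.linalg.matrix_power(M, k))
```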

In retrospect the equivalence of these two forms for an arbitrary power of the Hermitian matrix is completely understandable, yet perhaps not completely obvious. Intuiting this second form for an arbitrary power would of course lead to the pleasingly simple result without any matrix diagonalization. Then again, if the intuiting were that obvious this simple result would appear in all quantum mechanics texts, wouldn’t it?

Uploaded 2016.11.23 analyticphysics.com