# Hessian Matrix

The Hessian matrix is a square matrix of second-order partial derivatives of a scalar-valued function. It is of immense use in linear algebra as well as for determining points of local maxima or minima.

## General Hessian Matrix of \(n\) Variables

For a function \(f : \mathbb{R}^n\to\mathbb{R}\), written \(f(x_1,x_2,x_3,\cdots,x_n)\), all of whose second-order partial derivatives exist and are continuous throughout its domain, the Hessian is the \(n\times n\) matrix

\(\large H_f = \begin{pmatrix} \frac{\partial^2 f}{\partial x_1^2} & \frac{\partial^2 f}{\partial x_1\,\partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_1\,\partial x_n} \\ \frac{\partial^2 f}{\partial x_2\,\partial x_1} & \frac{\partial^2 f}{\partial x_2^2} & \cdots & \frac{\partial^2 f}{\partial x_2\,\partial x_n} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial^2 f}{\partial x_n\,\partial x_1} & \frac{\partial^2 f}{\partial x_n\,\partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_n^2} \end{pmatrix}\)

## Conditions for Minima, Maxima, and Saddle Points

The Hessian of a function is denoted by \(\Delta^2f(x,y)\), where \(f\) is a twice differentiable function. If \((x_0,y_0)\) is one of its stationary points, then:

- If \(\Delta^2f(x_0,y_0)\) is positive definite, \((x_0,y_0)\) is a point of local minimum.
- If \(\Delta^2f(x_0,y_0)\) is negative definite, \((x_0,y_0)\) is a point of local maximum.
- If \(\Delta^2f(x_0,y_0)\) is neither positive definite nor negative definite, i.e. indefinite, \((x_0,y_0)\) is a saddle point.
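
For a \(2\times2\) symmetric Hessian \(\begin{pmatrix} f_{xx} & f_{xy} \\ f_{xy} & f_{yy} \end{pmatrix}\), these conditions reduce to checking the determinant and the top-left entry (Sylvester's criterion). A minimal Python sketch of this test (the function name `classify` is just illustrative):

```python
def classify(fxx, fxy, fyy):
    """Classify a stationary point from the 2x2 Hessian
    [[fxx, fxy], [fxy, fyy]] via Sylvester's criterion."""
    det = fxx * fyy - fxy * fxy  # determinant of the Hessian
    if det > 0 and fxx > 0:
        return "local minimum"   # positive definite
    if det > 0 and fxx < 0:
        return "local maximum"   # negative definite
    if det < 0:
        return "saddle point"    # indefinite
    return "inconclusive"        # det == 0: the test is inconclusive

# f(x, y) = x^2 + y^2 at the origin: fxx = fyy = 2, fxy = 0
print(classify(2, 0, 2))  # local minimum
```

Note the `det == 0` case: when the Hessian is singular, the second-derivative test gives no information and higher-order terms must be examined.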

## Hessian in two variables

Hessians in two variables are easy to compute and instructive to study. Let \(f:\mathbb{R}^2\to\mathbb{R}\) be a function whose second-order partial derivatives are well defined on its domain; then we can form the Hessian matrix of \(f\).

Note that the Hessian matrix here is always symmetric, since the mixed partial derivatives are equal by Clairaut's theorem.

Consider the function \(f(x,y)= x^2+y^2\), whose second-order partial derivatives exist and are continuous throughout its domain.

\(\large \begin{cases} \frac{\partial^2 f}{\partial x^2}=2 \\ \frac{\partial^2 f}{\partial y\,\partial x} = 0 \\ \frac{\partial^2 f}{\partial x\,\partial y}=0 \\ \frac{\partial^2 f}{\partial y^2} = 2\end{cases}\)

Its Hessian is then given by:

\(\large \Delta^2f(x,y) = \begin{pmatrix} \frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial y\,\partial x} \\ \frac{\partial^2 f}{\partial x\,\partial y} & \frac{\partial^2 f}{\partial y^2} \end{pmatrix}\)

\(\large \Delta^2f(x,y) = \begin{pmatrix} 2&0 \\ 0&2 \end{pmatrix}\)
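
As a sanity check, the constant Hessian of \(f(x,y)=x^2+y^2\) can be approximated numerically with central finite differences. A minimal sketch (the helper name `hessian_fd` is just illustrative):

```python
def hessian_fd(f, x, y, h=1e-4):
    """Approximate the 2x2 Hessian of f at (x, y) by central differences."""
    fxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    return [[fxx, fxy], [fxy, fyy]]

f = lambda x, y: x**2 + y**2
H = hessian_fd(f, 1.0, -2.0)
# H is approximately [[2, 0], [0, 2]], at every point (x, y)
```

Because this \(f\) is quadratic, the numerical Hessian is the same constant matrix regardless of the evaluation point, matching the exact result above.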

The following example demonstrates these facts and illustrates how the Hessian is used.

Suppose a function is defined by \(f(x,y)=x^4-32x^2+y^4-18y^2\). Find the maximum and minimum values of the function, if they exist, and justify your answer.

We take the second-order partial derivatives of the function as follows:

\(\large \begin{cases} \frac{\partial^2 f}{\partial x^2}=12x^2-64 \\ \frac{\partial^2 f}{\partial y\,\partial x} = 0 \\ \frac{\partial^2 f}{\partial x\,\partial y}=0 \\ \frac{\partial^2 f}{\partial y^2} = 12y^2-36\end{cases}\)

The Hessian is therefore:

\(\large \Delta^2f(x,y) = \begin{pmatrix} 12x^2-64&0 \\ 0&12y^2-36 \end{pmatrix}\)

We solve for the stationary points of \(f(x,y)\) by setting its first partial derivatives \(\frac{\partial{f}}{\partial{x}}\) and \(\frac{\partial{f}}{\partial{y}}\) equal to zero.

\(\begin{cases} 4x(x^2-16)=0\implies x=0,\pm4 \\ 4y(y^2-9)=0\implies y=0,\pm3\end{cases}\)

The possible pairings give us the critical points \((0,0),(\pm4,\pm3),(\pm4,0),(0,\pm3)\).
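
These roots can be verified directly (a quick sketch; the helper names `dfdx` and `dfdy` are just illustrative):

```python
dfdx = lambda x: 4 * x**3 - 64 * x   # df/dx = 4x(x^2 - 16)
dfdy = lambda y: 4 * y**3 - 36 * y   # df/dy = 4y(y^2 - 9)

# every pairing of these roots is a stationary point of f
print(all(dfdx(x) == 0 for x in (0, 4, -4)))  # True
print(all(dfdy(y) == 0 for y in (0, 3, -3)))  # True
```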

Since the entries of the Hessian are even functions of \(x\) and \(y\), which saves a lot of effort, we only need to check the representative points \((0,0),(4,0),(0,3),(4,3)\); the remaining sign combinations give identical Hessians.
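
Since the mixed partials vanish, the Hessian here is diagonal, and the whole classification can be run in a few lines. A minimal sketch (the helper name `hessian_diag` is just illustrative):

```python
# Hessian of f(x, y) = x^4 - 32x^2 + y^4 - 18y^2: the mixed partials
# vanish, so the Hessian is diagonal with entries
# fxx = 12x^2 - 64 and fyy = 12y^2 - 36.
def hessian_diag(x, y):
    return 12 * x**2 - 64, 12 * y**2 - 36

for point in [(0, 0), (4, 0), (0, 3), (4, 3)]:
    a, c = hessian_diag(*point)
    if a > 0 and c > 0:
        kind = "local minimum"   # positive definite
    elif a < 0 and c < 0:
        kind = "local maximum"   # negative definite
    else:
        kind = "saddle point"    # opposite signs: indefinite
    print(point, (a, c), kind)
```

For a diagonal matrix the eigenvalues are just the diagonal entries, which is why comparing their signs suffices here.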

For a diagonal \(2\times2\) matrix, definiteness is easy to read off: the matrix is positive definite when both diagonal entries are positive, negative definite when both are negative, and indefinite when they have opposite signs.

Now we check the Hessian at the different stationary points as follows:

\(\large \Delta^2f(0,0) = \begin{pmatrix} -64 &0 \\ 0 & -36\end{pmatrix}\)

This matrix is negative definite, so \((0,0)\) is a local maximum of the function.

\(\large \Delta^2f(\pm4,0) = \begin{pmatrix} 128 &0 \\ 0 & -36\end{pmatrix}\)

This matrix is indefinite, so \((\pm4,0)\) are saddle points.

\(\large \Delta^2f(0,\pm3) = \begin{pmatrix} -64 &0 \\ 0 & 72\end{pmatrix}\)

This matrix is also indefinite, so \((0,\pm3)\) are saddle points.

\(\large \Delta^2f(\pm4,\pm3) = \begin{pmatrix} 128 &0 \\ 0 & 72\end{pmatrix}\)

This matrix is positive definite, so the points \((\pm4,\pm3)\) are local minima of the function.

In fact, writing \(f(x,y)=(x^4-32x^2)+(y^4-18y^2)\), each term attains its least value independently (at \(x^2=16\) and \(y^2=9\)), so \(f\) attains a global minimum of \(f(\pm4,\pm3)=-337\). The function is unbounded above, since \(f(x,0)\to\infty\) as \(|x|\to\infty\), so \((0,0)\) is only a local maximum, with value \(f(0,0)=0\).

Thus the points of local minimum are \((\pm4,\pm3)\), where the global minimum value \(-337\) is attained, and the point of local maximum is \((0,0)\); the function has no global maximum.
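
The values at the classified points can be checked directly:

```python
f = lambda x, y: x**4 - 32 * x**2 + y**4 - 18 * y**2

print(f(0, 0))    # value at the local maximum: 0
print(f(4, 3))    # value at the global minima: -337
print(f(-4, -3))  # by evenness in x and y: also -337
```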