# Adjacency Matrix

An **adjacency matrix** is a compact way to represent the structure of a finite graph. If a graph has $n$ vertices, its adjacency matrix is an $n \times n$ matrix, where each entry represents the number of edges from one vertex to another.

Some properties and computations in graph theory can be carried out efficiently and elegantly by using a graph's adjacency matrix. Moreover, certain questions about properties of the adjacency matrix correspond to interesting questions about the associated graph.

## Building an Adjacency Matrix

Given a graph $G$ with $n$ vertices, the adjacency matrix is an $n \times n$ matrix
$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{pmatrix},$
where the value $a_{ij}$ equals the number of edges from the vertex $i$ to the vertex $j.$ If the graph is undirected, then $a_{ij} = a_{ji}$ for all $i,j;$ that is, the adjacency matrix is **symmetric**.

When the graph is simple, the entries of the adjacency matrix are all $0$ or $1$ (there are no edge weights and no multiple edges), and the diagonal entries are all $0$ (there are no loops). More complicated types of graphs give rise to other kinds of matrices.
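The construction can be sketched directly from an edge list. This is a minimal Python sketch (the helper name `adjacency_matrix` is just for illustration):

```python
def adjacency_matrix(n, edges):
    """Return the n x n adjacency matrix of a simple undirected graph.

    `edges` is a list of (i, j) pairs with 1-based vertex labels.
    """
    A = [[0] * n for _ in range(n)]
    for i, j in edges:
        A[i - 1][j - 1] = 1
        A[j - 1][i - 1] = 1  # undirected: the matrix is symmetric
    return A

# The triangle on vertices 1, 2, 3:
A = adjacency_matrix(3, [(1, 2), (2, 3), (1, 3)])
# A == [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
```

For a directed graph, one would drop the second assignment and set only `A[i - 1][j - 1]`.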

Create an adjacency matrix for the weighted, undirected graph on vertices $1, 2, \ldots, 7$ with the following weighted edges: $\{1,2\}$ of weight $3,$ $\{1,6\}$ of weight $12,$ $\{2,3\}$ of weight $5,$ $\{2,7\}$ of weight $4,$ $\{3,4\}$ of weight $6,$ $\{3,7\}$ of weight $3,$ $\{4,5\}$ of weight $1,$ $\{5,6\}$ of weight $10,$ $\{5,7\}$ of weight $7,$ and $\{6,7\}$ of weight $2.$

The weights on the edges go into the entries of the adjacency matrix, as follows: $A = \begin{pmatrix} 0&3&0&0&0&12&0 \\ 3&0&5&0&0&0&4 \\ 0&5&0&6&0&0&3 \\ 0&0&6&0&1&0&0 \\ 0&0&0&1&0&10&7 \\ 12&0&0&0&10&0&2 \\ 0&4&3&0&7&2&0 \end{pmatrix}.\ _\square$
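The same construction applies verbatim to the weighted case: each weight goes into the two symmetric entries for its edge. A Python sketch, with the edge list read off the matrix above:

```python
# Edge list (u, v, weight) with 1-based labels, read off the matrix above.
edges = [(1, 2, 3), (1, 6, 12), (2, 3, 5), (2, 7, 4), (3, 4, 6),
         (3, 7, 3), (4, 5, 1), (5, 6, 10), (5, 7, 7), (6, 7, 2)]

n = 7
A = [[0] * n for _ in range(n)]
for u, v, w in edges:
    A[u - 1][v - 1] = w
    A[v - 1][u - 1] = w  # undirected, so the matrix is symmetric

# Sanity checks: symmetry, and the first row as displayed above.
assert all(A[i][j] == A[j][i] for i in range(n) for j in range(n))
assert A[0] == [0, 3, 0, 0, 0, 12, 0]
```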

This example suggests some follow-up questions, which will be addressed in a later section. Note that if the vertices were labeled differently, the adjacency matrix would change:

- How would it change, and how would the new adjacency matrix be related to the old one?
- Can this relationship be expressed in matrix language?

## Matrix Properties and Graph Properties

The adjacency matrix is an array of numbers that represents all the information about the graph. Some of the properties of the graph correspond to interesting properties of its adjacency matrix, and vice versa. Here is a simple example:

Let $G$ be a simple graph with $n$ vertices and $m$ edges: that is, $G$ is undirected, unweighted, and has no loops (edges from a vertex to itself).

Let $A$ be the adjacency matrix of $G$ and let ${\bf u} = \begin{pmatrix} 1&1&\ldots&1 \end{pmatrix}$ be the $1 \times n$ matrix whose entries are all equal to $1.$ Then ${\bf u} A {\bf u}^T$ is a $1 \times 1$ matrix $\begin{pmatrix} b \end{pmatrix}$. What is $b$?

**Notation**: if $B$ is a matrix, $B^T$ denotes its transpose.
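The product can be checked numerically on a small case, say the triangle graph with $n = 3$ and $m = 3.$ A Python sketch: since $\bf u$ is the all-ones row vector, ${\bf u} A {\bf u}^T$ is simply the sum of all entries of $A.$

```python
def quadratic_form_all_ones(A):
    """Compute u A u^T for the all-ones row vector u: the sum of all entries of A."""
    return sum(sum(row) for row in A)

# Triangle graph: n = 3 vertices, m = 3 edges.
A = [[0, 1, 1],
     [1, 0, 1],
     [1, 1, 0]]
b = quadratic_form_all_ones(A)
assert b == 2 * 3  # each edge contributes two symmetric 1-entries, so b = 2m
```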

**Powers:** One of the most well-known ways to get information about the graph from operations on the adjacency matrix is via its powers. As the following example shows, the entries of the powers of the adjacency matrix give information about paths in the graph.

Let $A$ be the adjacency matrix of a simple graph with $6$ vertices. Suppose the entry of $A^2$ in the $3^\text{rd}$ row and the $4^\text{th}$ column is $5.$ What does this mean about the graph?

Suppose $A = (a_{ij}).$ Then the $3,4$ entry of $A^2$ is

$a_{31}a_{14} + a_{32}a_{24} + a_{33}a_{34} + a_{34}a_{44} + a_{35}a_{54} + a_{36}a_{64}.$

Consider $a_{31}a_{14}.$ This is the number of edges from vertex 3 to vertex 1, multiplied by the number of edges from vertex 1 to vertex 4; it is exactly the number of two-step walks from vertex 3 to vertex 4 which pass through vertex 1. Similarly, each term $a_{3k}a_{k4}$ counts the number of two-step walks from vertex 3 to vertex 4 which pass through vertex $k.$ So the sum counts all of the two-step walks from vertex 3 to vertex 4. The conclusion is that there are $5$ two-step walks from vertex 3 to vertex 4. $_\square$

So the $i,j$ entry of $A^2$ counts two-step walks from $i$ to $j.$ This generalizes as follows:

Let $A$ be the adjacency matrix of a graph. The $i,j$ entry of $A^n$ counts $n$-step walks from vertex $i$ to vertex $j.$

$($The proof is a straightforward induction on $n,$ where the argument is essentially the same as the one given above for $A^2.)$
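The claim can be verified on a tiny example, the path graph $1 - 2 - 3.$ A plain-Python sketch with a hand-rolled matrix product:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Path graph 1 - 2 - 3.
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
A2 = mat_mul(A, A)
assert A2[0][2] == 1  # one two-step walk from vertex 1 to vertex 3 (through 2)
assert A2[1][1] == 2  # two two-step walks from 2 back to 2 (2-1-2 and 2-3-2)
```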

**Spectrum:** The study of the eigenvalues of the adjacency matrix of a graph is called *spectral graph theory*. Here is a simple example of a correspondence between graph properties and eigenvectors of the adjacency matrix:

An undirected graph is **$k$-regular** if the degree of each vertex is $k.$ What can be said about the eigenvalues and eigenvectors of the adjacency matrix of a $k$-regular graph?

Let $A$ be the adjacency matrix of a $k$-regular graph. Let $\bf v$ be the all-ones column vector in ${\mathbb R}^n.$ Then the $i^\text{th}$ entry of $A{\bf v}$ equals the sum of the entries in the $i^\text{th}$ row of $A.$ This is the number of edges emanating from vertex $i,$ which is exactly $k.$ So $A \begin{pmatrix} 1\\1\\\vdots\\1 \end{pmatrix} = \begin{pmatrix} k\\k\\\vdots\\k \end{pmatrix} = k\begin{pmatrix} 1\\1\\\vdots\\1 \end{pmatrix},$ i.e. $\bf v$ is an eigenvector of $A$ with eigenvalue $k.$ $_\square$
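For a concrete check, take the $4$-cycle, which is $2$-regular. A Python sketch verifying $A{\bf v} = k{\bf v}$ directly:

```python
def mat_vec(A, v):
    """Multiply a matrix (list of rows) by a column vector (list)."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

# The 4-cycle 1 - 2 - 3 - 4 - 1 is 2-regular: every vertex has degree 2.
A = [[0, 1, 0, 1],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [1, 0, 1, 0]]
ones = [1, 1, 1, 1]
assert mat_vec(A, ones) == [2, 2, 2, 2]  # A·1 = k·1 with k = 2
```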

**Isomorphisms:** Two graphs are said to be isomorphic if one can be obtained from the other by relabeling vertices. Note that isomorphic graphs need not have the same adjacency matrix, since the adjacency matrix depends on the labeling of the vertices. But the adjacency matrices of isomorphic graphs are very closely related.

Let $G$ and $H$ be graphs on $n$ vertices with adjacency matrices $A$ and $B.$ Then $G$ and $H$ are isomorphic if and only if there is a permutation matrix $P$ such that $B = PAP^{-1}.$

Recall that a permutation matrix is a square matrix whose entries are all $0$ or $1,$ such that each row and each column contains exactly one $1.$
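A small sketch makes the relabeling concrete. For a permutation matrix, $P^{-1} = P^T,$ so conjugating the path graph's matrix by $P$ yields the adjacency matrix of the same path with relabeled vertices (Python, with an illustrative choice of permutation):

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    return [list(col) for col in zip(*A)]

# Path graph 1 - 2 - 3.
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
# Permutation matrix for an illustrative relabeling of the three vertices.
P = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]
B = mat_mul(mat_mul(P, A), transpose(P))  # B = P A P^{-1}, since P^{-1} = P^T
# B is again the adjacency matrix of a path: the middle vertex moved to label 1.
assert B == [[0, 1, 1], [1, 0, 0], [1, 0, 0]]
```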

## Applications

## See also

**Cite as:** Adjacency Matrix. *Brilliant.org*. Retrieved from https://brilliant.org/wiki/adjacency-matrix/