Monday, November 12, 2012

The Square of a Graph Continued.....

Article Under Construction:
I just realized that I accidentally posted what I am currently working on but haven't finished yet. I won't delete it since some readers have already started looking at it. Just note that it is not complete, though I hope that what is here so far is nice! Below you will see some explanation of the different ways graphs can be implemented. This post will primarily be concerned with the adjacency list implementation. Enjoy!

In my last post I discussed the notion of squaring a graph. To recap, the square of a graph G has the same vertex set as G, and two vertices in the square are connected by an edge if they are at distance at most two in G. Because each vertex is at distance zero from itself, there will be loops in the square of G. A simple way of determining G^2 involves matrix calculations (mainly multiplications) with the adjacency matrix of the graph being squared. Since matrix multiplication is the dominant operation in this simple method, the complexity is roughly O(n^3), where n is the size of the vertex set of G; not very good if you have a lot of vertices in your graph.
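To make that concrete, here is a minimal sketch (in Python) of the matrix approach just described. The example graph, the edge choices, and the helper name square_via_matrix are my own illustrative picks rather than anything from the post; the triple nested loop is what makes the O(n^3) cost visible.

    def square_via_matrix(A):
        """Given the n x n adjacency matrix A of a graph G (0/1 entries),
        return the adjacency matrix of G^2: two vertices are joined in G^2
        when they are at distance at most two in G (so, following the
        convention above, every vertex gets a loop)."""
        n = len(A)
        S = [[0] * n for _ in range(n)]
        for i in range(n):
            S[i][i] = 1                      # distance 0: a loop at every vertex
            for j in range(n):
                if A[i][j]:
                    S[i][j] = 1              # distance 1: original edges survive
                for k in range(n):           # the O(n^3) matrix-product step
                    if A[i][k] and A[k][j]:
                        S[i][j] = 1          # distance 2: an i - k - j path exists
        return S

    # A path on four vertices: 0 - 1 - 2 - 3
    A = [[0, 1, 0, 0],
         [1, 0, 1, 0],
         [0, 1, 0, 1],
         [0, 0, 1, 0]]
    print(square_via_matrix(A))
    # [[1, 1, 1, 0], [1, 1, 1, 1], [1, 1, 1, 1], [0, 1, 1, 1]]

Notice that vertices 0 and 3 stay unconnected in the square, since they are at distance three in the path.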

So....a natural question would be: is there a faster method for squaring a graph? It's a question that I am still mulling over and will continue to address in this blog, though for the time being I have at least thought of an alternative method for determining the square of a graph that does not involve matrices or matrix calculations. My method relies on another popular implementation of graphs: the one that uses so-called adjacency lists. What I will discuss in the following paragraphs will be a sort of mash-up of linked lists and queues, with the words adjacent and adjacency popping up a lot. Let's see what this is all about.

So...., what's an adjacency list? Well, first of all, let's look at adjacency matrices in a little more detail. If you have a dense graph, with lots of edges relative to vertices, you'll expect to see significantly more ones than zeros in the associated adjacency matrix. That's okay: your matrix has a lot of space to store graph information, but almost all of that space is actually being used. No problem. However, consider the opposite case: your graph is sparse, with relatively few edges compared to vertices. If you draw out your sparse graph with the dots and lines shown in previous posts, it may look kinda flimsy; not a whole lot of edges holding your vertices together. A simple example of a sparse graph is a tree, which has exactly n-1 edges for its n vertices. Below is a picture of some trees for reference. This same "flimsiness" will be reflected in the adjacency matrix of the graph in the form of lots of zeros. Since all of those zeros simply record the lack of a connection between two vertices, using an entire n by n matrix for a sparse graph wastes space (and, as n gets larger, a considerable amount of it). That's where adjacency lists come in. Adjacency lists are basically linked lists that store only the information that is necessary for a graph.
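Before getting to the lists themselves, here is a rough sense of how much space the matrix wastes in the sparse case. This is a toy example of my own (a star-shaped tree on 100 vertices, written in Python), not something from the post:

    # A star-shaped tree on n vertices has n - 1 edges, so only 2(n - 1)
    # of the n^2 adjacency matrix entries are ones.
    n = 100
    star = [[0] * n for _ in range(n)]
    for leaf in range(1, n):        # connect vertex 0 to every other vertex
        star[0][leaf] = star[leaf][0] = 1

    ones = sum(sum(row) for row in star)
    print(ones, "ones out of", n * n, "entries")   # 198 ones out of 10000 entries

Only 198 of the 10,000 entries carry any edge information; every other entry is a zero recording the absence of an edge.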

In the adjacency list implementation of a graph, each vertex is assigned a linked list of its own that contains references to all of its neighbors. If vertex A is not adjacent to vertex B, then neither will have a reference to the other in its adjacency list. This can save a lot of space, though one drawback is that checking whether two vertices are adjacent can take longer, since linked lists have to be searched sequentially. Readers curious about this last statement can refer to any source on linked lists, which should provide details on their search time complexities. Below is a picture of a graph along with a drawing of its associated adjacency list storage to give you a better idea of how this works.
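As a text-only companion to that picture, here is a small sketch of adjacency-list storage, assuming Python and using a dictionary of lists in place of true linked lists; the vertex labels and edges are made up for illustration:

    # Build an adjacency list for a small undirected graph.
    edges = [("A", "B"), ("A", "C"), ("B", "D")]

    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)   # u's list gains a reference to v
        adj.setdefault(v, []).append(u)   # and vice versa, since edges are undirected

    print(adj)
    # {'A': ['B', 'C'], 'B': ['A', 'D'], 'C': ['A'], 'D': ['B']}

Notice that A and D, which are not adjacent, never appear in each other's lists, and that the total storage grows with the number of edges rather than with n^2. Using a dict of Python lists instead of hand-rolled linked lists keeps the sketch short while preserving the idea of one neighbor list per vertex.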