Tuesday, July 23, 2013

Strongly Regular Graphs

In the post before last I wrote a summary of my summer RMMC trip to Wyoming. As readers of that post will know, the topic selected for that annual conference was algebraic graph theory. Many, many, MANY things were discussed there, so many that writing about all of them would be an unrealistic endeavor. It would be a shame not to mention at least one topic from the conference though, so today I will be discussing a particular class of graphs that turned up in several of the contributed RMMC talks: strongly regular graphs. 

Those with some acquaintance with graph theory have most likely heard of regular graphs at the least. Those are the simple graphs (no loops or multiple edges) in which every vertex has the same nonnegative degree k. Examples include the empty graph (just a bunch of isolated points), which is regular of degree 0; the complete graph on n vertices, which is regular of degree n-1; the Petersen graph (imagine the drawing of the cocky looking guy above with only the black dots and the lines between them), which is regular of degree 3; simple cycles, which are regular of degree 2; and so on. 

The focus of this posting is on undirected graphs, but for the sake of completeness it should also be noted that the notion of regularity can also be extended to directed graphs or digraphs. In this case saying that a digraph is regular means that the indegree of each vertex in the graph equals its outdegree. A very simple example of a regular directed graph is a cycle that has had its edges oriented clockwise or counterclockwise. 

With some basic notion of graph regularity we can now move on to the main post topic: strong graph regularity. When I hear "strong regularity," I immediately think the implication is that a graph with such a property is regular in many ways, and this turns out to be true. To define a strongly regular graph a total of four parameters must be specified: n, k, lambda, and u. Here n is the number of vertices in the graph, k is the degree of every vertex (i.e. strongly regular graphs are regular as defined previously), lambda is the number of common neighbors shared by every pair of adjacent vertices, and u is the number of common neighbors shared by every pair of non-adjacent vertices. Like many things in graph theory, the way to understand this is through a visual example, so consider the Petersen graph. It has a total of n = 10 vertices and is regular of degree 3. A quick inspection of the graph should reveal that any two adjacent vertices share no common neighbors (implying that the graph is triangle-free) and any two nonadjacent vertices have exactly one neighbor in common. By the above definition, then, we can classify the Petersen graph as strongly regular with parameters 10, 3, 0 and 1 respectively. This is often expressed more compactly by saying that the Petersen graph is srg(10, 3, 0, 1). 
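If you'd rather not check the Petersen graph's parameters by eyeballing a drawing, a brute-force check is easy to write. Here's a small sketch of my own (the 0-9 vertex labeling is my own convention: 0-4 form the outer 5-cycle, 5-9 the inner 5-point star, with spokes joining i to i+5):

```python
# Verify the srg(10, 3, 0, 1) parameters of the Petersen graph by brute force.
n = 10
edges = set()
for i in range(5):
    edges.add(frozenset((i, (i + 1) % 5)))          # outer 5-cycle
    edges.add(frozenset((i, i + 5)))                # spokes
    edges.add(frozenset((5 + i, 5 + (i + 2) % 5)))  # inner star

def adj(a, b):
    return frozenset((a, b)) in edges

nbrs = {v: {w for w in range(n) if adj(v, w)} for v in range(n)}

k = {len(nbrs[v]) for v in range(n)}                        # all vertex degrees
lam = {len(nbrs[a] & nbrs[b]) for a in range(n)
       for b in range(n) if a != b and adj(a, b)}           # adjacent pairs
mu = {len(nbrs[a] & nbrs[b]) for a in range(n)
      for b in range(n) if a != b and not adj(a, b)}        # non-adjacent pairs
print(n, k, lam, mu)  # 10 {3} {0} {1}
```

Each of the three sets collapsing to a single value is exactly the statement that the graph is strongly regular with those parameters.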

So what makes strongly regular graphs so special besides their satisfaction of numerous regularity conditions? Well, for one they have some interesting algebraic properties. If one were to calculate the eigenvalues of a strongly regular graph's adjacency matrix, one would find exactly three distinct values. Even better, these eigenvalues can be expressed in terms of the parameters n, k, lambda and u. The complete derivation involves a few clever observations and some careful tinkering with equations. 

The first eigenvalue is not hard to figure out. Because a strongly regular graph is regular of degree k, its adjacency matrix A must have k as an eigenvalue. The way to see this is by multiplying A with the all-ones vector. Since each row of A contains exactly k ones, the product is a vector whose every entry is k (i.e. k times the original all-ones vector). 
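This is quick to see in code too. Multiplying A by the all-ones vector is just taking each row sum, so here's a tiny check of mine on the Petersen graph (same 0-9 labeling convention as before: outer cycle 0-4, inner star 5-9, spokes i to i+5):

```python
# A times the all-ones vector equals the vector of row sums of A,
# which for a k-regular graph is k in every entry (here k = 3).
n = 10
A = [[0] * n for _ in range(n)]
for i in range(5):
    for a, b in ((i, (i + 1) % 5), (i, i + 5), (5 + i, 5 + (i + 2) % 5)):
        A[a][b] = A[b][a] = 1

Av = [sum(row) for row in A]  # A multiplied by the all-ones vector
print(Av)  # [3, 3, 3, 3, 3, 3, 3, 3, 3, 3]
```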

Now, how about the remaining eigenvalues? To find them we have to make a few key observations. For one, remember that the (i,j)th entry of the adjacency matrix A raised to some positive power m is equal to the number of walks of length m from vertex i to vertex j. Knowing this, we can deduce that the entries along the diagonal of A^2 are equal to the number of walks of length two from each vertex to itself, which is simply the degree of that vertex. Similarly, each off-diagonal entry of A^2 counts the walks of length 2 between the corresponding vertices, which in turn equals the number of common neighbors those vertices share. The other observation concerns the adjacency matrix of the graph's complement. If you let J denote the all-ones matrix and I the identity matrix, then a little looking over should convince you that the complement's adjacency matrix is J-I-A. 

With these two observations and the definition of a strongly regular graph, the following equality holds: 

A^2 = lambda*A + u(J-I-A) + kI 

If this is not clear, remember that the number of walks of length two from a vertex to itself is k in this case (hence the k's along the main diagonal contributed by kI), the number of common neighbors of two adjacent vertices i and j is lambda (hence the lambda*A contribution to A^2), and any two vertices that are nonadjacent in the graph are adjacent in the complement and have u neighbors in common (explaining the u(J-I-A) term, which also contributes to A^2). 

Through a little bit of distribution and simplification this equation becomes: 

A^2 = (lambda-u)A + uJ + (k-u)I
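This identity is easy to sanity-check numerically. Below is a small sketch of my own that builds the Petersen graph's adjacency matrix (my usual 0-9 labeling: outer cycle 0-4, inner star 5-9, spokes) and verifies the simplified equation entry by entry:

```python
# Check A^2 = (lambda - u)A + uJ + (k - u)I for the Petersen graph,
# srg(10, 3, 0, 1), using plain Python lists as matrices.
n, k, lam, u = 10, 3, 0, 1

A = [[0] * n for _ in range(n)]
for i in range(5):
    for a, b in ((i, (i + 1) % 5), (i, i + 5), (5 + i, 5 + (i + 2) % 5)):
        A[a][b] = A[b][a] = 1

# Left-hand side: A squared.
A2 = [[sum(A[i][t] * A[t][j] for t in range(n)) for j in range(n)]
      for i in range(n)]
# Right-hand side: (lam - u)A + uJ + (k - u)I, where J is all ones.
rhs = [[(lam - u) * A[i][j] + u + (k - u) * (i == j) for j in range(n)]
       for i in range(n)]
print(A2 == rhs)  # True
```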

Now remember that we are dealing with graphs here, and simple undirected graphs have symmetric adjacency matrices. There is a result usually taught in matrix theory courses which says that eigenvectors corresponding to different eigenvalues of a symmetric matrix are orthogonal to each other. We already know one eigenvector: the all-ones vector. Any other eigenvector we find will be orthogonal to it. Suppose then that for another eigenvalue we have a corresponding eigenvector v, and multiply both sides of the latest equation by v. The equation reduces to the following:

(A^2)v = (lambda-u)Av + (k-u)v

Notice that the uJ term cancelled out. This is because multiplying J by v amounts to taking the dot product of each row of J with v. The rows of J consist of all ones and v is orthogonal to the all-ones vector, so the dot products all come out to zero. 

We're almost there now! 

Suppose that y is the eigenvalue of our strongly regular graph corresponding to v. Since Av = yv (and hence (A^2)v = (y^2)v), our third equation becomes

(y^2)v = (lambda-u)yv + (k-u)v

which implies that: 

(y^2) = (lambda-u)y + (k-u)

This is a quadratic equation in y which can be solved to give us the remaining two eigenvalues of the original strongly regular graph! Cool! :-D
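To make this concrete, here's a quick sketch (mine, not from any reference) that plugs the Petersen graph's srg(10, 3, 0, 1) parameters into the quadratic formula for y^2 - (lambda - u)y - (k - u) = 0:

```python
import math

k, lam, u = 3, 0, 1  # Petersen graph parameters (n is not needed here)

# Quadratic formula applied to y^2 - (lam - u) y - (k - u) = 0.
disc = math.sqrt((lam - u) ** 2 + 4 * (k - u))
roots = sorted(((lam - u) + s) / 2 for s in (disc, -disc))
print(roots)  # [-2.0, 1.0]
```

Together with the eigenvalue k = 3 found earlier, this gives the Petersen graph's three distinct eigenvalues: 3, 1, and -2.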

So that this post does not get too long, I think I will stop here and let the three-distinct-eigenvalues property suffice for strongly regular graphs (for the time being at least). Hope you all liked the post and the cartoon (like all the others, made in Paint by yours truly). As always, keep on graphing! :-)


Monday, July 22, 2013

Pi Approximation Day or Casual Pi Day

Hello dear readers and a Happy Pi Approximation Day (or as some also call it, Casual Pi Day) to all! :-D Now just like Yellow Pig Day I had no prior knowledge of this math holiday. Before Yellow Pig Day the only math holiday that I knew about was Pi Day. Imagine my surprise then upon finding out about another holiday related to Pi Day while perusing the internet for special events in July. 

Excitement and wonder aside, there is some explaining to do regarding Pi Approximation Day. First of all, why is it held on July 22nd of all days? The answer is clear if you look at its date number, 7/22, and take the reciprocal (or just read it European style). That gives you 22/7, which is a pretty good approximation to pi (better, actually, than the 3.14 provided by the digits of the well-established Pi Day). If you do the division you will see that 22/7 comes out to about 3.142857142857143. Besides this slightly closer approximation, I also find the notion of a pi approximation day more appealing than just a pi day, since the best we can do with a finite number of digits is approximate the constant. Beyond that, I find the holiday appealing from a computer science perspective: it's natural to wonder how people have tried to approximate pi over the years and how the immense number of digits of pi known today have been generated. 
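If you want to see for yourself that 22/7 beats 3.14, a two-line check does it:

```python
import math

err_frac = abs(22 / 7 - math.pi)  # about 0.00126
err_dec = abs(3.14 - math.pi)     # about 0.00159
print(err_frac < err_dec)  # True: 22/7 is the closer approximation
```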

The literature on numerical methods for approximating pi is immense. A quick browse left me getting excited over topic after topic after topic, unsure what would be appropriate for this post. After discarding the notion of writing a comprehensive post on such methods, I decided to expand on a historical note (people have been trying to approximate pi since antiquity), which brings me to a name I'm sure many of you have heard: Archimedes. In relation to pi, he managed to produce surprisingly accurate bounds on the constant, showing it to be greater than 223/71 and less than 22/7. His method of proof involved a bit of geometry, specifically regular polygons with many sides inscribed within and circumscribed around circles. 

In much more recent years (relative to Archimedes' time) a very elegant mathematical proof of the upper bound pi < 22/7 surfaced. As Lucas notes (see reference below), the integral that implies the result was first officially published by Dalzell in 1971, and may have been known a little earlier, in the mid 60s, by Kurt Mahler. This very special integral is shown below in my Paint drawing (sorry for the rather inelegant drawing of the otherwise elegant integral; I don't believe Blogger has special symbols for mathematical formulae and I don't have any math typesetting software). 


With a little bit of Calculus 2 knowledge the integral, namely the integral of x^4(1-x)^4/(1+x^2) from 0 to 1, is not hard to evaluate (though expanding the numerator may be a little time consuming). To evaluate it, expand the numerator and divide by the denominator. After the long division there will be one term at the end with (1 + x^2) in the denominator; the thing to remember there is that its antiderivative is the arctangent function. Evaluating the antiderivative at the upper and lower limits gives the answer 22/7 - pi. The answer must be positive because the integrand is positive on the interval and the upper integration limit of 1 is greater than the lower limit of 0. Together you get the awesome magical inequality 0 < 22/7 - pi, which implies that pi is less than 22/7. Pretty cool. 
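For the skeptical (or the long-division-averse), the exact evaluation can be sanity-checked numerically. Here's a sketch of my own using composite Simpson's rule, nothing fancier than Calculus 2 machinery:

```python
import math

# Integrand of the Dalzell integral: x^4 (1-x)^4 / (1 + x^2) on [0, 1].
def f(x):
    return x ** 4 * (1 - x) ** 4 / (1 + x ** 2)

# Composite Simpson's rule with N subintervals (N must be even).
N = 1000
h = 1.0 / N
s = f(0) + f(1) + sum((4 if i % 2 else 2) * f(i * h) for i in range(1, N))
integral = s * h / 3

print(abs(integral - (22 / 7 - math.pi)) < 1e-9)  # True
```

The numeric value, about 0.00126, is exactly the gap between 22/7 and pi.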

Some readers may wonder where such a magical integral could have come from. Did it just pop out of the sky, or is there some deeper mathematical pattern underlying all of this? It turns out to be the latter. The above integral can be generalized to arbitrary powers m and n on x and (1-x). This generalization was introduced by Backhouse and can provide many more approximations of the constant pi. The reason is that the generalized integral evaluates to a + b*pi + c*log 2, where the coefficients a, b and c are all rational and a and b have opposite signs. If it is the case that 2m is congruent to n modulo 4, then c equals zero (Backhouse proved this) and we once again have a result of the form a + b*pi > 0. As m and n increase the resulting bounds on pi become finer and finer. What's more, it turns out that this is not the only such integral that provides approximations of pi. A skim through the Lucas reference below will show another similar integral, as well as a table comparing the approximations provided by the two integrals for increasing values of m and n.
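Reusing the same Simpson's-rule approach, here's a little sketch of mine illustrating why the bounds tighten: the Backhouse integrals shrink rapidly as m and n grow, so the gap between pi and the corresponding rational number shrinks with them.

```python
import math

def backhouse(m, n, N=2000):
    """Numerically integrate x^m (1-x)^n / (1+x^2) over [0, 1] (Simpson's rule)."""
    f = lambda x: x ** m * (1 - x) ** n / (1 + x ** 2)
    h = 1.0 / N
    s = f(0) + f(1) + sum((4 if i % 2 else 2) * f(i * h) for i in range(1, N))
    return s * h / 3

# The m = n = 4 case recovers the Dalzell value 22/7 - pi; larger m and n
# give smaller and smaller integrals, hence finer bounds on pi.
for m in (4, 8, 12):
    print(m, backhouse(m, m))
```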

And that's that, my dear readers. It would be great to continue exploring approximation methods for pi, but if you are in more of a casual pi day kind of mood then I recommend snacking on some tasty tarts (especially pecan, my favorite! They kind of look like, and approximate, pies too), lying back, listening to some Herb Alpert, and taking it easy. :-) 

References: 
1. N. Backhouse, Note 79.36, Pancake functions and approximations to pi, Math. Gazette 79 (1995), 371–374.

2. S.K. Lucas, Integral proofs that 355/113 > pi, Gazette Aust. Math. Soc. 32 (2005), 263–266.

Thursday, July 11, 2013

Rocky Mountain Mathematics Consortium Summer Trip

It was for the latter half of June that I had the privilege of attending the University of Wyoming's annual Rocky Mountain Mathematics Consortium (RMMC) summer program. It lasted for about a week and a half, and every day (except for the weekends) was filled with three main lectures by notable researchers in the selected RMMC topic. Graduate students and a few professors took the time in between lectures to present contributed talks. What attracted me to the program was the topic: algebraic graph theory. In my own independent reading I had come across material discussing the relations between a graph's spectrum and the properties of said graph (e.g. graph connectivity, bipartiteness, vertex valencies...). It amazed me then how much you could say about a graph by looking at the eigenvalues of its adjacency matrix (it still does now!). Given this short enthusiastic blurb on graph spectra along with all of my previous graph theory posts, I'm sure my readers can imagine how incredibly stoked I was when I received word of acceptance to the conference/program.

The three lecturers for RMMC were Chris Godsil of the University of Waterloo, William Martin of Worcester Polytechnic Institute, and Joanna Ellis-Monaghan of Saint Michael's College. Their lectures covered material on quantum graph walks, coding theory, and graph polynomials respectively. I purchased a notebook before my trip to Wyoming expecting that I had more than enough paper for RMMC, but at the end of it all I had a notebook completely stuffed with graph theory ideas; not an inch of white space to spare! Among the topics discussed (both in the main lectures and in contributed talks) were the following (and do note that this is only a small sample; the list goes on and on and on!): edge deletions and contractions, the Tutte polynomial, strongly regular graphs, Hamiltonian cycle detection, graph energy, the spectral decomposition of graphs, Hadamard matrices, n-cube graphs, graph toughness (this was actually discussed the Sunday before the program officially started), linear programming methods for the social networking influence maximization problem, eigenvalues of directed graphs, and some knot theory basics. Phew! :-0 To review the daily material presented in the main lectures, we also had a problem solving session at the end of every other day. During these sessions we could ask the lecturers questions about the problems they had written and collaborate with other students on them.

Among the many graduate students I met at the conference, there were two I spent many of my problem solving sessions with and got to be great friends with. Pictured below are the three of us. That is of course me in the middle, and to my left and right are Saba and Danielle. 



The picture setting was a banquet that RMMC held for all of us during the last week we were there (hence the fancy glasses and tablecloths). It was a great dinner and a time to take lots of pictures and relax after all of the hard studying and work. RMMC was a great experience and it is very possible that I will discuss it, or at least some of the topics presented there, more in future posts. :-)