
Daniel Spielman “Miracles of Algebraic Graph Theory”

Jun 04, 2021
Well, welcome to our next plenary lecture. It is my pleasure to introduce Daniel Spielman, a famous theoretical computer scientist and applied mathematician at Yale University. Professor Spielman received his bachelor's degree from Yale and his PhD from MIT, and then taught at MIT for ten years, first as an assistant professor and then as an associate professor, before moving to Yale in 2006, where he is currently Sterling Professor of Computer Science and also a professor of statistics and data science, and of mathematics. He has an incredible amount of major award hardware, including two Gödel Prizes, the Nevanlinna Prize for smoothed analysis of linear programming and for algorithms for graph-based codes, and, in 2014, the Pólya Prize with Marcus and Srivastava for their joint solution of the Kadison-Singer problem, a famous question in functional analysis that had been open for more than 50 years. Among his other honors, he was elected to the National Academy of Sciences in 2017, but I think we get the idea.
Here's Daniel Spielman talking about the miracles of algebraic graph theory.

Thanks. I'll use this talk as an opportunity to give you an introduction to the theory of spectral graphs, a field that, when I first encountered it as a student, really blew me away. It provides tools that I have used throughout my career, and it was the topic of a graduate course I taught for a semester. My goal is not to tell you about my work; I think I'll mention one result at the end, just to tell you why I'm on stage talking about this business.

Rather, I will try to tell you about some of the results that inspired me to pursue this field, and to convey some of the intuition I have developed for graph spectra over the years, in the hope that it will make it easier for you to think about them. Oh, and in about 20 minutes I'll explain that figure. So let's first start by actually defining what spectral graph theory is. You take a graph, whose main objects are, let's say, these circles that I've drawn here. I will refer to them interchangeably as nodes or vertices; different communities use different language, and somehow I've been stuck between both. These important objects are connected by edges, so edges connect pairs of vertices; that is the fundamental combinatorial object. In spectral graph theory we relate graphs to matrices. The matrix most people look at first is the adjacency matrix. The adjacency matrix has zeros and ones, it is symmetric, and its rows and columns are indexed by the vertices, so this matrix has eight rows and columns because there are eight vertices. There are ones at the off-diagonal entries where there are edges in the graph. For example, this graph has that red edge between vertex one and vertex five, which corresponds to the one I enclosed in a red circle in row one, column five, and in row five, column one. The matrix is 0 everywhere else.
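As a concrete sketch (mine, not from the talk, and with an invented edge list), this is all it takes to build such an adjacency matrix in Python:

```python
import numpy as np

# A hypothetical 8-vertex graph given as a list of edges; vertices are
# numbered 0..7 here rather than 1..8 as on the slide.
edges = [(0, 4), (0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7)]
n = 8

A = np.zeros((n, n))     # symmetric, zeros and ones
for a, b in edges:
    A[a, b] = 1          # a 1 in entry (a, b) exactly when {a, b} is an edge
    A[b, a] = 1

print(A[0, 4], A[4, 0])  # the entries for the edge between vertices 0 and 4
```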
The idea of spectral graph theory is that you can take this matrix and look at its eigenvalues and eigenvectors, and, surprisingly, they tell you fundamentally important things about the graph. Not just things that people in linear algebra might be interested in: things that graph theorists care about are revealed by the eigenvalues and eigenvectors. This was just astonishing to me, because I think of eigenvalues and eigenvectors as somehow strange, continuous things, and I think it's not immediately obvious, though I hope it will be obvious in ten minutes, that they should tell you anything about a graph. OK, that being said, I also want to admit that this is a bit of an accident.
I think it's an accident that the adjacency matrix is actually useful. There are other matrices whose usefulness I think I can make obvious, and which are more naturally associated with a graph; the adjacency matrix is useful because in many of the fundamental examples we think of, it agrees with them. But I will tell you about those other matrices, so think of the adjacency matrix as an example that we will now set aside. So, what do we do in spectral and algebraic graph theory? To understand a graph, we look at matrices associated with it, and we can think about those matrices in many different ways.
We use them to define systems of linear equations related to a graph, to define a quadratic form associated with a graph, or to define an operator associated with the graph; and if these are sufficiently natural objects to associate with the graph, then it makes sense that they reveal its structure. I will mainly talk about quadratic forms in this lecture, so before leaving this slide I just want to say a few words about operators. One of the most natural operators to associate with a graph is the one that governs the random walk on the graph. A random walk is a process that at each time step sits at some vertex of the graph, and at the next time step moves to a random neighbor of that node.
If you want to understand random walks, you generally look at their probability distributions. A distribution is a vector with a real value between 0 and 1 at each vertex, and the entries should add up to 1. If you know the probability distribution at one time step and want the probability distribution at the next time step, you get it by multiplying by a matrix, sometimes called the walk matrix; it looks a lot like the adjacency matrix, but usually with slightly different weights. If you're used to thinking about operators, you know that if you want to understand what happens when you apply one many times, which is what you want if you are simulating a random walk for many steps, the right way to understand it, or at least a very good way, is through its eigenvalues and eigenvectors. So if you think random walks on graphs tell you something interesting about graphs, then you might believe that the eigenvalues and eigenvectors of the walk operator should tell you something about the graph. But we'll look at quadratic forms in this lecture.
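A small sketch of this, as my own illustration: for an unweighted graph, one standard form of the walk matrix is W = A D^{-1}, where D is the diagonal matrix of degrees, and one step of the walk sends a distribution p to W p.

```python
import numpy as np

# A triangle with a pendant vertex (a made-up, non-bipartite example).
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)

degrees = A.sum(axis=0)
W = A / degrees      # W = A D^{-1}: column a is uniform over a's neighbors

p = np.array([1.0, 0.0, 0.0, 0.0])   # the walk starts at vertex 0
for _ in range(50):
    p = W @ p                        # distribution after one more step
print(p)   # approaches the stationary distribution, degrees / degrees.sum()
```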
I just wanted to plant that example. We'll see that these quadratic forms give us a lot of beautiful theorems about graphs. I am very interested in them because they are very useful for algorithms, and I will tell you about some algorithms we get from them; they are also very useful in machine learning. They give us a kind of heuristic for understanding graphs about which we cannot prove strong theorems; you will see that they are at least useful, and maybe one of you will come up with the theorem I have long been looking for to explain why they work so well. So let's start by talking about linear equations that we can associate with a graph, and I will derive them from a physical system. Let's imagine that a graph defines a network of springs: think of each edge of the graph as an ideal linear spring with spring constant 1. Now there is a very natural experiment you can do with this spring network: you can fix some of the vertices in place and let the others settle where they may.
Physics tells us that when the system is in equilibrium, when it settles, each free vertex should be at the position that is the average of its neighbors, the center of gravity of its neighbors. This means that to understand where these nodes land, we have to solve a system of linear equations: one linear equation for each non-fixed vertex, saying that its position is the average of its neighbors' positions. We can also think of this in terms of quadratic forms. Stretching an ideal linear spring with spring constant 1 to length l stores potential energy l squared over two, and physics tells us that the positions at which these nodes settle are the ones that minimize the total potential energy. This brings in the most important term of this talk: I call this potential energy the Laplacian quadratic form. Let me explain what it is, which also lets me define my notation. I want to capture the sum, over the edges, of the squared lengths. I use letters like a and b to denote vertices, and a pair (a, b) denotes an edge. For each edge I record the square of the difference of the positions of its endpoints, (x(a) - x(b))^2, and I sum this over the edges. This is minimized subject to the boundary constraints, the nails, which I was asked not to hammer into the screen, so I just drew them here on the slide.
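In symbols, the form is Q(x) = sum over edges (a, b) of (x(a) - x(b))^2. A two-line sketch of mine:

```python
import numpy as np

def laplacian_quadratic_form(edges, x):
    # Sum over edges (a, b) of the squared difference of endpoint values.
    return sum((x[a] - x[b]) ** 2 for a, b in edges)

# Example: a path on 4 vertices laid out at positions 0, 1, 2, 3.
print(laplacian_quadratic_form([(0, 1), (1, 2), (2, 3)],
                               np.array([0.0, 1.0, 2.0, 3.0])))  # 3.0
```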
This actually leads to one of the most surprising fundamental results in algebraic graph theory. It appears in an amazing paper by Tutte called "How to Draw a Graph," dating back to 1963. To start, you need to think about which graphs you can draw. Tutte was thinking about planar graphs: those are the graphs you can draw in the plane, locating each vertex at a point and drawing the edges as straight lines, so that none of the edges cross. You'll get a better idea in a moment, but the thing is, when someone gives you a graph, you might not know whether it's planar. So imagine they give you this mess. Well, what do you do?
Tutte says: first identify a face of it. I'll define a face precisely in a moment, but for now think of a triangle. If I can find a triangle in this graph, Tutte suggests taking the vertices of that triangle, moving them to the corners of a convex polygon on the outside of this picture, and nailing them down. Now let each edge be a spring and let the system settle into position; it will eventually stabilize. Tutte showed that if the graph is planar and 3-connected (I will explain 3-connected in a moment; it rules out some obvious degeneracies), then this gives you a planar drawing of the graph: there will be no crossing edges. Now, you may not be able to see that here, because there is a whole bunch of vertices all clustered somewhere in the middle, but I promise you it's planar. I'll make a better image so you can see it more clearly. So now I can tell you what a face is. A face of a planar graph is one of the regions in a planar embedding, or rather the cycle enclosing one of them.
For example, this cycle here is a face because it encloses an empty region in the embedding. You can do this for any face: take that face, move it outward, pin the positions of its vertices, let the springs settle, and we end up with a planar drawing, with no crossing edges, of this planar graph. This works for every planar graph. One of the reasons I was amazed by this theorem: I thought my desire to draw a graph without crossing edges was human, but apparently springs want to do it too, and that blows my mind. So, I should tell you what 3-connected means.
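Here is a minimal sketch of Tutte's recipe (my own code, under the assumption that the input graph is planar and 3-connected): pin one face to a convex polygon, then solve the linear system that places every free vertex at the average of its neighbors.

```python
import numpy as np

def tutte_embedding(n, edges, face, face_positions):
    # Build the graph Laplacian: degrees on the diagonal, -1 per edge.
    L = np.zeros((n, n))
    for a, b in edges:
        L[a, b] -= 1; L[b, a] -= 1
        L[a, a] += 1; L[b, b] += 1

    pos = np.zeros((n, 2))
    pos[face] = face_positions
    free = [v for v in range(n) if v not in face]
    # Spring equilibrium: L_ff x_f = -L_fp x_p  (f = free, p = pinned).
    rhs = -L[np.ix_(free, face)] @ pos[face]
    pos[free] = np.linalg.solve(L[np.ix_(free, free)], rhs)
    return pos

# Example (made up): the cube graph, which is planar and 3-connected;
# pin the face 0-1-2-3 to a unit square.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (4, 5), (5, 6), (6, 7), (7, 4),
         (0, 4), (1, 5), (2, 6), (3, 7)]
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
print(tutte_embedding(8, edges, [0, 1, 2, 3], square))
```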
What is the obstruction in this theorem? A graph is 3-connected if, whenever you remove any two vertices, it stays connected. So if there are two vertices whose removal breaks the graph into two different parts, it is not 3-connected. This graph is not 3-connected: if I remove those red vertices, I break the graph into two pieces. And if there are two vertices whose removal breaks the graph into two pieces, then there is no way this spring business can work, because if you look at one of the components you get when you delete those two vertices, all the nodes in that component will collapse onto a line. So that's an obvious thing that goes wrong, and the hypothesis of Tutte's theorem is reasonable for more reasons than that: you can't even really define the faces of a planar graph if it isn't 3-connected. What I mean is that if the graph is not 3-connected, the set of faces is not well defined. Think about this graph again and look at the top face, the one enclosed by vertices 1, 2, 3, 6, 7 and 5. That is a face, but there are other planar drawings of this graph in which it is not a face: I can swap the locations of nodes 7 and 8, and then 7 is no longer part of that face, and in fact is no longer on any face with vertex 2. So 3-connectivity tells you that you have a nice planar graph whose faces are well defined, and then Tutte's theorem applies.
Well, now let me remind you of the Laplacian quadratic form, because that is the fundamental object for the rest of this talk. It takes as input a vector that assigns a real number to each vertex, and returns the sum of the squares of the differences of that vector across the edges. For example, here is a graph where I have assigned a real number to each vertex; we can compute the sum of the squares of the differences across the edges, and that is what the Laplacian quadratic form gives you. Whenever you have a quadratic form, there is some symmetric matrix, which in this case we call L for the Laplacian matrix, such that you can write the quadratic form as x-transpose times L times x. I put it here too. The Laplacian matrix has a minus 1 in the off-diagonal entries for each edge; for example, the blue edge between nodes 2 and 6 corresponds to the minus 1 in row 2, column 6 and in row 6, column 2. The other off-diagonal entries are 0, where there is no edge, and the diagonal entries are set to be the degrees of the nodes: positive numbers chosen precisely so that each row and column sums to zero. So that is the Laplacian matrix. Of course, you can also define the Laplacian matrix of a weighted graph; a weighted graph is the same thing, but with weights on the edges, and there are many reasons why you might want to put weights on the edges.
A weight can indicate the strength of a connection: if you are looking at a social network, it indicates perhaps how important a link is, or how frequent communication is between two people; if you are looking at a spring network, it should be the spring constant; or, if you like multigraphs, which allow multiple edges between pairs of vertices, those don't encode very well as matrices, but you can achieve the same effect by recording the multiplicity of an edge as its weight. Then, of course, all you need to do is put the weight in front of the appropriate term in the quadratic form, and in the matrix you put minus that weight in the corresponding off-diagonal entry.
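A sketch of the weighted construction in my own notation: each edge (a, b) with weight w contributes w (x(a) - x(b))^2 to the form, and a -w to entries (a, b) and (b, a) of the matrix, with the diagonal fixed so rows sum to zero.

```python
import numpy as np

def laplacian(n, weighted_edges):
    # Weighted Laplacian: -w off the diagonal per edge, weighted degrees
    # on the diagonal, so every row and column sums to zero.
    L = np.zeros((n, n))
    for a, b, w in weighted_edges:
        L[a, b] -= w; L[b, a] -= w
        L[a, a] += w; L[b, b] += w
    return L

# Small made-up example: a weighted triangle.
L = laplacian(3, [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 0.5)])
x = np.array([1.0, 0.0, -1.0])
print(x @ L @ x)       # 5.0, the sum of w * (x[a] - x[b])**2 over edges
print(L.sum(axis=0))   # all zeros: rows and columns sum to zero
```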
By the way, I'm only going to consider positive, or at least non-negative, edge weights. There are many different ways of handling negative edge weights, each with some flaws, and they give somewhat different definitions, so let's stick with positive edge weights for now. OK, given this matrix associated with the graph, we can apply the spectral theorem. The spectral theorem tells us that a real symmetric, or Hermitian, matrix has n real eigenvalues. That is a wonderful thing, and if you don't play with symmetric matrices every day, it is possible to forget it, so I will remind you. Not only that: you also have an orthonormal basis of eigenvectors, and they satisfy the fundamental equation that the Laplacian times the i-th eigenvector equals the i-th eigenvalue times that vector, L v_i = lambda_i v_i. That's how we first learn to define eigenvalues and eigenvectors, and, as I promised, they tell you a lot about the graph. Now you might have an intuition as to why: if you think about spring networks, this equation tells you that the eigenvectors give you the fundamental modes of vibration of that object. That's part of why this is useful, but only part of it. I admit that I actually get a lot more use out of eigenvectors by applying the Courant-Fischer theorem. The Courant-Fischer theorem gives you another characterization of them: it tells you that the eigenvalues and eigenvectors of a symmetric matrix are the solutions of certain maximization and minimization problems, and because they arise from optimization problems, it is possible to prove many inequalities about them and relate them to the structure of the graph. So let me tell you what the Courant-Fischer theorem gives. First, it tells you about the first eigenvalue, the smallest, lambda 1: lambda 1 is the minimum possible value of the Laplacian quadratic form.
We have to normalize: we take the minimum over unit vectors. The corresponding eigenvector is, of course, the vector at which that minimum is achieved. Now, for Laplacian matrices this is not very informative: lambda 1 is 0 and the corresponding eigenvector is the constant vector. You can see this because if you put a constant vector into the quadratic form you get 0, and this quadratic form is always non-negative, so you can never get a value less than 0. So lambda 1 is 0 and v1 is the constant vector. Let's look at v2. The second eigenvalue is the minimum of the Laplacian quadratic form over all unit vectors that are orthogonal to the first eigenvector, and in this case that is very convenient, because we know the first eigenvector is the all-ones vector. So this is a very clean characterization of the second eigenvalue, and the eigenvector is, again, the vector at which that minimum is achieved. You can continue in this way to define lambda 3 and lambda 4, and v3 and v4, and so on: you take the vector that minimizes the quadratic form among unit vectors orthogonal to the eigenvectors you have already found. So this is part of the justification of a very beautiful heuristic for understanding graphs called spectral graph drawing, and this is kind of the birth of graph drawing, or the other birth of graph drawing besides Tutte's theorem. I don't have wonderful theorems for this, but it gives me a lot of intuition.
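These facts are easy to check numerically; here is a quick sketch of mine for the path graph on 4 vertices. numpy's eigh returns the eigenvalues in increasing order, with an orthonormal set of eigenvectors.

```python
import numpy as np

# Laplacian of a path on 4 vertices.
L = np.array([[ 1, -1,  0,  0],
              [-1,  2, -1,  0],
              [ 0, -1,  2, -1],
              [ 0,  0, -1,  1]], dtype=float)

lam, V = np.linalg.eigh(L)   # ascending eigenvalues; eigenvectors in columns
print(lam[0])                # ~0: the smallest eigenvalue of a Laplacian
print(V[:, 0])               # a constant vector, up to sign and scale

v2 = V[:, 1]                 # the second eigenvector
print(lam[1], v2 @ L @ v2)   # lambda_2 equals the quadratic form at v2
print(np.ones(4) @ v2)       # ~0: v2 is orthogonal to the all-ones vector
```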
So I'll show you a bunch of pictures. It's also a useful magic trick. I run an institute called the Yale Institute for Network Science, which means that from time to time people come into my office and say: Dan, I have this graph, this network, can you tell me what it is? It's usually some text or data file, and, you know, what do I do? It could be a mess like this, but I run my spectral graph drawing code and say: very well, here is a beautiful picture of your graph; and they are impressed, and then they understand it. So this is what you do.
Hall said: OK, v1 is not useful; let's look at v2 and v3, the second and third eigenvectors of the Laplacian. Remember that each of them is a vector that gives a real number for each vertex of the graph. We will use them as coordinates at which to draw the vertices. So take v2, which gives a real number for each vertex, and use it to form the horizontal axis; take v3 and use it to form the vertical axis; plot the nodes at the locations given by the eigenvectors, and then draw the edges as straight lines. I get this beautiful image of this graph, and when I look at that beautiful image, I know everything about this graph; I understand this graph when I see it drawn that way, unlike the horrible original picture.
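A minimal sketch of that recipe (mine; matplotlib is assumed to be available): use v2 and v3 as x and y coordinates and draw the edges as line segments.

```python
import numpy as np
import matplotlib.pyplot as plt

def spectral_draw(n, edges):
    # Build the Laplacian, take the eigenvectors of the 2nd and 3rd
    # smallest eigenvalues, and use them as coordinates.
    L = np.zeros((n, n))
    for a, b in edges:
        L[a, b] -= 1; L[b, a] -= 1
        L[a, a] += 1; L[b, b] += 1
    _, V = np.linalg.eigh(L)
    x, y = V[:, 1], V[:, 2]
    for a, b in edges:
        plt.plot([x[a], x[b]], [y[a], y[b]], 'k-', lw=0.5)
    plt.plot(x, y, 'o')
    plt.show()

# Example: a 4x4 grid graph (a made-up test case).
n = 16
edges = [(i, i + 1) for i in range(n - 1) if i % 4 != 3]
edges += [(i, i + 4) for i in range(n - 4)]
spectral_draw(n, edges)
```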
Now you may wonder whether this always works. No, but it often tells you something useful, especially for graphs that you can draw. Let me do some other examples. Here I sampled some random points from the plane; it's one way to generate a graph. I took their Delaunay triangulation, which gives sort of the prettiest graph, or one of the prettiest graphs, you can draw on a given set of points by looking at the edges of the triangles. But here we started with the geometry.
We fixed the vertex locations and then drew nice edges. Now let's forget the geometry and pay attention only to what is connected to what. Using just that information from the graph, compute the eigenvectors and plot the vertices, and this is what we get. It is a surprisingly nice image. Let me point out some features. The middle looks like a triangulation of something; almost all the triangles have a good aspect ratio, which is surprising; it's even better than in the Delaunay triangulation we started from. It gets a little messed up at the boundary, and in fact, if we looked closely enough, we would see that it is not actually planar at the boundary; it looks like it wants to wrap around some surface, but when you only give it two eigenvectors there is no way it can do that; you would need three dimensions. Still, it gives you a remarkably good picture of this graph, and this keeps happening. So here is one of my favorite examples.
This is the airfoil mesh. If you are a MATLAB user, type "load airfoil"; it is one of the example matrices they give you. It comes from modeling the airflow around the cross-section of an airplane wing, and it is a discretization of the region around the wing. On the left I show the image with the original coordinates; on the right, we forget the coordinates and just use the eigenvectors to draw it, and again you get this amazing picture. It has these giant holes, but that's because the airfoil graph has these giant holes. To show you how pretty it is, let me zoom in.
If we zoom in near the edge, we again see a beautiful triangulation with a giant flat region. For a little over 20 years I have been looking at these images and asking whether someone could give me a theorem that explains why this happens: why do we get these beautiful flat regions in these spectral drawings? I still don't know why. There are some theorems that are starting to get us there, but they're not really enough; we probably need someone who understands differential equations and the finite element method much better than I do. OK, of course, this isn't always going to work; it doesn't even work for all planar graphs. So let's consider a graph we get from a Platonic solid, like the dodecahedron. Take a dodecahedron; for each corner, make a vertex in the graph; for the edges along the dodecahedron, put an edge in the graph. That gives you a graph with 20 vertices, and you can try drawing it using two eigenvectors, and you get a kind of squashed dodecahedron. But you had to expect that, because it shouldn't embed well with two eigenvectors: it's a three-dimensional object. In fact, lambda 2 has multiplicity 3 (lambda 2 equals lambda 3 equals lambda 4), and my code was doing something completely arbitrary when it chose two vectors in this three-dimensional eigenspace to form a basis. You know, I imagine that maybe in 10 years everyone will be wearing 3D glasses, and I'll have a 3D projector, and we'll be able to look at the dodecahedron in three dimensions on stage.
If we draw it with three eigenvectors, we get the dodecahedron in three dimensions, exactly as expected, and the same is true for every Platonic solid, and for all sorts of other regular polytopes in higher dimensions: if it lives in d dimensions, take d eigenvectors and you get the picture. OK, but some graphs are just too complicated to draw, so here's one that's complicated. I took Paul Erdős's co-authorship network. What I did was take every article written by Paul Erdős; each of his co-authors is a vertex in this graph. I did not include
Paul Erdős himself, because we already know he's on every paper. For each of those papers, I put an edge between each pair of people who were co-authors. It has a very large connected component, and this is the spectral drawing of that largest component. We do not get a great picture. OK, so some graphs can't be drawn well. In a moment I will tell you how to recognize them, or at least how to prove that some graphs cannot be drawn well. If you take a look at this graph and you know enough about Paul Erdős, you know that there are some vertices of very high degree, and you might think that's the reason; so let me show you that it's not.
Here's another graph that you can't draw well. I chose a random 4-regular graph; although, I admit, I see two vertices of degree three, so it's basically a random 4-regular graph in which two edges landed on top of each other. If you take a random graph, it shouldn't have a good drawing, right? A good drawing would reveal structure and pattern, and a random graph should not have that structure. We'll see how to make this a little more formal in a moment, but for now let me tell you what I mean by a bad drawing. For me, a drawing is bad if, say, many, many of the edges cross and all the edges are long: most of the pixels on the screen are devoted to showing edges and not vertices. That should count as a bad drawing. You can show that, in the limit, for large random graphs, an overwhelming fraction of all the pixels have to be edges, and that is not a drawing you can use to make sense of the graph. So how do we prove something like that?
Let's say you have a nice drawing of a graph. A few years ago in my spectral graph theory class, I asked students to prove a theorem like this: prove that if you have a nice drawing of a graph, then lambda 2 is small. Last year I was friendlier: I defined what I meant by a nice drawing; a few years before, I had said, you define it. This is a very robust homework problem that works under all reasonable definitions of nice, but I generally mean something like: most of the edges are short, and the vertices are reasonably spread out, so they don't cluster too much anywhere. You can show that if there is a nice drawing of a graph, then lambda 2 is close to 0. How close? You can show that if lambda 2 is large, say more than 10 over the square root of the number of vertices, then there is no nice drawing of the graph. This is useful for many purposes, particularly for graduate students in network science. You know, if your advisor says, give me a picture of this graph, and you come back with an ugly mess, and they tell you to invent a better algorithm, and you come back with an ugly mess, and you keep doing that, it's good to be able to tell your advisor: look, lambda 2 is big; there is no good picture of this graph; please give me something else to do. It's very powerful. So now let me tell you why this happens. It is because the eigenvalues, in particular lambda 2, are connected to the boundaries of sets of vertices in the graph. Let's see why this is so.
If I have a set of vertices in the graph, like S here, its boundary is the set of edges leaving that set. We can measure the boundary using the Laplacian quadratic form by applying it to the characteristic vector of the set: the vector that is 1 on the vertices of S and 0 elsewhere. If I plug it into the Laplacian quadratic form, I get exactly the size of the boundary, because in the sum of the squares of the differences of the characteristic vector across the edges, every edge on the boundary goes between a 0 and a 1, so it contributes one, and every edge not on the boundary goes between 0 and 0 or between 1 and 1, so it does not contribute. So the Laplacian quadratic form lets us measure boundaries.
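A tiny check of that identity (my sketch): the quadratic form at the 0/1 indicator vector of a set counts the edges leaving the set.

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # a made-up 4-vertex graph
n = 4
L = np.zeros((n, n))
for a, b in edges:
    L[a, b] -= 1; L[b, a] -= 1
    L[a, a] += 1; L[b, b] += 1

S = [0, 1]
chi = np.zeros(n); chi[S] = 1.0    # characteristic vector of S
boundary = sum(1 for a, b in edges if (a in S) != (b in S))
print(chi @ L @ chi, boundary)     # both equal 3, the edges leaving S
```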
If you have a nice picture of a graph, you can show that there is a large set of vertices with a small boundary. Take the characteristic vector of that set, orthogonalize it with respect to the all-ones vector, and that test vector gives you a proof that lambda 2 is small, using the Courant-Fischer theorem, which tells us that lambda 2 is the minimum of the Laplacian quadratic form over unit vectors orthogonal to the all-ones vector. This motivated a heuristic that is part of what made spectral graph theory so popular: a heuristic for clustering and partitioning networks. It was originally used in the scientific computing community, where people doing things like studying the airflow around an airplane wing wanted to split their simulations across large parallel computers. Different computers would simulate different regions of space, and they want to be able to run their simulations with little communication; the amount of communication depends on how many edges go between the different pieces, so they want to split the graph into two parts without too many edges running between them.
The heuristic people came up with was to look at the second eigenvector: take v2 and look at its level sets. For example, here S is the set of all vertices where the value is greater than 0.8. We look at the level sets, test them, and one of them usually gives a good partition. To say it a little more formally: take this spectral graph drawing and look at all the vertices on one side of some line through the drawing; the claim is that some line through the spectral drawing gives a good partition. It turns out there is a lot of experimental support for this, and you can actually use Cheeger's theorem,
This is good, you can actually use it as a chigger theorem. or an extension of Achievers' inequality proven through I have worked before these papers to show that this actually gives you a good approximate solution to a problem. I just have to stay on exactly what the problem is before I do it, let me show you this is the partition. you get the airfoil mesh from that last slice, okay, so to accurately express the claps in equality, I needed to find the conductance of a set of vertices. It's a way of formalizing my notion of a large set of vertices with a small boundary, so the conductance of a set of vertices is the size of the boundary of the set divided by the size of the set, unless the set has more than half the vertices, then we want to take a look at the size of the complement, so take a look at what you're removing from the graph and how many edges it takes to do it, that's what we call the conductance of this set.
People want low-conductance cuts: those are the ones where many vertices can be removed without cutting too many edges. The minimum conductance over all sets of vertices is called the conductance of the graph. Cheeger's inequality provides an amazing relationship between conductance and lambda 2. To state it, assume for now that every edge has weight one and every vertex has degree at most d, where the degree is just the number of edges touching the vertex. The lower bound is easy: the fact that the conductance is at least lambda 2 over 2 comes from taking the characteristic vector of a set, orthogonalizing it with respect to the all-ones vector, and applying Courant-Fischer. It's really the right-hand side that is Cheeger's inequality proper; the analogous result was proved by Cheeger for Riemannian manifolds.
This extension of Cheeger's inequality to graphs was obtained by several different groups of authors, each with a slightly different flavor, about a decade later. It says that the conductance is at most the square root of two times the maximum degree times lambda 2, and moreover the procedure I showed you on the previous slide gives you a set of vertices whose conductance satisfies that bound. So you can use v2 to find sets of vertices of low conductance. I have found this incredibly useful in many applications, and in fact both sides of this inequality are useful to me. The left side is incredibly useful because, well, if lambda 2 is large, it tells you the conductance is high; that means there is no good way to cut your graph, and you won't find communities. Sometimes you want to know there is no community structure in your graph. It also means that if lambda 2 is big, your graph is what we call an expander; random walks mix quickly in it, and many beautiful things happen.
The right side, on the other hand, is useful because if lambda 2 is small, then I can cut my graph nicely; I can find low-conductance cuts and so on; it lets me analyze things. I should mention that both sides of this inequality are tight, at least asymptotically; I haven't given you the sharpest constants, but both phenomena can occur. On the left side, you can have conductance close to lambda 2, and a good example is the complete binary tree. You form it by taking one vertex, giving it two children, giving each of those two children two children, and so on; let's stop at some point and make it finite, with n vertices. If you cut that red edge, one of the edges attached to the root, you get a cut of conductance about 2/n, and lambda 2 is about 1/n, so in this case the conductance and lambda 2 are almost the same. In contrast, take the path graph, which just has its vertices on a line, with adjacent nodes connected. It has more or less the same cut: cutting an edge in the middle gives conductance about 2/n, but lambda 2 is on the order of 1/n squared; in fact, it is very close to pi squared over n squared. So you can also attain the quadratic extreme of Cheeger's inequality. I, and many other people, have wondered for many years: what is the precise way of explaining what lambda 2 is
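These two examples are easy to check numerically; a sketch of mine:

```python
import numpy as np

def lambda2(edges, n):
    L = np.zeros((n, n))
    for a, b in edges:
        L[a, b] -= 1; L[b, a] -= 1
        L[a, a] += 1; L[b, b] += 1
    return np.linalg.eigvalsh(L)[1]

n = 255                                    # 2**8 - 1 vertices
path = [(i, i + 1) for i in range(n - 1)]
tree = [(i, 2 * i + 1) for i in range((n - 1) // 2)] + \
       [(i, 2 * i + 2) for i in range((n - 1) // 2)]
print(lambda2(path, n), np.pi**2 / n**2)   # path: close to pi^2 / n^2
print(lambda2(tree, n), 1 / n)             # binary tree: on the order of 1/n
```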
actually measuring? This quadratic gap is a little perplexing. So I want to tell you about a theorem that was published last month that gives us such a characterization, but first let me sharpen Cheeger's inequality a little. I want to refine the notion of conductance and refine the eigenvalues. I will now consider general weighted graphs, and for weighted graphs I measure degrees by weight: the degree of a vertex is the sum of the weights of the edges attached to it, and the degree of a set of vertices is simply the sum of the degrees of the vertices in the set. My refined notion of conductance measures the sum of the weights of all the edges leaving a set, divided by the degree of the set or of its complement, whichever is smaller.
We also want to change the matrix a little, using the normalized Laplacian: the ordinary Laplacian multiplied by the inverse of D, where D is the diagonal matrix of vertex degrees. If we use the eigenvalues of that matrix, we get rid of the degree term in Cheeger's inequality, which makes it a little better: the conductance is at most the square root of 2 times lambda 2. Well, that was our first step in refining this old result. Here's the new one. It appears in a paper by Aaron Schild, who is a graduate student at Berkeley, finishing this year, I might add. And, oddly enough, there was a paper posted a few days later on the arXiv, by Miller, Walkington, and Wang, with some other theorems that also imply this result; I still don't know why two groups thought of the same thing 40 or more years after Cheeger's inequality. But anyway, now we really understand what lambda 2 is doing. To explain it, I need to tell you a little about equivalent networks. Physicists consider the following: say you have a spring network, and you want to understand how it behaves on only a subset of the nodes. They consider eliminating the other nodes, the ones you don't care about, and it turns out you can understand the network on the nodes you are interested in by removing the others and drawing another spring network: when you eliminate nodes, you really do get another spring network on the nodes that remain.
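A sketch of the computation (mine): the eigenvalues of D^{-1} L are the same as those of the symmetric matrix D^{-1/2} L D^{-1/2}, which is the convenient form to compute with.

```python
import numpy as np

def normalized_lambda2(weighted_edges, n):
    # Build the weighted Laplacian, then symmetrically normalize it;
    # D^{-1/2} L D^{-1/2} is similar to D^{-1} L, so same eigenvalues.
    L = np.zeros((n, n))
    for a, b, w in weighted_edges:
        L[a, b] -= w; L[b, a] -= w
        L[a, a] += w; L[b, b] += w
    d_inv_sqrt = np.diag(1 / np.sqrt(np.diag(L)))   # from weighted degrees
    N = d_inv_sqrt @ L @ d_inv_sqrt
    return np.linalg.eigvalsh(N)[1]

print(normalized_lambda2([(0, 1, 1.0), (1, 2, 2.0), (0, 2, 0.5)], 3))
```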
Many of us learned about this in physics: it's how springs combine in series. If you have a chain of springs, you can treat it as one big spring between the endpoints, with a much lower spring constant; or you learn about springs in parallel; or, if you are very lucky, you learn the Y-Delta transformation, and that lets you eliminate anything. This actually corresponds to Gaussian elimination in the Laplacian matrix. Schild's theorem interprets lambda 2 in terms of conductance, not necessarily in the original graph, but in the effective graph on a subset of the vertices, the equivalent network. What it says is: for every graph G there are two sets of vertices S1 and S2 such that, if we form the equivalent network on their union, then the conductance of S1 in that equivalent network is a constant-factor approximation of lambda 2 in the original graph. Perhaps I should write this the other way around: Schild says that lambda 2 is approximated by the conductance of S1 in the equivalent network. It is a very clean characterization of lambda 2. To see what this does for the path graph, where there is a big discrepancy between conductance and lambda 2 (lambda 2 is like 1/n squared and the conductance is like 2/n): remove the middle third of the vertices. If you remove the middle third of the vertices, you replace those n/3 vertices by one spring, and it's very weak; it has spring constant about 3/n. Now if I look at, say, the left half of the remaining vertices, the set S1, which is n/3 vertices, they are connected by one very weak spring, with constant about 3/n, to the other side; so S1 has conductance on the order of 1/n squared. That's at least a quick example of how Schild's theorem explains the phenomenon people have been observing for a long, long time.
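Eliminating vertices from a spring network is just Gaussian elimination, a Schur complement, in the Laplacian; here is a small sketch of mine checking the series rule that two unit springs in series become one spring of constant 1/2.

```python
import numpy as np

def eliminate(L, keep):
    # Schur complement of the Laplacian onto the vertices in `keep`:
    # the Laplacian of the equivalent spring network on those vertices.
    drop = [v for v in range(L.shape[0]) if v not in keep]
    K, D = np.ix_(keep, keep), np.ix_(drop, drop)
    KD, DK = np.ix_(keep, drop), np.ix_(drop, keep)
    return L[K] - L[KD] @ np.linalg.inv(L[D]) @ L[DK]

# The path 0-1-2 with unit springs; eliminate the middle vertex.
L = np.array([[ 1, -1,  0],
              [-1,  2, -1],
              [ 0, -1,  1]], dtype=float)
print(eliminate(L, [0, 2]))   # [[0.5, -0.5], [-0.5, 0.5]]: springs in series
```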
OK, now I'd like to tell you about something completely different that eigenvalues and eigenvectors are good for: the graph isomorphism problem. You'd think that if you understand anything about graphs, you should be able to tell when two of them are the same. Take a look at these two graphs: are they the same graph? Well, I know they are; they are different drawings of the Petersen graph, and I can give you a labeling of the vertices. Once you see that labeling, you will see that these two are the same graph. The graph isomorphism problem asks whether there is a labeling under which two graphs become the same. It is a disturbingly difficult problem. OK, as a computer scientist I can't prove to you that it's hard, but I can tell you that there is a long history of algorithms that we can prove always work. In fact, there was a breakthrough two years ago: Babai came up with the fastest algorithm known.
It's not polynomial time, but it is less than exponential; it's somewhere in between. There are many heuristics that help solve this problem in practice, and eigenvalues and eigenvectors can help, which is why I want to tell you a little about this. First, let me tell you why you should care. The reason this hardness matters is that, in general, it is knowing whether two things are the same that is difficult. There are many situations where you want to measure the distance between some objects, or have a notion of how similar they are; but if you can't even tell whether they are the same,
you're not going to have any success with that, at least algorithmically speaking. So, for example, you could say: I give you some shape in d-dimensional space, and I give you another one that is a rotation of it. Well, that is equivalent to this problem of testing whether two graphs are isomorphic, so it will be hard, at least in the worst case. So let me tell you how eigenvalues and eigenvectors can help. First, notice that changing the labels does not change the eigenvalues, so if two graphs have different eigenvalues, they are different. That already lets you distinguish most graphs, but it doesn't necessarily tell you when two graphs are the same.
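A sketch of that first test (mine): compare sorted eigenvalue lists. Different spectra certify non-isomorphism; equal spectra are inconclusive, since non-isomorphic cospectral graphs exist.

```python
import numpy as np

def spectra_differ(A1, A2, tol=1e-8):
    # True means the graphs are certainly not isomorphic.
    e1, e2 = np.linalg.eigvalsh(A1), np.linalg.eigvalsh(A2)
    return e1.shape != e2.shape or not np.allclose(e1, e2, atol=tol)

# Relabeling is conjugation by a permutation matrix, so spectra match.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)      # a path on 3 vertices
P = np.eye(3)[[2, 0, 1]]                    # a permutation matrix
print(spectra_differ(A, P @ A @ P.T))       # False: same spectrum
```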
OK, and permuting the labels only permutes the entries of the eigenvectors; or, to put it more precisely, if you permute the labels of a graph, you don't change those spectral graph drawings I was showing you before. So you can go a long way towards proving that two graphs are isomorphic, or towards finding the isomorphisms, by looking at their spectral drawings; and if you really want to push this to the limit, using the right computational group theory, well, there is a theorem of Babai, Grigoryev, and Mount that tells you that if all the eigenvalues of a graph have multiplicity bounded by a constant, then you can test isomorphism in polynomial time. I'll give you two hints of how this is done, partly to give you more intuition about some of the strange things that can happen with eigenvectors of graphs. Let's consider the extreme case of bounded multiplicity: every eigenvalue has multiplicity 1, which means each eigenvector is determined up to sign, but only up to sign: if v is an eigenvector, then -v is an eigenvector. That means that for any graph I was drawing this way, there are four possible drawings. So now, if you give me two graphs, say this graph and another one, I should be able to tell you whether they are isomorphic: I make these spectral drawings, trying the four sign choices, and if some pair of drawings agree, great, because that shows me a way to align the vertices one by one, and it gives me the isomorphism; if the drawings are different, I know the graphs are different. Unfortunately, it doesn't always give me a way to align the vertices. I was very lucky in these pictures, in which each vertex is mapped to a distinct location; if two vertices land in the same place, or many vertices land in the same place, then this does not fully resolve the isomorphisms, and that can happen. Let me skip this slide... actually, I'll come back to the reason for this slide; I can say it in a slightly simpler way. Instead of worrying about isomorphisms right now, let's think about automorphisms:
What we do is use the eigenvectors to find classes of vertices and then once you defined a class of vertices, you look at the full set of eigenvectors and find the group of I Should morphisms that act on this in this case c2 crosses V 2 and we do that for each class of vertices 5 and 7 or a class and well 8 is its own class 6 is its own class and then use some elementary computational group theory after you find it to assemble them so that it divides the vertices into classes looking at the eigenvectors, find the group of automorphisms for each class and then have to take their intersection to find the group of eigenmorphisms if the complete graph.
The main thing to know is that if all the eigenmultiplicities are bounded, then you can do this pretty quickly using some elementary computational group theory. Well, to finish, I will tell you just one of my results that deals with. with when graphs are approximations of each other now okay, I said if I can't tell if two graphs are equal, how can I tell if their approximations of each other are wrong? To make this easier, let's fix the vertex labeling so that it keeps all vertices in the state just imagine changing the edges in this case we say that one graph is an epsilon approximation of another if their quadratic Laplacian forms are approximately equal , but what I mean is that for all real vectors the limit of each set of vertices is approximately the same, at least if you count the edges by weight, so you could have much fewer edges, but if they have higher weights then you can have the same limit, meaning that system solutions of linear equations in these matrices are approximately the same and it also means that their behavior of the spring networks in the graphs is approximately the same.
So, one of my theorems, which I really loved, was joint work with Josh Batson and Nikhil Srivastava. We showed that every graph can be approximated by a sparse graph: you fix epsilon, and if you have a graph on n vertices, you can find an epsilon-approximation of it with basically n over epsilon-squared edges. That means you don't need most of the edges, and you can still preserve almost everything you would want to know about the structure of the graph. For example, if I take the complete graph on 10 vertices, the right sparse approximation is the Petersen graph, and you will notice that I have drawn the edges of the Petersen graph a little thicker, because we increase their weight. Think of it as a mass-conservation phenomenon: if I'm going to get rid of some edges,
I have to increase the weights of others. In general, these nice sparse approximations of large complete graphs are expanders. If you don't know what an expander is, think of a random regular graph like the one I showed you before; they all have large eigenvalues, and you can show they are close approximations of the complete graph. If you do know what an expander is, then you know that the best expanders are Ramanujan expanders, and you are probably wondering how these compare. These have a little less than twice as many edges as the Ramanujan expander you would use to approximate the complete graph. OK, so if you want to learn more, look at my website and you will find a lot of useful things.
I have links to popular articles related to this material, and to other talks where I've spoken about spectral graph theory at length; there are links to my lecture notes from my spectral graph theory courses; and if you really want to dig deeper, I keep a list of many papers I like on related topics: sparsification, graph clustering, solving Laplacian equations, and more. You will find the links. Thanks. We have time for some questions, if there are any. Oh, what happens if you use those springs on graphs that aren't planar? OK, so if you try this spring embedding on graphs that aren't planar, nothing is as clean. There is a paper, and I'll be embarrassed that I can't remember the list of co-authors right now, that addresses some reasonable generalizations of this, at least to graphs that can be drawn very well in higher dimensions; so there are some interesting generalizations. For graphs that are not planar, when you try to do it in the plane,
I can't say anything as nice; these are reasonable guesses. What if you have a graph that you can actually draw on a genus-g surface? OK, you'll probably have to make some decisions about what you pin, but after that I think you'd find something nice; I'm not really sure. There are some generalizations of spectral graph theory that build on analysis that has been done for genus-g surfaces and the graphs that can be drawn on them, but it's a good question; there are probably good things you can say. Yes, that was almost exactly my question. Any other questions? Yes, just one last thing: what is the computational complexity of computing these epsilon-approximations?
Oh, that's a good question. Right now it's almost-linear time, but not practical to implement. There's an amazing paper by Yin Tat Lee and He Sun that was published maybe two or three years ago; some of my students and I have tried to implement it, and while it is theoretically almost-linear, we couldn't make it practical. There is also very good heuristic code out there, so you can find code that apparently does a very good job; and we can also check whether you have a good approximation, which really is practical and fast, up to log-squared factors, and that's fine.
I didn't put in a link advertising my software package, but I have a package called Laplacians.jl that lets you compute a lot of this stuff, and in particular quickly compute eigenvalues, check approximations, and things like that; so you can use it to check whether you have one. There is a very fast algorithm based on random sampling that is almost as good in terms of approximation quality, but it loses a log n. So there are some things you can do, but for the best ones we are still waiting for good code that we can prove works and implement it. If you use more eigenvectors, more than two, and then project onto a plane or into 3D, do you get any interesting drawings that way?
Oh yeah, OK, you can use more eigenvectors and get great pictures, and I've actually spent a lot of time drawing them and rotating them in 3D. You can sort of turn them, though getting an idea of 4D is really difficult; but who knows, maybe with some virtual reality technology or something we'll be able to think about that soon too. I mean, we implemented the 2D drawing, and I have code to make 3D images that you can rotate. I don't remember if I put it in the package, but if I didn't, email me, because it's just two or three more lines of code on top of what's there, OK?
Well, let's thank our speaker again for a great talk.
