Complementary Vanishing Graphs
Abstract
Given a graph $G$ with vertices $\{1,\ldots,n\}$, we define $\mathcal{S}(G)$ to be the set of real symmetric $n\times n$ matrices $A=[a_{ij}]$ such that for $i\neq j$ we have $a_{ij}\neq 0$ if and only if $ij\in E(G)$. Motivated by the Graph Complement Conjecture, we say that a graph $G$ is complementary vanishing if there exist matrices $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$ such that $AB=O$. We provide combinatorial conditions for when a graph is or is not complementary vanishing, and we characterize which graphs are complementary vanishing in terms of certain minimal complementary vanishing graphs. In addition to this, we determine which graphs on at most 8 vertices are complementary vanishing.
Keywords: Graph Complement Conjecture, Minimum rank, Maximum nullity, Inverse eigenvalue problem for graphs
AMS subject classifications: 05C50, 15A18, 15B57, 65F18.
1 Introduction
Let $G$ be a simple graph with vertex set $V(G)=\{1,\ldots,n\}$. The set of symmetric matrices described by $G$, denoted $\mathcal{S}(G)$, is the set of all real symmetric $n\times n$ matrices $A=[a_{ij}]$ such that for $i\neq j$ we have $a_{ij}\neq 0$ if and only if $ij\in E(G)$. Note that no restrictions are placed on the diagonal entries of $A$. Given a graph $G$, the inverse eigenvalue problem of a graph (IEP-G for short) refers to the problem of determining the possible spectra of matrices in $\mathcal{S}(G)$. A large amount of work has been done in this area; see for example [12, 11, 5, 1, 7, 6].
Specifying exactly which spectra a graph can achieve is generally hard, and because of this, much of the work on the IEP-G considers parameters that measure more general spectral properties. For example, the minimum rank $\operatorname{mr}(G)$ of a graph $G$ is defined as the minimum rank of a matrix $A\in\mathcal{S}(G)$. The maximum nullity $M(G)$ of a graph $G$ is defined as the maximum nullity of a matrix $A\in\mathcal{S}(G)$. Since $\mathcal{S}(G)$ is closed under translations by $\lambda I$ for real values $\lambda$, the parameter $M(G)$ is also equal to the largest possible multiplicity of an eigenvalue of a matrix in $\mathcal{S}(G)$. It follows from the definitions that $\operatorname{mr}(G)+M(G)=n$ whenever $G$ is an $n$-vertex graph. The minimum rank and maximum nullity of graphs were one of the main subjects of the 2006 American Institute of Mathematics workshop [3]. This workshop was a catalyst for research on the IEP-G, and there the following conjecture was posed.
Conjecture 1.1 (Graph Complement Conjecture).
For any graph $G$ of order $n$,
$$M(G)+M(\overline{G})\geq n-2,$$
where $\overline{G}$ denotes the complement of $G$. Equivalently,
$$\operatorname{mr}(G)+\operatorname{mr}(\overline{G})\leq n+2.$$
Conjecture 1.1 is an example of a Nordhaus–Gaddum problem, that is, a problem where one tries to prove bounds on $f(G)+f(\overline{G})$ where $f$ is some graph-theoretic function. An analogous Nordhaus–Gaddum conjecture for the Colin de Verdière number (defined in [8]) was made by Kotlov et al. [14]. Levene, Oblak and Šmigoc verified a Nordhaus–Gaddum conjecture for the minimum number of distinct eigenvalues for several graph families [15].
Conjecture 1.1 is open in general and is evidently difficult. Some results related to Conjecture 1.1 for joins of graphs were established by Barioli et al. [4], and the authors also showed that Conjecture 1.1 is true for all graphs of small order. Li, Nathanson, and Phillips [16] showed that it suffices to prove Conjecture 1.1 for a certain class of graphs called complement critical graphs. Theorem 3.16 in [3] implies that Conjecture 1.1 is true for trees. On the other hand, it was shown in [2] that the zero forcing number $Z(G)$ is an upper bound on $M(G)$, and it is known that $Z(G)+Z(\overline{G})\geq n-2$ for any graph $G$ on $n$ vertices [10, 13].
With Conjecture 1.1 in mind, we make the following definition.
Definition 1.2.
A graph $G$ is said to be complementary vanishing if there are matrices $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$ such that $AB=O$ (the zero matrix).
The motivation for this definition is the following observation.
Proposition 1.3.
If $G$ is an $n$-vertex graph which is complementary vanishing, then
$$M(G)+M(\overline{G})\geq n.$$
In particular, Conjecture 1.1 holds for $G$.
Proof.
By assumption there exist $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$ such that $AB=O$. Thus the column space of $B$ is contained in the null space of $A$, so $\operatorname{rank}(A)+\operatorname{rank}(B)\leq n$, and hence the sum of the nullities of $A$ and $B$ is at least $n$. This in turn implies that $M(G)+M(\overline{G})\geq\operatorname{null}(A)+\operatorname{null}(B)\geq n$, as desired. ∎
The goal of this paper is to establish necessary and sufficient conditions for a graph to be complementary vanishing. Our main result characterizes which graphs are complementary vanishing in terms of a class of “minimal” complementary vanishing graphs $\mathcal{C}_0$.
Definition 1.4.
• Let $\mathcal{C}_0$ denote the set of complementary vanishing graphs $G$ such that both $G$ and $\overline{G}$ are connected.
• Let $\mathcal{C}$ denote the smallest set of graphs containing $\mathcal{C}_0$ which is closed under taking disjoint unions, joins, and complements. (We will use $G\sqcup H$ to denote the disjoint union of two graphs $G$ and $H$; the join $G\vee H$ is obtained by taking the disjoint union of $G$ and $H$ and then adding all possible edges between vertices of $G$ and vertices of $H$.)
• Let $\mathcal{D}$ denote the set of all graphs $G$ such that either in $G$ or in $\overline{G}$ there exist distinct vertices $u,v,w,x$ with $N(u)=\{v\}$, $w\in N(v)$, and $x\notin N[v]$.
Theorem 1.5.
A graph $G$ is complementary vanishing if and only if $G\in\mathcal{C}\setminus\mathcal{D}$.
Roughly speaking, Theorem 1.5 states that a graph is complementary vanishing if and only if it can be constructed out of minimal complementary vanishing graphs while avoiding a particular combinatorial structure.
Example 1.6.
Let be the graph obtained by adding a perfect matching to . We will use Theorem 1.5 to determine whether is complementary vanishing or not. It is easy to check that , so will be complementary vanishing if and only if . Observe that
Thus, to have , it would suffice to have , and it is easy to verify that this is the case. This proves that is complementary vanishing.
One can also use Theorem 1.5 to determine which trees are complementary vanishing.
Corollary 1.7.
A tree is complementary vanishing if and only if it is a star.
Proof.
If $T$ is a tree which is not a star, then $T$ contains an induced path $uvwx$ where $u$ is a leaf. Note that $N(u)=\{v\}$, $w\in N(v)$, and $x\notin N[v]$. We conclude that $T\in\mathcal{D}$, and hence $T$ is not complementary vanishing by Theorem 1.5.
If $T$ is a star, then $T\in\mathcal{C}$ (since $T$ is the disjoint union of $K_1$'s joined to a $K_1$) and it is not difficult to show that $T\notin\mathcal{D}$, so Theorem 1.5 implies that $T$ is complementary vanishing. Alternatively, one can show that a star $T$ is complementary vanishing directly by taking $A$ to be the adjacency matrix of $T$ and $B$ to be the Laplacian matrix of the complement of $T$, since in this case $AB=O$. ∎
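The direct construction at the end of the proof is easy to verify numerically. The following small check (our own illustration using NumPy, not code from the paper) does so for the star $K_{1,4}$.

```python
# Toy verification (ours): A is the adjacency matrix of the star K_{1,4}
# (center 0) and B is the Laplacian of its complement (an isolated vertex
# together with K_4).  Then A @ B is the zero matrix, certifying that the
# star is complementary vanishing.
import numpy as np

n = 5
A = np.zeros((n, n))
A[0, 1:] = A[1:, 0] = 1.0                  # adjacency matrix of the star
comp = np.ones((n, n)) - np.eye(n) - A     # adjacency matrix of the complement
B = np.diag(comp.sum(axis=1)) - comp       # Laplacian of the complement
print(np.allclose(A @ B, 0))               # True
```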
By Theorem 1.5, it suffices to characterize the set $\mathcal{C}_0$ in order to characterize the set of complementary vanishing graphs. To this end, we determine exactly which graphs on at most 8 vertices are in $\mathcal{C}_0$.
Theorem 1.8.
A graph on at most 8 vertices is in $\mathcal{C}_0$ if and only if it is one of the graphs listed in Appendix A.
A precise breakdown of the number of pairs $\{G,\overline{G}\}$ of connected graphs on $n$ vertices which are complementary vanishing is given in Table 1.
Table 1: The number of pairs $\{G,\overline{G}\}$ of connected graphs on $n$ vertices which are or are not complementary vanishing.

$n$ | not complementary vanishing | complementary vanishing | total pairs
1 | 0 | 1 | 1
2 | 0 | 0 | 0
3 | 0 | 0 | 0
4 | 1 | 0 | 1
5 | 5 | 0 | 5
6 | 34 | 0 | 34
7 | 327 | 4 | 331
8 | 4917 | 32 | 4949
Organization and Notation. The rest of the paper is organized as follows. In Section 2, we give combinatorial conditions for when a graph is complementary vanishing or not. In Section 3, we prove Theorem 1.5 by studying a slightly stronger notion of being complementary vanishing. In Section 4, we prove Theorem 1.8 and describe several algorithms which can be used to test whether a graph is complementary vanishing or not. We close with a few open problems in Section 5.
Throughout we use standard notation from graph theory and linear algebra. Given a graph $G$ and a vertex $v$, we define the open neighborhood $N_G(v)$ to be the set of vertices adjacent to $v$ in $G$, and the closed neighborhood $N_G[v]=N_G(v)\cup\{v\}$. We will drop the subscript on $N$ when the graph is clear from context.
We write $O_{m\times n}$ for the $m\times n$ zero matrix, which we denote simply by $O$ whenever the dimensions are clear from context. We use bold lower case letters to denote vectors, and in particular we write $\mathbf{0}$ and $\mathbf{1}$ to denote the all 0's and all 1's vectors.
2 Combinatorial Conditions for Complementary Vanishing Graphs
The results in this section use structural properties of a graph $G$ to determine whether it is possible for $G$ to be complementary vanishing. Most of our conditions show that a graph cannot be complementary vanishing because the diagonal entries of the matrices $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$ are over-constrained. The following observation leads to our first condition on the diagonal entries of $A$ and $B$. Here and throughout the paper, if $S$ is a set of vertices of a graph then $N(S)=\bigcup_{v\in S}N(v)$.
Observation 2.1.
If $v$ is a vertex of a graph $G$, then $N_G(v)\cap N_{\overline{G}}(v)=\emptyset$ and $N_G(v)\cup N_{\overline{G}}(v)=V(G)\setminus\{v\}$.
Lemma 2.2.
Let $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$. If $AB=O$, then $a_{ii}=0$ or $b_{ii}=0$ for each vertex $i$.
Proof.
Because $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$, taking the dot product of the $i$th row of $A$ and the $i$th column of $B$ gives
$$(AB)_{ii}=\sum_{k}a_{ik}b_{ki}=\sum_{k\in N_G[i]}a_{ik}b_{ki}=a_{ii}b_{ii},$$
where the second equality used, for example, that $a_{ik}=0$ if $k\notin N_G[i]$ by definition of $\mathcal{S}(G)$, and the third used Observation 2.1. Since $AB=O$, this reduces to $a_{ii}b_{ii}=0$. This implies $a_{ii}=0$ or $b_{ii}=0$. ∎
The key fact in proving Lemma 2.2 is that $N_G(i)$ is contained in $N_G[i]$ (since this made the final sum collapse to the single term $a_{ii}b_{ii}$). This step in the proof can be generalized to pairs of vertices $u,v$ when the closed neighborhood of $v$ contains the neighborhood of $u$.
Lemma 2.3.
Let $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$ be such that $AB=O$. For distinct vertices $u,v$, if $N_G(u)\subseteq N_G[v]$, then we have the following.
(1) If $uv\in E(G)$, then $b_{vv}=0$.
(2) If $uv\notin E(G)$, then $a_{uu}=0$.
Proof.
Let $u,v$ be distinct vertices with $N_G(u)\subseteq N_G[v]$. Then for every vertex $k\notin\{u,v\}$ we have $a_{uk}b_{kv}=0$, since $a_{uk}\neq 0$ forces $k\in N_G(v)$ and hence $b_{kv}=0$. Taking the dot product of the $u$th row of $A$ and the $v$th column of $B$ gives
$$0=\sum_k a_{uk}b_{kv}=a_{uu}b_{uv}+a_{uv}b_{vv}.$$
Suppose $uv\in E(G)$. Then $a_{uv}\neq 0$ and $b_{uv}=0$ (since $uv\notin E(\overline{G})$). Therefore, $a_{uv}b_{vv}=0$, which implies that $b_{vv}=0$. A completely symmetric argument gives the result when $uv\notin E(G)$. ∎
We can also extract information from the dot product of the $u$th row of $A$ and the $v$th column of $B$ if we “almost” have $N_G(u)\subseteq N_G[v]$.
Lemma 2.4.
Let $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$ be such that $AB=O$. For distinct vertices $u,v$, if $N_G(u)\setminus N_G[v]=\{w\}$ for some vertex $w$, then we have the following.
(1) If $uv\in E(G)$, then $b_{vv}\neq 0$ and $a_{vv}=0$.
(2) If $uv\notin E(G)$, then $a_{uu}\neq 0$ and $b_{uu}=0$.
Proof.
Let $w$ be the unique vertex of $N_G(u)\setminus N_G[v]$. Then $a_{uw}\neq 0$ and $b_{wv}\neq 0$, and taking the dot product of the $u$th row of $A$ and the $v$th column of $B$ gives
$$0=\sum_k a_{uk}b_{kv}=a_{uu}b_{uv}+a_{uv}b_{vv}+a_{uw}b_{wv}.$$
Suppose $uv\in E(G)$. Then we have that $a_{uv}\neq 0$ and $b_{uv}=0$ (since $uv\notin E(\overline{G})$). This implies that $a_{uv}b_{vv}+a_{uw}b_{wv}=0$ where $a_{uv}$, $a_{uw}$, and $b_{wv}$ are all nonzero. Solving for $b_{vv}$ shows that $b_{vv}=-a_{uw}b_{wv}/a_{uv}\neq 0$. Moreover, $a_{vv}=0$ by Lemma 2.2. A completely symmetric argument gives the result when $uv\notin E(G)$. ∎
Applying both Lemma 2.3 and Lemma 2.4 can lead to contradictions along the diagonals of $A$ and $B$. This gives the following corollary, which helps motivate the set $\mathcal{D}$ defined in the introduction.
Corollary 2.5.
Let $G$ be a graph such that there exist distinct vertices $u,v,w,x$ with $N(u)=\{v\}$, $w\in N(v)$, and $x\notin N[v]$. Then $G$ is not complementary vanishing.
Proof.
Suppose for the sake of contradiction that there exist $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$ with $AB=O$. Since $w\in N(v)$, we have $N(u)=\{v\}\subseteq N[w]$ and $uw\notin E(G)$, so Lemma 2.3(2) gives $a_{uu}=0$. On the other hand, since $x\notin N[v]$ we have $N(u)\setminus N[x]=\{v\}$ and $ux\notin E(G)$, so Lemma 2.4(2) gives $a_{uu}\neq 0$. This is a contradiction, so $G$ is not complementary vanishing. ∎
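For concreteness, the following small helper (our own illustration, not code from the paper) searches a graph, given by its neighborhood sets, for the configuration of Corollary 2.5; applying it to both $G$ and $\overline{G}$ tests the combinatorial condition defining $\mathcal{D}$.

```python
# A small helper (ours) that looks for the configuration of Corollary 2.5:
# a leaf u with neighbor v, another neighbor w of v, and a vertex x outside
# the closed neighborhood of v.
def has_forbidden_configuration(neighbors):
    """neighbors: dict mapping each vertex to the set of its neighbors."""
    vertices = set(neighbors)
    for u, nbrs in neighbors.items():
        if len(nbrs) != 1:
            continue                       # u must be a leaf
        v = next(iter(nbrs))
        if len(neighbors[v]) < 2:
            continue                       # v needs a neighbor w other than u
        if vertices - neighbors[v] - {v}:  # some x outside N[v]
            return True
    return False

# The path P_4 (vertices 0-1-2-3) contains the configuration, so it is not
# complementary vanishing; the star K_{1,3} does not contain it.
p4 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
star = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
print(has_forbidden_configuration(p4), has_forbidden_configuration(star))  # True False
```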
Lemma 2.2, Lemma 2.3, and Lemma 2.4 are the elementary calculations we use to show that particular graphs are not complementary vanishing. One improvement on these elementary calculations is to look for induced cycles with particular neighborhood properties.
Lemma 2.6.
Let be a graph that has an induced odd cycle on vertices and a vertex such that and
If and satisfy , then for some .
Note that is equivalent to saying that is adjacent to the entire “external neighborhood” of .
Proof.
Assume for the sake of contradiction that there exist such with for all . By taking the dot product of the row of with the column of , we obtain (writing indices mod and using the symmetry of and )
where we used that and the fact that the neighborhood of contains the external neighborhood of . Also note that each term in this sum is nonzero since implicitly we have assumed .
For convenience, define and for all . Notice that is an entry from if is even, and is an entry from when is odd. The relationship above can be rewritten as , where the indices of are written mod . In particular, this implies that amongst all of the terms , exactly of them are negative. Thus,
which is a contradiction. ∎
The following example shows the power of Lemma 2.6.
Example 2.7.
We end the section with some ways to construct complementary vanishing graphs. We say that two vertices $u$ and $v$ are twins in a graph $G$ if $N(u)\setminus\{v\}=N(v)\setminus\{u\}$. Note that we allow twins to be either adjacent or nonadjacent.
Proposition 2.8.
If $G$ is a graph such that every vertex has at least one twin, then $G$ is complementary vanishing.
Proof.
Partition the vertex set of $G$ into equivalence classes of twins so that $V(G)=V_1\cup\cdots\cup V_m$, where every two vertices of the same class $V_i$ are twins. By the hypothesis, $|V_i|\geq 2$ for each $i$. Let $\mathbf{x}_i,\mathbf{y}_i$ be orthogonal vectors that are nonzero on entries indexed by $V_i$ and zero otherwise. Since $|V_i|\geq 2$ for each $i$, we know that the vectors $\mathbf{x}_i,\mathbf{y}_i$ exist for each $i$. Notice that $\mathbf{x}_i^{\mathsf{T}}\mathbf{y}_j=0$ for all $i,j$.
Let $\mathcal{E}_A$ be the set of pairs $\{i,j\}$ (with $i=j$ allowed) such that the vertices of $V_i$ are adjacent in $G$ to the vertices of $V_j$, and let $\mathcal{E}_B$ be the analogous set of pairs for $\overline{G}$. Notice that every pair of distinct vertices $u\in V_i$ and $v\in V_j$ corresponds to a pair $\{i,j\}$ in exactly one of the sets $\mathcal{E}_A$ or $\mathcal{E}_B$, since every two vertices in $V_i$ and $V_j$ are twins. Let
$$A=\sum_{\{i,j\}\in\mathcal{E}_A}\left(\mathbf{x}_i\mathbf{x}_j^{\mathsf{T}}+\mathbf{x}_j\mathbf{x}_i^{\mathsf{T}}\right)$$
and
$$B=\sum_{\{i,j\}\in\mathcal{E}_B}\left(\mathbf{y}_i\mathbf{y}_j^{\mathsf{T}}+\mathbf{y}_j\mathbf{y}_i^{\mathsf{T}}\right),$$
where each term is a square matrix indexed by $V(G)$ resulting from an outer product. Notice that $A\in\mathcal{S}(G)$, $B\in\mathcal{S}(\overline{G})$, and that
$$AB=\sum_{\{i,j\}\in\mathcal{E}_A}\ \sum_{\{k,l\}\in\mathcal{E}_B}\left(\mathbf{x}_i\mathbf{x}_j^{\mathsf{T}}+\mathbf{x}_j\mathbf{x}_i^{\mathsf{T}}\right)\left(\mathbf{y}_k\mathbf{y}_l^{\mathsf{T}}+\mathbf{y}_l\mathbf{y}_k^{\mathsf{T}}\right)=O,$$
since every factor of the form $\mathbf{x}_j^{\mathsf{T}}\mathbf{y}_k$ is zero.
This completes the proof. ∎
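As a concrete illustration of this construction (a toy verification of ours, not taken from the paper), consider the cycle $C_4$, whose twin classes are the two pairs of opposite vertices; the matrices below follow the outer-product recipe above.

```python
# Toy verification (ours) of Proposition 2.8 on C_4, whose twin classes are
# {0, 2} and {1, 3}.  A is supported on the edges of C_4 and B on the edges
# of its complement (the two "diagonals"), and A @ B is the zero matrix.
import numpy as np

A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)    # a matrix in S(C_4)
B = np.array([[1, 0, -1, 0],
              [0, 1, 0, -1],
              [-1, 0, 1, 0],
              [0, -1, 0, 1]], dtype=float)   # a matrix in S(complement of C_4)
print(np.allclose(A @ B, 0))                 # True: C_4 is complementary vanishing
```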
Proposition 2.8 is tight in the sense that there are graphs $G$ which are not complementary vanishing where all but one vertex of $G$ has a twin. Indeed, consider $G=K_{1,2}\sqcup K_2$ with $u$ the center of the copy of $K_{1,2}$, $v$ and $w$ its leaves, and $x$ a vertex of the copy of $K_2$. Then $u$ does not have a twin, $N(v)=\{u\}$, $w\in N(u)$, and $x\notin N[u]$. Thus, $G$ is not complementary vanishing by Corollary 2.5.
Lemma 2.9.
Let be a graph with and such that . If , then the graph obtained by adding a new vertex with is complementary vanishing.
Proof.
Let . For convenience of illustrating the matrices, we assume and write
Thus, can be expanded as
Let and , where is the all ’s vector in . Then we may construct the following matrices
where is either
depending on if or . Note that with this we always have .
By direct computation of each block of , we have
Along with the fact that and , we have that and are both complementary vanishing. ∎
Note that the above proof easily generalizes to duplicating a vertex any number of times or duplicating multiple vertices.
3 Proof of Theorem 1.5
In order to prove Theorem 1.5, we will need to understand when the disjoint union and join of two graphs are complementary vanishing, and to do this we need to consider graphs which are in some sense “robust” with respect to being complementary vanishing.
3.1 Robust Graphs
A graph $G$ is said to be $A$-robust if there are matrices $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$ such that $AB=O$ and $\ker(A)$ contains a nowhere-zero vector. Similarly, we say that $G$ is $B$-robust if there are matrices $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$ such that $AB=O$ and $\ker(B)$ contains a nowhere-zero vector. Note that $A$-robust and $B$-robust graphs are in particular complementary vanishing.
Proposition 3.1.
Let $G_1$ and $G_2$ be two graphs. Then the following are equivalent:
(1a) The graphs $G_1$ and $G_2$ are both $A$-robust.
(1b) The disjoint union $G_1\sqcup G_2$ is $A$-robust.
(1c) The disjoint union $G_1\sqcup G_2$ is complementary vanishing.
Similarly, the following are equivalent:
(2a) The graphs $G_1$ and $G_2$ are both $B$-robust.
(2b) The join $G_1\vee G_2$ is $B$-robust.
(2c) The join $G_1\vee G_2$ is complementary vanishing.
Proof.
Suppose $G_1$ and $G_2$ are both $A$-robust. By definition, there exist matrices $A_1\in\mathcal{S}(G_1)$, $B_1\in\mathcal{S}(\overline{G_1})$, $A_2\in\mathcal{S}(G_2)$, $B_2\in\mathcal{S}(\overline{G_2})$ such that $A_1B_1=O$ and $A_2B_2=O$. Moreover, $\ker(A_1)$ and $\ker(A_2)$ contain nowhere-zero vectors $\mathbf{w}_1$ and $\mathbf{w}_2$, respectively. Thus, we have
$$A=\begin{bmatrix}A_1&O\\O&A_2\end{bmatrix}\in\mathcal{S}(G_1\sqcup G_2),\qquad B=\begin{bmatrix}B_1&\mathbf{w}_1\mathbf{w}_2^{\mathsf{T}}\\ \mathbf{w}_2\mathbf{w}_1^{\mathsf{T}}&B_2\end{bmatrix}\in\mathcal{S}(\overline{G_1\sqcup G_2}),$$
and $AB=O$. Moreover, the vector $(\mathbf{w}_1,\mathbf{w}_2)$ is a nowhere-zero vector in $\ker(A)$. Thus $G_1\sqcup G_2$ is $A$-robust, showing that (1a) implies (1b).
A graph being $A$-robust immediately implies that it is complementary vanishing, so (1b) implies (1c). To show (1c) implies (1a), suppose $G_1\sqcup G_2$ is complementary vanishing. Then there are matrices
$$A=\begin{bmatrix}A_1&O\\O&A_2\end{bmatrix}\in\mathcal{S}(G_1\sqcup G_2)\quad\text{and}\quad B=\begin{bmatrix}B_1&C\\C^{\mathsf{T}}&B_2\end{bmatrix}\in\mathcal{S}(\overline{G_1\sqcup G_2})$$
such that $AB=O$ for some matrices $A_1\in\mathcal{S}(G_1)$, $A_2\in\mathcal{S}(G_2)$, $B_1\in\mathcal{S}(\overline{G_1})$, $B_2\in\mathcal{S}(\overline{G_2})$, and some matrix $C$ which is nowhere-zero. Thus, we have $A_1B_1=O$ and $A_1C=O$. Since any column of $C$ is a nowhere-zero vector in $\ker(A_1)$, we know $G_1$ is $A$-robust. Similarly, $G_2$ is also $A$-robust, proving that (1c) implies (1a).
Notice that $G_1$ and $G_2$ satisfying one of (2a), (2b), or (2c) is equivalent to $\overline{G_1}$ and $\overline{G_2}$ satisfying one of (1a), (1b), or (1c), respectively. Therefore, the equivalence of (2a), (2b), and (2c) follows from the equivalence of (1a), (1b), and (1c). ∎
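To make the block construction in the implication (1a) ⇒ (1b) concrete, the following toy check (ours, not from the paper) glues together two copies of a $C_4$ certificate along nowhere-zero kernel vectors, exactly as in the proof above.

```python
# Toy check (ours) of the block construction for disjoint unions: given
# certificates (A1, B1) and (A2, B2) with nowhere-zero kernel vectors w1, w2,
# the matrices A = diag(A1, A2) and B = [[B1, w1 w2^T], [w2 w1^T, B2]]
# satisfy A @ B = 0 and have the support pattern of the disjoint union and
# its complement.
import numpy as np

A1 = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], dtype=float)
B1 = np.array([[1, 0, -1, 0], [0, 1, 0, -1], [-1, 0, 1, 0], [0, -1, 0, 1]], dtype=float)
w1 = np.array([1.0, 1.0, -1.0, -1.0])          # nowhere-zero vector in ker(A1)
A2, B2, w2 = A1.copy(), B1.copy(), w1.copy()   # second copy of the C_4 certificate

A = np.block([[A1, np.zeros((4, 4))], [np.zeros((4, 4)), A2]])
B = np.block([[B1, np.outer(w1, w2)], [np.outer(w2, w1), B2]])
print(np.allclose(A1 @ w1, 0), np.allclose(A @ B, 0))   # True True
```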
The next lemma gives an effective way of finding nowhere-zero vectors $\mathbf{x}$ such that $M\mathbf{x}$ is also nowhere-zero.
Lemma 3.2.
If $M$ is a matrix which does not contain a row of zeros, then there exists a nowhere-zero vector $\mathbf{x}$ such that $M\mathbf{x}$ is nowhere-zero.
Proof.
Suppose that $\mathbf{y}=M\mathbf{x}$ is a vector in the column space of $M$ with the fewest zero entries. For the sake of contradiction, suppose that $y_i=0$ for some $i$. Since $M$ does not contain a row of zeros, there exists some column $\mathbf{m}_j$ of $M$ whose $i$th entry $m_{ij}$ is nonzero. Choose $\epsilon\neq 0$ so that
$$\epsilon\neq-\frac{y_k}{m_{kj}}\quad\text{for every }k\text{ with }m_{kj}\neq 0.$$
Notice that $\mathbf{y}+\epsilon\mathbf{m}_j$ is in the column space of $M$ and has fewer zero entries than $\mathbf{y}$ by construction. This contradiction leads to the conclusion that $\mathbf{y}$ is a nowhere-zero vector.
Suppose that $\mathbf{x}$ is a vector with the minimum number of zero entries such that $M\mathbf{x}$ is nowhere-zero (this vector exists by the previous paragraph). For the sake of contradiction, suppose that $x_i=0$ for some $i$. Let $\mathbf{m}_i$ be the $i$th column of $M$. Choose $\epsilon\neq 0$ such that
$$\epsilon\neq-\frac{(M\mathbf{x})_k}{m_{ki}}\quad\text{for every }k\text{ with }m_{ki}\neq 0.$$
Recall that $|(M\mathbf{x})_k|$ is positive since $M\mathbf{x}$ is nowhere-zero. Notice that $\mathbf{x}+\epsilon\mathbf{e}_i$ has exactly one less zero entry than $\mathbf{x}$ and $M(\mathbf{x}+\epsilon\mathbf{e}_i)=M\mathbf{x}+\epsilon\mathbf{m}_i$ is nowhere-zero by construction. This contradiction leads to the conclusion that $\mathbf{x}$ is a nowhere-zero vector such that $M\mathbf{x}$ is also nowhere-zero. ∎
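In floating-point practice one rarely needs the explicit perturbation argument: if $M$ has no zero row, a random Gaussian vector already has the desired property with probability 1. The following sketch (ours, for illustration only) shows this.

```python
# Practical illustration (ours) of Lemma 3.2: if M has no zero row, then a
# random Gaussian vector x is nowhere-zero and M @ x is nowhere-zero with
# probability 1, since each entry of x and of M @ x is a nontrivial
# continuous random variable.
import numpy as np

rng = np.random.default_rng(1)
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0],
              [4.0, 0.0, 0.0]])            # no row of zeros
x = rng.standard_normal(M.shape[1])
print(np.all(x != 0), np.all(M @ x != 0))  # True True (almost surely)
```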
By using this lemma, we give some sufficient conditions for a complementary vanishing graph to be $A$-robust or $B$-robust.
Lemma 3.3.
Let $G$ be a complementary vanishing graph.
(1) If $G$ has no dominating vertices, then $G$ is $A$-robust.
(2) If $G$ has no isolated vertices, then $G$ is $B$-robust.
Proof.
Since $G$ is complementary vanishing, there are matrices $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$ such that $AB=O$. First assume that $G$ has no dominating vertices. This means $\overline{G}$ has no isolated vertices, so $B$ has no row of zeros, and thus the column space of $B$ contains a nowhere-zero vector $\mathbf{b}$ by Lemma 3.2. Having $AB=O$ implies $A\mathbf{b}=\mathbf{0}$, so $\mathbf{b}$ is a nowhere-zero vector in $\ker(A)$. Therefore, $G$ is $A$-robust. An identical argument shows that if $G$ has no isolated vertices, then there is some nowhere-zero vector $\mathbf{a}$ in the column space of $A$ with $B\mathbf{a}=\mathbf{0}$. This implies that $G$ is $B$-robust. ∎
Lemma 3.4.
Let be a graph. If there exist matrices , and a nowhere-zero vector such that and is nowhere-zero, then is -robust.
Proof.
Assume there exist matrices and vectors as in the hypothesis of the lemma. Let , which is nowhere-zero by assumption. Define
Observe that . Furthermore, since and . If , then is nowhere-zero and
Thus, is -robust.∎
Corollary 3.5.
If is complementary vanishing and does not have a dominating vertex, then is -robust.
3.2 Completing the Proof of Theorem 1.5
We first recall the statement of Theorem 1.5. Let $\mathcal{C}_0$ denote the set of graphs $G$ such that $G$ is complementary vanishing, $G$ is connected, and $\overline{G}$ is connected. One can check that $K_1\in\mathcal{C}_0$. Let $\mathcal{C}$ denote the smallest set of graphs which contains $\mathcal{C}_0$ and which is closed under taking disjoint unions, joins, and complements. For example, having $K_1\in\mathcal{C}_0$ implies that $\mathcal{C}$ contains every complete multipartite graph and every threshold graph. Let $\mathcal{D}$ denote the set of all graphs $G$ such that either in $G$ or in $\overline{G}$ there exist distinct vertices $u,v,w,x$ with $N(u)=\{v\}$, $w\in N(v)$, and $x\notin N[v]$. Notice that graphs in $\mathcal{D}$ are not complementary vanishing by Corollary 2.5 (note that $G$ is complementary vanishing if and only if $\overline{G}$ is). Our aim is to prove the following.
Theorem 1.5.
A graph $G$ is complementary vanishing if and only if $G\in\mathcal{C}\setminus\mathcal{D}$.
Part of Theorem 1.5 can be proven immediately.
Proposition 3.6.
If $G\notin\mathcal{C}$, then $G$ is not complementary vanishing.
Proof.
We will prove the proposition by minimal counterexample. Suppose that $G$ is complementary vanishing and a vertex-minimal graph not in $\mathcal{C}$. This implies that either $G$ or $\overline{G}$ is not connected, as otherwise $G\in\mathcal{C}_0\subseteq\mathcal{C}$. This implies that either $G=G_1\sqcup G_2$ or $G=G_1\vee G_2$, where $G_1,G_2$ are non-empty complementary vanishing graphs by Proposition 3.1. Since $G$ is assumed to be a vertex-minimal counterexample, it follows that $G_1,G_2\in\mathcal{C}$. However, this implies that $G\in\mathcal{C}$, which is a contradiction. ∎
In order to prove Theorem 1.5, we need to keep track of the graphs in $\mathcal{C}$ that are close to falling within $\mathcal{D}$. Let $\mathcal{B}$ denote the set of graphs that contain a leaf which is adjacent to a vertex of degree at least $2$. For example, stars on at least three vertices are in $\mathcal{B}$, as is a triangle with a pendant vertex (notice that both of these graphs are in $\mathcal{C}$). The main observation regarding $\mathcal{B}$ is the following.
Lemma 3.7.
If $H\in\mathcal{B}$ and $G$ is a graph on at least one vertex, then $H\sqcup G\in\mathcal{D}$. Furthermore, $H\sqcup G$ is not complementary vanishing.
Proof.
By definition of $\mathcal{B}$, the graph $H$ contains a leaf $u$ adjacent to a vertex $v$ such that there exists $w\in N_H(v)\setminus\{u\}$. Notice that $N_{H\sqcup G}(u)=\{v\}$. Since $G$ is not the empty graph, there exists a vertex $x$ in $G$ such that $x\notin N_{H\sqcup G}[v]$. In particular, $H\sqcup G\in\mathcal{D}$, and by Corollary 2.5, $H\sqcup G$ is not complementary vanishing. ∎
To prove Theorem 1.5, we prove the following stronger version to aid with an inductive proof.
Theorem 3.8.
Let $G\in\mathcal{C}$.
(1) If $G\in\mathcal{D}$, then $G$ is not complementary vanishing.
(2) If $G\notin\mathcal{D}$ and $G\in\mathcal{B}$, then $G$ is $B$-robust but not $A$-robust.
(3) If $G\notin\mathcal{D}$ and $\overline{G}\in\mathcal{B}$, then $G$ is $A$-robust but not $B$-robust.
(4) If $G\notin\mathcal{D}$ and $G,\overline{G}\notin\mathcal{B}$, then $G$ is both $A$-robust and $B$-robust.
Proof.
Statement (1): This follows immediately from Corollary 2.5.
Statement (2): Suppose that . Since , it follows that is not complementary vanishing by Lemma 3.7. By Proposition 3.1 and the fact that is -robust, it follows that is not -robust.
For the sake of contradiction, suppose that is disconnected. This implies that , and without loss of generality we can assume contains a leaf with a neighbor whose degree is at least since In particular, , and therefore, by Lemma 3.7. This is a contradiction. Thus, we can assume that is connected.
For the sake of contradiction, suppose that is a counterexample to the statement. If , then is complementary vanishing by definition, and it is -robust by Lemma 3.3 since is connected (and since because ). Therefore, . Thus, there exists non-empty such that or , and we must have since is connected. However, must have a leaf, which is only possible if . In this case, does not have a vertex of degree at least , which is a contradiction to .
Statement (3): Suppose and . Since is closed under complements, it follows that . By statement (2), we see that is -robust but not -robust. Thus, is -robust but not -robust.
Statement (4): Suppose and is a vertex minimal counterexample. It is easy to see that is both and -robust, so this is not a counterexample. Observe that graphs in are complementary vanishing with no isolated vertices or dominating vertices. Therefore, graphs in are both and -robust by Lemma 3.3. Thus we can assume that . In particular, or for some non-empty . Since is closed under complements, we can assume that without loss of generality.
Since , it follows from Lemma 3.7 that neither nor is in . If , then is -robust by statement (3), and if then is -robust since is a vertex minimal counterexample to statement (4). Similarly we conclude that is -robust, and hence by Proposition 3.1 we see that is complementary vanishing and -robust.
If does not have an isolated vertex, then we are done by Lemma 3.3. Therefore, we can assume that . Notice that if , then it is easy to show that is not a counterexample. Furthermore, if does not have a dominating vertex, then is -robust by Corollary 3.5. Thus, we can assume contains at least two vertices and a dominating vertex. However, this implies has a leaf which is adjacent to a vertex of degree at least two. This is a contradiction, since we assume . ∎
We can now prove our main result.
Proof of Theorem 1.5.
Suppose that $G\notin\mathcal{C}\setminus\mathcal{D}$. If $G\in\mathcal{D}$, then $G$ is not complementary vanishing by Corollary 2.5. If $G\notin\mathcal{C}$, then $G$ is not complementary vanishing by Proposition 3.6.
Suppose that $G\in\mathcal{C}\setminus\mathcal{D}$. There are three cases which exhaust all possibilities: either $G\in\mathcal{B}$, $\overline{G}\in\mathcal{B}$, or neither $G$ nor $\overline{G}$ is in $\mathcal{B}$. In any case, $G$ is complementary vanishing by Theorem 3.8. ∎
4 Algorithmic approaches and the proof of Theorem 1.8
In this section we introduce two algorithmic methods. The first uses Gröbner basis arguments to conclude that a graph is not complementary vanishing. The second is an algorithm that takes a matrix $A\in\mathcal{S}(G)$ and checks for the existence of a matrix $B\in\mathcal{S}(\overline{G})$ such that $AB=O$. Using this second algorithm together with a randomly selected $A\in\mathcal{S}(G)$ (or $B\in\mathcal{S}(\overline{G})$) will allow us to find certificates for a graph being complementary vanishing.
4.1 Gröbner basis
Let $G$ be a graph on $n$ vertices. Define $\mathcal{A}$ to be the symmetric variable matrix whose $ij$-entry is a variable $a_{ij}$ if $ij$ is an edge of $G$ or $i=j$, and otherwise we set the entry to be $0$. Similarly, we define a variable matrix $\mathcal{B}$ with variables $b_{ij}$ for the edges of $\overline{G}$ and its diagonal. Thus, $\mathcal{A}\mathcal{B}=O$ is equivalent to a system of polynomial equations $p_{ij}=0$ for all $i$ and $j$, where
$$p_{ij}=\sum_{k}a_{ik}b_{kj}.$$
With this in mind, we let $I$ be the ideal generated by all the $p_{ij}$ in the polynomial ring over $\mathbb{R}$ (for computational purposes, we sometimes consider the polynomial ring over $\mathbb{Q}$). Recall that a Gröbner basis of an ideal is a particularly nice generating set for the ideal; see [9] for background on this. The only fact about Gröbner bases that we need is that there exist computer algebra systems, such as SageMath, that can be used to calculate a Gröbner basis $\mathcal{G}$ of $I$, so that $I=\langle\mathcal{G}\rangle$.
Suppose that $I$ contains a monomial $m$ in the variables corresponding to edges of $G$ or of $\overline{G}$ (for instance, $1\in I$ after some of these variables have been normalized to $1$, as in the reductions below). Then we know a combination of the $p_{ij}$ generates $m$, which is impossible when we assume $p_{ij}=0$ for all $i$ and $j$: evaluating at matrices $A\in\mathcal{S}(G)$ and $B\in\mathcal{S}(\overline{G})$ with $AB=O$ would make $m$ vanish, even though every edge variable is nonzero at such matrices. Therefore, if $I$ contains such a monomial, then $G$ is not complementary vanishing. More formally, this argument implies the following.
Proposition 4.1 (Gröbner basis argument).
Let $G$ be a graph on $n$ vertices, and let $\mathcal{A}$ and $\mathcal{B}$ be the corresponding variable matrices. If the Gröbner basis of $I$ contains a monomial in the variables that are required to be nonzero (in particular, if it contains $1$), then $G$ is not complementary vanishing.
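The following sketch illustrates the test of Proposition 4.1 using SymPy rather than the SageMath setup used for the computations in the paper; the helper name groebner_certificate and the membership check via the product of the edge variables are our own choices, and the check is a sufficient (not necessary) condition for a graph to fail to be complementary vanishing.

```python
# Sketch (ours, assumptions noted above): build the ideal I generated by the
# entries of A*B for the variable matrices of G and its complement, compute a
# Groebner basis, and test whether the product of all variables that must be
# nonzero lies in I.  If it does, no pair A in S(G), B in S(G-bar) can
# satisfy A*B = 0, so G is not complementary vanishing.
import sympy as sp
from itertools import combinations

def groebner_certificate(n, edges):
    edges = {frozenset(e) for e in edges}
    A, B = sp.zeros(n, n), sp.zeros(n, n)
    nonzero_vars = []
    for i in range(n):
        A[i, i] = sp.Symbol(f'a_{i}_{i}')
        B[i, i] = sp.Symbol(f'b_{i}_{i}')
    for i, j in combinations(range(n), 2):
        if frozenset((i, j)) in edges:          # edge of G: entry of A is nonzero
            A[i, j] = A[j, i] = sp.Symbol(f'a_{i}_{j}')
            nonzero_vars.append(A[i, j])
        else:                                   # edge of complement: entry of B is nonzero
            B[i, j] = B[j, i] = sp.Symbol(f'b_{i}_{j}')
            nonzero_vars.append(B[i, j])
    P = A * B
    gens = sorted(P.free_symbols, key=str)
    polys = [p for p in P if p != 0]
    gb = sp.groebner(polys, *gens, order='grevlex')
    # remainder 0 means the product of the nonzero variables lies in I,
    # which is incompatible with all of them being nonzero
    _, rem = sp.reduced(sp.prod(nonzero_vars), list(gb.exprs), *gens, order='grevlex')
    return rem == 0   # True: G is certified not complementary vanishing

# Example: P_4, the path on four vertices, which is not complementary vanishing.
print(groebner_certificate(4, [(0, 1), (1, 2), (2, 3)]))   # True (may take a moment)
```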
The calculation of a Gröbner basis can be expensive. There are a few ways to reduce the number of variables:
1. By Lemmas 2.2, 2.3, and 2.4, certain diagonal entries of $\mathcal{A}$ and $\mathcal{B}$ are forced to be zero, so the corresponding variables can be removed (while other diagonal entries are guaranteed to be nonzero).
2. By replacing $A$ with $\frac{1}{c}A$ for an appropriate nonzero scalar $c$, we may assume one of the nonzero variables in $\mathcal{A}$ is $1$. (This variable can be an off-diagonal entry, or a diagonal entry that is guaranteed to be nonzero by one of our lemmas.) The same argument applies simultaneously to the matrix $\mathcal{B}$.
3. Pick a spanning forest $F$ of $G$. By replacing $A$ with $DAD$ and $B$ with $D^{-1}BD^{-1}$ for an appropriate invertible diagonal matrix $D$, we may assume the entries of $\mathcal{A}$ corresponding to the edges of $F$, along with some (arbitrary) nonzero diagonal entry, are $1$. We may switch the roles of $\mathcal{A}$ and $\mathcal{B}$, but we cannot apply this to $\mathcal{A}$ and $\mathcal{B}$ simultaneously.
Note that this last technique relies on the choice of the spanning forest, and that the Gröbner basis depends on the choice of the monomial order, which often depends on the order of the variables. In most of the cases we have tried, we used the degree reverse lexicographic order on the monomials (which is the default setting in SageMath), with the off-diagonal variables preferred over the diagonal variables. We use this method to show that the graphs in Figure 2 are not complementary vanishing.
Example 4.2.
Let and be the graphs in Figure 2. Assume and satisfy . According to Lemmas 2.3 and 2.4, we may assume
Moreover, we may pick a spanning tree of using the edges
By replacing with and with for some appropriate diagonal matrix , we may assume the entries in corresponding to along with are . Moreover, by replacing with for some nonzero , we may further assume . In conclusion, we have
and
By treating the remaining and as variables, the polynomial entries of generate the ideal .
We then find the Gröbner basis of under the degree reverse lexicographic order, which requires an ordering of the variables. Order the variables by
(and order the off-diagonal entries by lexicographic order). This means that the algorithm prioritizes elimination of the variables listed first, and so on. With these settings, the Gröbner basis of $I$ is $\{1\}$. This means that some combination of the polynomials in $I$ generates $1$, which is impossible if they are all zero. Therefore, the graph is not complementary vanishing.
Remark 4.3.
The same technique applies to and in Figure 2, so they are not complementary vanishing. For , one may choose a spanning tree in instead. In this case the Gröbner basis will contain a polynomial of a single nonzero variable, e.g., . Therefore, it again shows is not complementary vanishing.
4.2 Solving for $B$
Let $G$ be a graph and $A\in\mathcal{S}(G)$. We present an algorithm for checking if there is a matrix $B\in\mathcal{S}(\overline{G})$ such that $AB=O$.
Let $G$ be a graph with $n$ vertices and $m$ edges. We define $\operatorname{cl}(\mathcal{S}(G))$ as the topological closure of $\mathcal{S}(G)$; that is, $\operatorname{cl}(\mathcal{S}(G))$ consists of all real symmetric matrices whose $ij$-entry can be nonzero only when $ij$ is an edge or $i=j$. Thus, $\operatorname{cl}(\mathcal{S}(G))$ is a vector subspace of dimension $m+n$ in the space of real symmetric matrices of order $n$.
When $A\in\mathcal{S}(G)$ is given, define
$$\mathcal{K}(A)=\{X\in\operatorname{cl}(\mathcal{S}(\overline{G})):AX=O\},$$
which is a vector space. Our task is to determine whether the vector space $\mathcal{K}(A)$ contains a matrix $B\in\mathcal{S}(\overline{G})$. To do so, we need to check the existence of a matrix in $\mathcal{K}(A)$ that is nonzero on the edges of $\overline{G}$.
Let $X_1,\ldots,X_d$ be a basis of $\mathcal{K}(A)$. Then every matrix in $\mathcal{K}(A)$ is zero at a given entry if and only if every matrix in $\{X_1,\ldots,X_d\}$ is zero at that entry. Moreover, when $F$ is a subset of entries, $\mathcal{K}(A)$ has a matrix that is nowhere-zero on $F$ if and only if for each entry in $F$, there is a matrix in $\{X_1,\ldots,X_d\}$ that is not zero at that entry (a generic linear combination of the basis then works). With these observations, whether there is a matrix $B\in\mathcal{S}(\overline{G})$ with $AB=O$ can be determined through standard linear algebra techniques. This idea is summarized in Proposition 4.4.
Proposition 4.4.
Let $G$ be a graph, $A\in\mathcal{S}(G)$, and let $\mathcal{K}(A)$ be as defined above with a basis $X_1,\ldots,X_d$. Then a matrix $B\in\mathcal{S}(\overline{G})$ with $AB=O$ exists if and only if for every edge $ij$ of $\overline{G}$, there is a matrix $X_k$ whose $ij$-entry is nonzero. Moreover, if the latter condition holds, then a random linear combination of $X_1,\ldots,X_d$ with each coefficient drawn independently from a Gaussian distribution gives such a matrix $B$ with probability $1$.
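A minimal implementation of this test is sketched below (our own code, not the authors'; the function name and tolerance are assumptions). It parametrizes the closure of $\mathcal{S}(\overline{G})$, computes a basis of $\mathcal{K}(A)$ from the null space of the linear map $X\mapsto AX$, checks every edge of $\overline{G}$, and returns a random Gaussian combination.

```python
# Sketch (ours) of Proposition 4.4 with NumPy.  adj is the 0/1 adjacency
# matrix of G and A is a matrix in S(G); the function returns a matrix B
# with A @ B = 0 whose support lies in the complement (or None if no such
# B exists in S(G-bar)).
import numpy as np

def random_complementary_certificate(adj, A, tol=1e-9, seed=0):
    n = adj.shape[0]
    comp = 1 - adj - np.eye(n, dtype=int)          # adjacency of the complement
    # free positions of B: the diagonal and the edges of the complement
    positions = [(i, i) for i in range(n)] + \
                [(i, j) for i in range(n) for j in range(i + 1, n) if comp[i, j]]
    coords = []
    for (i, j) in positions:                        # symmetric coordinate matrices
        E = np.zeros((n, n))
        E[i, j] = E[j, i] = 1.0
        coords.append(E)
    # the linear map t -> A @ B(t), flattened; its null space parametrizes K(A)
    M = np.column_stack([(A @ E).ravel() for E in coords])
    _, s, Vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    null_basis = Vt[rank:]                          # rows span the null space
    # Proposition 4.4: every complement edge must be hit by some basis element
    for k, (i, j) in enumerate(positions):
        if i != j and not np.any(np.abs(null_basis[:, k]) > tol):
            return None
    rng = np.random.default_rng(seed)               # random combination works a.s.
    t = rng.standard_normal(null_basis.shape[0]) @ null_basis
    return sum(t[k] * coords[k] for k in range(len(positions)))

# usage: the star K_{1,3} with A its adjacency matrix
adj = np.array([[0, 1, 1, 1], [1, 0, 0, 0], [1, 0, 0, 0], [1, 0, 0, 0]])
A = adj.astype(float)
B = random_complementary_certificate(adj, A)
print(B is not None and np.allclose(A @ B, 0))      # True
```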
4.3 Proof of Theorem 1.8
We summarize the necessary and sufficient conditions for a graph to be complementary vanishing that we use in our computations:
• A graph is certified not complementary vanishing by the diagonal conditions of Corollary 2.5 (diag), by the odd cycle condition of Lemma 2.6 (oc), or by a Gröbner basis argument as in Proposition 4.1 (grob).
• A graph is certified complementary vanishing if every vertex has a twin (Proposition 2.8, twin), if it is obtained by duplicating vertices as in Lemma 2.9 (dup), or if an explicit certificate pair $(A,B)$ is found by random trials using Proposition 4.4 (certificate found).
Table 2: The number of pairs $\{G,\overline{G}\}$ of connected graphs on $n$ vertices resolved by each technique (diag, oc, and grob certify not complementary vanishing; twin, dup, and certificate found certify complementary vanishing).

$n$ | diag | oc | grob | twin | dup | certificate found | total
1 | | | | | | 1 | 1
2 | | | | | | | 0
3 | | | | | | | 0
4 | 1 | | | | | | 1
5 | 5 | | | | | | 5
6 | 34 | | | | | | 34
7 | 326 | 1 | | | | 4 | 331
8 | 4905 | 8 | 4 | 6 | 14 | 12 | 4949
It turns out that these techniques are enough for us to classify all pairs $\{G,\overline{G}\}$ with $G$ and $\overline{G}$ connected on 8 or fewer vertices. The results are shown in Table 2. Therefore, we have the following lemma, which one can check by computer is equivalent to Theorem 1.8.
Lemma 4.5.
We note that for one of the pairs in Appendix A, we were unable to find a certificate using random trials; the only construction we know of for this graph was found manually.
We note that, in principle, one could try to apply these techniques to determine which pairs of graphs on 9 vertices are complementary vanishing. However, applying the diagonal conditions to graphs on 9 vertices leaves too many possibilities for random trials to effectively resolve.
5 Conclusion
In this paper we began the study of complementary vanishing graphs. In view of Theorem 1.5, we have essentially reduced the problem to graphs $G$ such that both $G$ and $\overline{G}$ are connected. In Theorem 1.8 we completely solved this problem for graphs on at most 8 vertices.
As things currently stand, there does not seem to be a simple description which categorizes graphs that are complementary vanishing. However, there are certain classes of graphs where one might be able to determine the answer. For example, it is natural to ask what happens for random graphs.
Question 5.1.
If $G$ is chosen uniformly at random amongst all $n$-vertex graphs, does the probability that $G$ is complementary vanishing tend to 1 as $n$ tends to infinity?
If the answer to this question is “yes”, then it is likely that this will be difficult to prove. Indeed, this would imply that the random graph satisfies the Graph Complement Conjecture asymptotically almost surely, which is an open problem.
Recall that there are four connected graphs on at most 8 vertices whose complements are also connected and which were shown not to be complementary vanishing using a Gröbner basis argument. It would be nice if one could prove that these four graphs are not complementary vanishing by utilizing combinatorial techniques.
Question 5.2.
Can one use a combinatorial argument to prove that the graphs in Figure 2 are not complementary vanishing?
Acknowledgments. The authors would like to express their sincere gratitude to the organizers of the Mathematics Research Community on Finding Needles in Haystacks: Approaches to Inverse Problems Using Combinatorics and Linear Algebra, and the AMS for funding the program through NSF grant 1916439. J. C.-H. Lin was supported by the Young Scholar Fellowship Program (grant no. MOST-110-2628-M-110-003) from the Ministry of Science and Technology of Taiwan. S. Spiro was supported by the National Science Foundation Graduate Research Fellowship under Grant No. DGE-1650112.
References
- [1] B. Ahmadi, F. Alinaghipour, M. S. Cavers, S. Fallat, K. Meagher, and S. Nasserasr. Minimum number of distinct eigenvalues of graphs. Electron. J. Linear Algebra, 26:673–691, 2013.
- [2] AIM Minimum Rank-Special Graphs Work Group. Zero forcing sets and the minimum rank of graphs. Linear Algebra Appl., 428(7):1628–1648, 2008.
- [3] American Institute of Mathematics Workshop. Spectra of families of matrices described by graphs, digraphs, and sign patterns. http://aimath.org/pastworkshops/matrixspectrum.html, 2006.
- [4] F. Barioli, W. Barrett, S. M. Fallat, H. T. Hall, L. Hogben, and H. van der Holst. On the graph complement conjecture for minimum rank. Linear Algebra Appl., 436(12):4373–4391, 2012.
- [5] F. Barioli and S. M. Fallat. On two conjectures regarding an inverse eigenvalue problem for acyclic symmetric matrices. Electron. J. Linear Algebra, 11:41–50, 2004.
- [6] W. Barrett, S. Butler, S. M. Fallat, H. T. Hall, L. Hogben, J. C.-H. Lin, B. L. Shader, and M. Young. The inverse eigenvalue problem of a graph: multiplicities and minors. J. Combin. Theory Ser. B, 142:276–306, 2020.
- [7] W. Barrett, S. Fallat, H. T. Hall, L. Hogben, J. C.-H. Lin, and B. L. Shader. Generalizations of the strong Arnold property and the minimum number of distinct eigenvalues of a graph. Electron. J. Combin., 24(2):Paper No. 2.40, 28, 2017.
- [8] Y. Colin de Verdière. On a new graph invariant and a criterion for planarity. In Graph structure theory (Seattle, WA, 1991), volume 147 of Contemp. Math., pages 137–147. Amer. Math. Soc., Providence, RI, 1993.
- [9] D. A. Cox, J. Little, and D. O’Shea. Ideals, varieties, and algorithms: An introduction to computational algebraic geometry and commutative algebra. Springer, fourth edition, 2015.
- [10] J. Ekstrand, C. Erickson, H. T. Hall, D. Hay, L. Hogben, R. Johnson, N. Kingsley, S. Osborne, T. Peters, J. Roat, A. Ross, D. D. Row, N. Warnberg, and M. Young. Positive semidefinite zero forcing. Linear Algebra Appl., 439(7):1862–1874, 2013.
- [11] S. M. Fallat and L. Hogben. The minimum rank of symmetric matrices described by a graph: a survey. Linear Algebra Appl., 426(2-3):558–582, 2007.
- [12] F. R. Gantmacher and M. G. Krein. Oszillationsmatrizen, Oszillationskerne und kleine Schwingungen mechanischer Systeme. Translated by Alfred Stohr, Berlin, Akademie-Verlag, 1960.
- [13] G. Joret and D. R. Wood. Nordhaus-Gaddum for treewidth. European J. Combin., 33(4):488–490, 2012.
- [14] A. Kotlov, L. Lovász, and S. Vempala. The Colin de Verdière number and sphere representations of a graph. Combinatorica, 17(4):483–521, 1997.
- [15] R. H. Levene, P. Oblak, and H. Šmigoc. A Nordhaus-Gaddum conjecture for the minimum number of distinct eigenvalues of a graph. Linear Algebra Appl., 564:236–263, 2019.
- [16] X. Li, M. Nathanson, and R. Phillips. Minimum vector rank and complement critical graphs. Electron. J. Linear Algebra, 27:100–123, 2014.
Appendix A