Let $G = (V, E)$ be a graph with vertex set $V = \{v_1, \ldots, v_n\}$ and edge set $E = \{e_1, \ldots, e_m\}$. We shall assign an orientation to every edge and call this set of oriented edges $\vec{E}$. The orientation can be arbitrary.
The incidence matrix $M = (M_{v,e})_{v \in V, e \in \vec{E}}$ is a matrix with entries in $\{0, \pm 1\}$, where for an edge $e = (u, w)$ oriented from $u$ to $w$,
$$M_{v,e} = \begin{cases} +1 & \text{if } v = w, \\ -1 & \text{if } v = u, \\ 0 & \text{otherwise.} \end{cases}$$
We set $M$ as a $V \times \vec{E}$-matrix, i.e., an $n \times m$ matrix with rows indexed by vertices and columns by oriented edges.
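For concreteness, here is a minimal sketch (in Python with numpy; the helper `incidence_matrix` and the example graph are ours, not part of the notes) that builds the oriented incidence matrix under the sign convention above:

```python
import numpy as np

def incidence_matrix(n, edges):
    """Oriented incidence matrix: for edge j = (u, w) oriented u -> w,
    column j has -1 at the tail u and +1 at the head w."""
    M = np.zeros((n, len(edges)), dtype=int)
    for j, (u, w) in enumerate(edges):
        M[u, j] = -1   # tail
        M[w, j] = +1   # head
    return M

# vertices 0..3; a 4-cycle plus one chord, orientations chosen arbitrarily
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(incidence_matrix(4, edges))
```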
We consider the matrices as real matrices, and our matrix algebra will be with respect to $\mathbb{R}$. One can consider any other field instead of $\mathbb{R}$ as well.
Further set $C_0 := \left\{\sum_{v \in V} a_v v : a_v \in \mathbb{R}\right\}$ and $C_1 := \left\{\sum_{e \in \vec{E}} a_e e : a_e \in \mathbb{R}\right\}$, where the sums are to be considered as formal sums. Observe that $C_0, C_1$ are $\mathbb{R}$-vector spaces and note that $\dim(C_0) = n$ and $\dim(C_1) = m$. The canonical basis for $C_0$ is $\{v_1, \ldots, v_n\}$ and for $C_1$ is $\{e_1, \ldots, e_m\}$. Here, we think of $v_1 = 1 \cdot v_1 + 0 \cdot v_2 + \cdots + 0 \cdot v_n$ and similarly for the other $v_i$'s and $e_j$'s.
We shall view $M$ as a linear map $\partial : C_1 \to C_0$ in the following sense:
Suppose we represent $f = \sum_{e \in \vec{E}} a_e e \in C_1$ by its coefficient vector $a = (a_{e_1}, \ldots, a_{e_m})^T$; then $\partial f := Ma$. Further, if the columns of the matrix $M$ are $M_1, \ldots, M_m$, then $\partial f = \sum_{j=1}^{m} a_{e_j} M_j$.
Assume that $e = (u, w)$. We set $\partial e := w - u \in C_0$.
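To see that the two descriptions of $\partial$ agree, the following sketch (Python/numpy; the example graph and indices are ours) checks that $\partial f = Ma$ is the corresponding combination of columns, and that the column of an edge $(u, w)$ is the coordinate vector of $w - u$:

```python
import numpy as np

n, edges = 4, [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
M = np.zeros((n, len(edges)), dtype=int)
for j, (u, w) in enumerate(edges):
    M[u, j], M[w, j] = -1, +1

# f = 2*e_1 - e_5 in the notation of the notes (columns 0 and 4 here)
a = np.array([2, 0, 0, 0, -1])
# boundary as a matrix-vector product equals the combination of columns
assert np.array_equal(M @ a, 2 * M[:, 0] - M[:, 4])
# the column of edge e = (u, w) is the coordinate vector of w - u in C_0
u, w = edges[0]
basis = np.eye(n, dtype=int)
assert np.array_equal(M[:, 0], basis[w] - basis[u])
```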
The above abstraction of representing elements of $C_0, C_1$ as formal sums seems a little unnecessary, as we could have also represented them as functions from $V \to \mathbb{R}$ and $\vec{E} \to \mathbb{R}$ respectively. However, this representation will be notationally convenient and is borrowed from algebraic topology, where one considers richer structures giving rise to further vector spaces such as $C_2, C_3, \ldots$. Further, if we choose $\mathbb{Z}$-coefficients instead of $\mathbb{R}$-coefficients, $C_0, C_1$ are modules and not vector spaces. Thus, the above representation allows us to carry over similar algebraic operations even in such a case. As an exercise, the algebraically inclined students can try to repeat the proofs with $\mathbb{Z}$-coefficients. See [Edelsbrunner 2010] for an accessible introduction to algebraic topology and [Munkres 2018] for more details.
Trivially, we have that $\partial : C_1 \to C_0$ is a linear map between finite-dimensional vector spaces. Thus by the rank-nullity theorem, we have that $m = \mathrm{rk}(\partial) + \dim(\ker(\partial))$, where by $\mathrm{rk}$ we denote the rank of a matrix/dimension of a vector space. Now define the linear map $\sigma : C_0 \to \mathbb{R}$ by setting $\sigma(v) = 1$ for all $v \in V$ and extending linearly. It is easy to see that $v_2 - v_1, \ldots, v_n - v_1 \in \ker(\sigma)$. Further, we have that $v_2 - v_1, \ldots, v_n - v_1$ are linearly independent and hence form a basis for $\ker(\sigma)$, so that $\dim(\ker(\sigma)) = n - 1$. Here linear independence means that if $\sum_{i=2}^{n} a_i (v_i - v_1) = 0$ then $a_i = 0$ for all $i$. Observe that for $f = \sum_{e \in \vec{E}} a_e e \in C_1$,
$$\sigma(\partial f) = \sigma\Big(\sum_{e \in \vec{E}} a_e \, \partial e\Big) = \sum_{e \in \vec{E}} a_e \, \sigma(\partial e) = 0,$$
where in the first and third equalities we have used the definitions of $\partial$ and $\sigma$ respectively (for $e = (u, w)$, $\sigma(\partial e) = \sigma(w) - \sigma(u) = 1 - 1 = 0$), and in the second equality, the linearity of $\sigma$. Thus, we have that $\sigma \circ \partial = 0$, in other words $\mathrm{Im}(\partial) \subseteq \ker(\sigma)$ and hence $\mathrm{rk}(\partial) \le \dim(\ker(\sigma)) = n - 1$. The same can also be deduced by noting that $\mathbf{1}^T M = 0$, as this matrix product is nothing but the sum of the rows of $M$, and hence $\mathrm{rk}(M) \le n - 1$.
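Both observations are easy to check numerically. In the sketch below (Python/numpy, the same example graph as before, our own naming), the all-ones row vector annihilates $M$, and the rank is exactly $n - 1$ since this example graph is connected:

```python
import numpy as np

n, edges = 4, [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
M = np.zeros((n, len(edges)), dtype=int)
for j, (u, w) in enumerate(edges):
    M[u, j], M[w, j] = -1, +1

# every column holds one +1 and one -1, so the rows sum to zero ...
assert not np.any(np.ones(n, dtype=int) @ M)
# ... hence the rank is at most n - 1 (here exactly 3, as the graph is connected)
print(np.linalg.matrix_rank(M))   # 3
```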
Now, if we can understand $\mathrm{rk}(\partial)$, then we can understand all the ranks involved. Observe that $\mathrm{Im}(\partial)$ is nothing but the column space of $M$, i.e., denoting the columns of the matrix $M$ by $M_1, \ldots, M_m$, we have that
$$\mathrm{Im}(\partial) = \mathrm{span}\{M_1, \ldots, M_m\}.$$
Hence $\mathrm{rk}(\partial)$ is nothing but the maximum number of linearly independent column vectors of $M$.
Suppose $e_{i_1}, \ldots, e_{i_k}$ denote the edges corresponding to the linearly independent columns $M_{i_1}, \ldots, M_{i_k}$. Assume that the subgraph $H$ spanned by these edges is connected. Let $n' = |V(H)|$. Consider the incidence matrix restricted to $V(H) \times \{e_{i_1}, \ldots, e_{i_k}\}$. Call it $M'$. Since the non-trivial entries of the $M_{i_j}$'s are on $V(H)$, the columns of $M'$ are also linearly independent, and so the column rank of $M'$ is $k$; since $H$ is connected, $k \ge n' - 1$. Let $R_1, \ldots, R_{n'}$ be the rows of the matrix $M'$. Since every column has exactly one $+1$ and one $-1$, we have that $\sum_{i=1}^{n'} R_i = 0$, and thus the row rank of $M'$ is at most $n' - 1$. Since the row rank equals the column rank, we have that $k = n' - 1$, i.e., $H$ is a connected graph on $n'$ vertices with $n' - 1$ edges, i.e., $H$ is a tree.
Thus, we have proved that if $e_{i_1}, \ldots, e_{i_k}$ denote the edges corresponding to linearly independent columns, then every component of the subgraph $H$ spanned by these edges is a tree, i.e., $H$ is a forest or acyclic subgraph. We now show the converse and determine a basis for $\mathrm{Im}(\partial)$.
Let $M_{i_1}, \ldots, M_{i_k}$ be the columns corresponding to the edges $e_{i_1}, \ldots, e_{i_k}$ in $M$. Then the subgraph $H$ spanned by $e_{i_1}, \ldots, e_{i_k}$ is acyclic iff $M_{i_1}, \ldots, M_{i_k}$ are linearly independent.
We have shown the 'if' part. We will show the 'only if' part. Suppose that $H$ is acyclic. We will show that $M_{i_1}, \ldots, M_{i_k}$ are linearly independent by induction on $k$. The conclusion holds trivially for $k = 1$. Suppose it holds for all $k' < k$. Since $H$ is acyclic, it has a leaf; WLOG assume that $e_{i_k}$ is a leaf edge with leaf vertex $v$. Thus $M_{v, e_{i_j}} \ne 0$ for $j = k$ only. If $\sum_{j=1}^{k} a_j M_{i_j} = 0$, then comparing the $v$-th coordinates, since $M_{v, e_{i_j}} \ne 0$ for $j = k$ only, we have that $a_k = 0$. Thus $\sum_{j=1}^{k-1} a_j M_{i_j} = 0$, and since $e_{i_1}, \ldots, e_{i_{k-1}}$ span an acyclic subgraph, by the induction hypothesis we have that $M_{i_1}, \ldots, M_{i_{k-1}}$ are linearly independent, i.e., $a_1 = \cdots = a_{k-1} = 0$. ∎
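The proposition is easy to test numerically. In the sketch below (Python/numpy; the example graph is ours), three tree edges give linearly independent columns, while the three edges of a triangle do not:

```python
import numpy as np

n, edges = 4, [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
M = np.zeros((n, len(edges)), dtype=int)
for j, (u, w) in enumerate(edges):
    M[u, j], M[w, j] = -1, +1

tree = M[:, [0, 1, 2]]       # edges (0,1), (1,2), (2,3): acyclic
triangle = M[:, [0, 1, 4]]   # edges (0,1), (1,2), (0,2): a cycle
print(np.linalg.matrix_rank(tree))       # 3: columns independent
print(np.linalg.matrix_rank(triangle))   # 2: columns dependent
```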
From the previous proposition, the following theorem follows easily.
Let $F$ be a (maximal) spanning forest in $G$. Then
1. $\mathrm{rk}(\partial) = |E(F)| = n - c$, where $c$ is the number of connected components of $G$.
2. $G$ is connected iff $\mathrm{rk}(\partial) = n - 1$.
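As a quick illustration of the theorem (a Python/numpy sketch; the helper name and the example graphs are ours), the rank drops by one for each extra component:

```python
import numpy as np

def incidence_rank(n, edges):
    M = np.zeros((n, len(edges)))
    for j, (u, w) in enumerate(edges):
        M[u, j], M[w, j] = -1, +1
    return np.linalg.matrix_rank(M)

# spanning tree on 4 vertices: rank n - 1 = 3, so the graph is connected
print(incidence_rank(4, [(0, 1), (1, 2), (2, 3)]))   # 3
# two components (c = 2): rank n - c = 2 < n - 1
print(incidence_rank(4, [(0, 1), (2, 3)]))           # 2
```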
By the rank-nullity theorem, we have that $\dim(\ker(\partial)) = m - \mathrm{rk}(\partial) = m - |E(F)|$, which is nothing but the number of excess edges, i.e., edges added after forming a (maximal) spanning forest. Let $e_{i_1}, \ldots, e_{i_k}$ form a cycle $v_0, e_{i_1}, v_1, e_{i_2}, \ldots, e_{i_k}, v_k = v_0$, so that $e_{i_j}$ joins $v_{j-1}$ and $v_j$. Denoting the reversal of an edge $e$ by $\bar{e}$ and identifying $\bar{e}$ with $-e \in C_1$, set
$$\gamma := \sum_{j=1}^{k} e'_{i_j}, \qquad \text{where } e'_{i_j} := \begin{cases} e_{i_j} & \text{if } e_{i_j} = (v_{j-1}, v_j), \\ \bar{e}_{i_j} & \text{if } e_{i_j} = (v_j, v_{j-1}). \end{cases}$$
Thus, it is easy to verify that $\partial \gamma = 0$: since $\partial \bar{e} = -\partial e$, we have $\partial e'_{i_j} = v_j - v_{j-1}$ in either case, and so
$$\partial \gamma = \sum_{j=1}^{k} \partial e'_{i_j} = \sum_{j=1}^{k} (v_j - v_{j-1}) = v_k - v_0 = 0,$$
where the last equality is because $v_0, v_1, \ldots, v_k$ is a cycle, i.e., $v_k = v_0$. Hence $\gamma \in \ker(\partial)$. We can extend this further.
Let $F$ be a maximal spanning forest in $G$ and let $E \setminus E(F) = \{f_1, \ldots, f_\ell\}$, where $\ell = m - |E(F)|$. Let $\gamma_1, \ldots, \gamma_\ell$ be the (unique) cycles in $F + f_1, \ldots, F + f_\ell$ respectively. Then $\gamma_1, \ldots, \gamma_\ell$ form a basis for $\ker(\partial)$.
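The following sketch (Python/numpy, our running example graph) verifies both the single-cycle computation and the basis statement: the two cycles obtained by adding the non-forest edges to a spanning tree lie in $\ker(\partial)$ and span it:

```python
import numpy as np

n, edges = 4, [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
M = np.zeros((n, len(edges)))
for j, (u, w) in enumerate(edges):
    M[u, j], M[w, j] = -1, +1

# spanning tree: edges 0, 1, 2; non-tree edges 3 and 4 close one cycle each
g1 = np.array([1, 1, 1, 1, 0])    # cycle 0-1-2-3-0, all orientations agree
g2 = np.array([1, 1, 0, 0, -1])   # cycle 0-1-2-0, edge (0,2) traversed reversed
assert not np.any(M @ g1) and not np.any(M @ g2)   # both lie in ker(M)
ell = len(edges) - np.linalg.matrix_rank(M)        # dim of kernel = 2 here
assert np.linalg.matrix_rank(np.stack([g1, g2])) == ell == 2
```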
Another challenging exercise: In terms of matrices, the linear transformation $\partial$ represents right multiplication, i.e., we have that $\partial(f) = Mf$ for $f \in C_1$, written as a column vector. But we could have considered left multiplication, in which case we will have a linear transformation $\delta : C_0 \to C_1$ as follows: $\delta(g) := g^T M$ for $g \in C_0$, written as a row vector. Can you compute the ranks and characterize connectivity for $\delta$ as we did above with $\partial$?
See [Bapat 2010, Chapter 2] for more on incidence matrices.
We shall define the incidence matrix $\tilde{M}$ as follows: $\tilde{M}_{v,e} := |M_{v,e}|$. In other words, $\tilde{M}$ is the unoriented incidence matrix. The following result explains the importance of considering orientations in the incidence matrix.
Let $G$ be a connected graph. Then $\mathrm{rk}(\tilde{M}) = n - 1$ if $G$ is bipartite and $\mathrm{rk}(\tilde{M}) = n$ otherwise.
See [Bapat 2010, Lemma 2.17] for a proof.
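Assuming the sign conventions above, the dichotomy can be checked numerically (a Python/numpy sketch; the helper and the example graphs are ours): a $4$-cycle is bipartite and its unoriented incidence matrix has rank $n - 1$, while a triangle, containing an odd cycle, has rank $n$:

```python
import numpy as np

def unoriented_rank(n, edges):
    A = np.zeros((n, len(edges)))
    for j, (u, w) in enumerate(edges):
        A[u, j] = A[w, j] = 1     # |M_{v,e}|: both endpoints get a 1
    return np.linalg.matrix_rank(A)

print(unoriented_rank(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # 3 = n - 1 (bipartite)
print(unoriented_rank(3, [(0, 1), (1, 2), (0, 2)]))          # 3 = n (odd cycle)
```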