
Graph entropy
In information theory, the graph entropy is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused. This measure, first introduced by Körner in the 1970s, has since also proven itself useful in other settings, including combinatorics.


Definition
Let G = (V, E) be an undirected graph. The graph entropy of G, denoted H(G), is defined as

:H(G) = \min_{X,Y} I(X ; Y)

where X is chosen uniformly from V, Y ranges over independent sets of G, the joint distribution of X and Y is such that X\in Y with probability one, and I(X ; Y) is the mutual information of X and Y. G. Simonyi, "Perfect graphs and graph entropy. An updated survey," Perfect Graphs, John Wiley and Sons (2001), pp. 293–328, Definition 2.

That is, if we let \mathcal{I} denote the independent vertex sets in G, we wish to find the joint distribution X,Y on V \times \mathcal{I} with the lowest mutual information such that (i) the marginal distribution of the first term is uniform and (ii) in samples from the distribution, the second term contains the first term almost surely. The mutual information of X and Y is then called the entropy of G.
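For small graphs, this minimization can be approximated numerically. The sketch below is illustrative and not part of the original text: it enumerates the independent sets and applies an alternating-minimization scheme in the style of Blahut–Arimoto, repeatedly updating the marginal of Y and the conditional distribution of Y given X (which, for fixed marginal, is proportional to that marginal restricted to the independent sets containing X).

```python
from itertools import combinations
from math import log2

def independent_sets(n, edges):
    """Enumerate every nonempty independent vertex set of G = (range(n), edges)."""
    def adjacent(u, v):
        return (u, v) in edges or (v, u) in edges
    return [set(s)
            for k in range(1, n + 1)
            for s in combinations(range(n), k)
            if not any(adjacent(u, v) for u, v in combinations(s, 2))]

def graph_entropy(n, edges, iters=2000):
    """Approximate H(G) = min I(X;Y), with X uniform on the vertices and Y an
    independent set containing X, by alternating minimization."""
    sets = independent_sets(n, edges)
    m = len(sets)
    # q[x][i] = P(Y = sets[i] | X = x), supported only on sets containing x;
    # start uniform on that support (every x lies in its own singleton set).
    q = [[1.0 / sum(x in s for s in sets) if x in sets[i] else 0.0
          for i in range(m)] for x in range(n)]
    for _ in range(iters):
        # Marginal of Y induced by uniform X.
        p = [sum(q[x][i] for x in range(n)) / n for i in range(m)]
        # Optimal conditional for this marginal: proportional to p on the support.
        for x in range(n):
            w = [p[i] if x in sets[i] else 0.0 for i in range(m)]
            z = sum(w)
            q[x] = [wi / z for wi in w]
    p = [sum(q[x][i] for x in range(n)) / n for i in range(m)]
    # Mutual information I(X;Y) in bits.
    return sum(q[x][i] * log2(q[x][i] / p[i])
               for x in range(n) for i in range(m) if q[x][i] > 0) / n
```

For the complete graph on 3 vertices every independent set is a singleton, so Y determines X and the minimum is log2 3 ≈ 1.585, in agreement with the closed-form entropies listed under Properties.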


Properties
  • Monotonicity. If G_1 is a subgraph of G_2 on the same vertex set, then H(G_1) \leq H(G_2).
  • Subadditivity. Given two graphs G_1 = (V, E_1) and G_2 = (V, E_2) on the same set of vertices, the graph union G_1 \cup G_2 = (V, E_1 \cup E_2) satisfies H(G_1 \cup G_2) \leq H(G_1) + H(G_2).
  • Arithmetic mean of disjoint unions. Let G_1, G_2, \cdots, G_k be a sequence of mutually vertex-disjoint graphs, with n_1, n_2, \cdots, n_k vertices, respectively. Then H(G_1 \cup G_2 \cup \cdots \cup G_k) = \tfrac{1}{\sum_{i=1}^{k}n_i}\sum_{i=1}^{k}{n_i H(G_i)}.
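The disjoint-union rule can be checked arithmetically using the closed-form entropies of complete graphs (the specific graphs here are an illustrative choice, not from the original text): for the disjoint union of K_2 and K_4, the entropy is the vertex-weighted average of the parts' entropies.

```python
from math import log2

# Disjoint union of K_2 and K_4: H(K_n) = log2(n), and the union's entropy
# is the vertex-weighted average (2*1 + 4*2) / 6 = 5/3 of the parts' entropies.
n1, n2 = 2, 4
h_union = (n1 * log2(n1) + n2 * log2(n2)) / (n1 + n2)
```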

Additionally, simple formulas exist for certain classes of graphs.

  • Complete balanced k-partite graphs have entropy \log_2 k. In particular,
    • Edge-less graphs have entropy 0.
    • Complete graphs on n vertices have entropy \log_2 n.
    • Complete balanced bipartite graphs have entropy 1.
  • Complete bipartite graphs with n vertices in one partition and m in the other have entropy H\left(\frac{n}{m+n}\right), where H is the binary entropy function.
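These closed forms are straightforward to evaluate directly. A small sketch (function names are illustrative), with the binary entropy function written out explicitly:

```python
from math import log2

def binary_entropy(p):
    """H(p) = -p log2(p) - (1-p) log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def complete_multipartite_entropy(k):
    """Entropy of a complete balanced k-partite graph: log2(k)."""
    return log2(k)

def complete_bipartite_entropy(n, m):
    """Entropy of the complete bipartite graph K_{n,m}: H(n / (n + m))."""
    return binary_entropy(n / (n + m))
```

The balanced case m = n recovers binary_entropy(1/2) = 1, consistent with the entropy of complete balanced bipartite graphs above, and k = 1 recovers the edge-less case with entropy 0.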


Example
Here, we use properties of graph entropy to provide a simple proof that a complete graph G on n vertices cannot be expressed as the union of fewer than \log_2 n bipartite graphs.

Proof. By monotonicity, no bipartite graph can have graph entropy greater than that of a complete bipartite graph, which is bounded by 1. Thus, by sub-additivity, the union of k bipartite graphs cannot have entropy greater than k. Now let G = (V, E) be a complete graph on n vertices. By the properties listed above, H(G) = \log_2 n. Therefore, the union of fewer than \log_2 n bipartite graphs cannot have the same entropy as G, so G cannot be expressed as such a union. \blacksquare
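The \log_2 n bound is in fact tight: labeling the vertices with binary strings and, for each bit position, taking the complete bipartite graph between vertices whose labels differ in that bit, covers every edge of K_n with \lceil \log_2 n \rceil bipartite graphs. A sketch of this standard construction (not part of the original proof):

```python
from itertools import combinations
from math import ceil, log2

def bipartite_cover(n):
    """Cover the edges of K_n with ceil(log2 n) bipartite graphs:
    graph i joins the vertices whose binary labels differ in bit i."""
    bits = max(1, ceil(log2(n)))
    covers = []
    for i in range(bits):
        left = [v for v in range(n) if (v >> i) & 1 == 0]
        right = [v for v in range(n) if (v >> i) & 1 == 1]
        covers.append({(u, v) for u in left for v in right})
    return covers

# Every edge {u, v} of K_n is covered: two distinct labels differ in
# at least one bit position.
```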

