Commit 5d76bbb1 authored by Tiago Peixoto

Assorted fixes to the documentation

parent b487971b
......@@ -40,7 +40,7 @@ templates_path = ['.templates']
source_suffix = '.rst'
# The encoding of source files.
#source_encoding = 'utf-8'
source_encoding = 'utf-8'
# The master toctree document.
master_doc = 'index'
......
......@@ -15,6 +15,7 @@
.. autofunction:: graph_draw
.. autofunction:: graphviz_draw
.. autofunction:: prop_to_size
Low-level graph drawing
......
......@@ -187,6 +187,8 @@
.. autofunction:: load_graph
.. autofunction:: group_vector_property
.. autofunction:: ungroup_vector_property
.. autofunction:: infect_vertex_property
.. autofunction:: edge_difference
.. autofunction:: value_types
.. autofunction:: show_config
......
......@@ -33,7 +33,7 @@ v_age[v] = 0
vlist = [v]
# let's now add the new edges and vertices
for i in xrange(1, N):
for i in range(1, N):
# create our new vertex
v = g.add_vertex()
v_age[v] = i
......@@ -58,11 +58,11 @@ for i in xrange(1, N):
v = g.vertex(randint(0, g.num_vertices()))
while True:
print "vertex:", v, "in-degree:", v.in_degree(), "out-degree:",\
v.out_degree(), "age:", v_age[v]
print("vertex:", v, "in-degree:", v.in_degree(), "out-degree:",\
v.out_degree(), "age:", v_age[v])
if v.out_degree() == 0:
print "Nowhere else to go... We found the main hub!"
print("Nowhere else to go... We found the main hub!")
break
n_list = []
......
......@@ -10,6 +10,11 @@ The module must be of course imported before it can be used. The package is
subdivided into several sub-modules. To import everything from all of them, one
can do:
.. testsetup::
np.random.seed(42)
gt.seed_rng(42)
.. doctest::
>>> from graph_tool.all import *
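A minimal sketch of the alternative, namespaced import used by the docstring examples in the reference documentation (the alias ``gt`` is merely a common convention, not a requirement):

.. code-block:: python

    import graph_tool.all as gt   # everything reachable as gt.Graph, gt.graph_draw, ...

    g = gt.Graph()                # an empty, directed graph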
......@@ -87,6 +92,13 @@ visualize the graph we created so far with the
... output_size=(200, 200), output="two-nodes.pdf")
<...>
.. doctest::
:hide:
graph_draw(g, vertex_text=g.vertex_index, vertex_font_size=18,
output_size=(200, 200), output="two-nodes.png")
.. figure:: two-nodes.*
:align: center
......@@ -410,19 +422,20 @@ Graph I/O
Graphs can be saved and loaded in three formats: `graphml
<http://graphml.graphdrawing.org/>`_, `dot
<http://www.graphviz.org/doc/info/lang.html>`_ and
`gml <http://www.fim.uni-passau.de/en/fim/faculty/chairs/theoretische-informatik/projects.html>`_.
<http://www.graphviz.org/doc/info/lang.html>`_ and `gml
<http://www.fim.uni-passau.de/en/fim/faculty/chairs/theoretische-informatik/projects.html>`_.
``Graphml`` is the default and preferred format, since it is by far the
most complete. The ``dot`` and ``gml`` formats are fully supported, but
since they contain no precise type information, all properties are read
as strings (or also as double, in the case of ``gml``), and must be
converted per hand. Therefore you should always use graphml, except when
interfacing with other software, or existing data, which uses ``dot`` or
``gml``.
converted by hand. Therefore you should always use graphml, since it
provides an exact bit-for-bit representation of all supported
:ref:`sec_property_maps`, except when interfacing with other software or
with existing data that uses ``dot`` or ``gml``.
A graph can be saved or loaded to a file with the :attr:`~graph_tool.Graph.save`
and :attr:`~graph_tool.Graph.load` methods, which take either a file name or a
file-like object. A graph can also be loaded from disk with the
file-like object. A graph can also be loaded from disc with the
:func:`~graph_tool.load_graph` function, as such:
.. doctest::
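A minimal sketch of the save/load round trip described above, assuming a graph ``g`` already exists in the current session (the file name is an arbitrary example):

.. code-block:: python

    g.save("my_graph.xml.gz")           # graphml by default; the .gz suffix compresses on the fly
    g2 = load_graph("my_graph.xml.gz")  # read it back with the module-level function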
......@@ -439,12 +452,21 @@ Graph classes can also be pickled with the :mod:`pickle` module.
An Example: Building a Price Network
------------------------------------
A Price network is the first known model of a "scale-free" graph, invented in
1976 by `de Solla Price
A Price network is the first known model of a "scale-free" graph,
invented in 1976 by `de Solla Price
<http://en.wikipedia.org/wiki/Derek_J._de_Solla_Price>`_. It is defined
dynamically, where at each time step a new vertex is added to the graph, and
connected to an old vertex, with probability proportional to its in-degree. The
following program implements this construction using ``graph-tool``.
dynamically, where at each time step a new vertex is added to the graph,
and connected to an old vertex, with probability proportional to its
in-degree. The following program implements this construction using
``graph-tool``.
.. note::

   It would be much faster to use the
   :func:`~graph_tool.generation.price_network` function, which is
   implemented in C++, as opposed to the script below, which is in pure
   Python. The code below is merely a demonstration of how to use the
   library.
.. literalinclude:: price.py
:linenos:
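For comparison, a minimal sketch of the C++-backed generator mentioned in the note above (the size ``N = 100000`` is an arbitrary choice):

.. code-block:: python

    N = 100000
    g = price_network(N)                    # same growth process, implemented in C++
    print(g.num_vertices(), g.num_edges())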
......@@ -488,8 +510,9 @@ use the :func:`~graph_tool.draw.graph_draw` function.
g = load_graph("price.xml.gz")
age = g.vertex_properties["age"]
graph_draw(g, output_size=(1000, 1000), vertex_color=age,
vertex_fill_color=age, vertex_size=2.5, edge_pen_width=1.5,
vertex_fill_color=age, vertex_size=1, edge_pen_width=1.2,
output="price.png")
.. figure:: price.*
......@@ -535,8 +558,14 @@ edge filtering.
g, pos = triangulation(random((500, 2)) * 4, type="delaunay")
tree = min_spanning_tree(g)
graph_draw(g, pos=pos, edge_color=tree, output="min_tree.pdf")
.. testcode::
:hide:
graph_draw(g, pos=pos, edge_color=tree, output_size=(400, 400),
output="min_tree.pdf")
output="min_tree.png")
The ``tree`` property map has a bool type, with value "1" if the edge belongs to
the tree, and "0" otherwise. Below is an image of the original graph, with the
......@@ -549,8 +578,13 @@ We can now filter out the edges which don't belong to the minimum spanning tree.
.. testcode::
g.set_edge_filter(tree)
graph_draw(g, pos=pos, output_size=(400, 400), output="min_tree_filtered.pdf")
g.set_edge_filter(tree)
graph_draw(g, pos=pos, output="min_tree_filtered.pdf")
.. testcode::
:hide:
graph_draw(g, pos=pos, output_size=(400, 400), output="min_tree_filtered.png")
This is how the graph looks when filtered:
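A minimal sketch of what the filter does to the graph's topology, assuming the ``g`` and ``tree`` objects from the snippet above:

.. code-block:: python

    g.set_edge_filter(tree)   # only edges with tree[e] == True remain visible
    print(g.num_edges())      # edge count of the spanning tree

    g.set_edge_filter(None)   # remove the filter; the full graph is back
    print(g.num_edges())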
......@@ -567,7 +601,13 @@ and draws them as colors and line thickness in the graph.
bv, be = betweenness(g)
be.a /= be.a.max() / 5
graph_draw(g, pos=pos, vertex_fill_color=bv, edge_pen_width=be,
output_size=(400, 400), output="filtered-bt.pdf")
output="filtered-bt.pdf")
.. testcode::
:hide:
graph_draw(g, pos=pos, vertex_fill_color=bv, edge_pen_width=be,
output_size=(400, 400), output="filtered-bt.png")
.. figure:: filtered-bt.*
:align: center
......@@ -581,7 +621,13 @@ The original graph can be recovered by setting the edge filter to ``None``.
bv, be = betweenness(g)
be.a /= be.a.max() / 5
graph_draw(g, pos=pos, vertex_fill_color=bv, edge_pen_width=be,
output_size=(400, 400), output="nonfiltered-bt.pdf")
output="nonfiltered-bt.pdf")
.. testcode::
:hide:
graph_draw(g, pos=pos, vertex_fill_color=bv, edge_pen_width=be,
output_size=(400, 400), output="nonfiltered-bt.png")
.. figure:: nonfiltered-bt.*
:align: center
......@@ -637,10 +683,15 @@ Like above, the result should be the isolated minimum spanning tree:
>>> bv, be = betweenness(tv)
>>> be.a /= be.a.max() / 5
>>> graph_draw(tv, pos=pos, vertex_fill_color=bv,
... edge_pen_width=be, output_size=(400, 400),
... output="mst-view.pdf")
... edge_pen_width=be, output="mst-view.pdf")
<...>
.. testcode::
:hide:
graph_draw(tv, pos=pos, vertex_fill_color=bv,
edge_pen_width=be, output_size=(400, 400),
output="mst-view.png")
.. figure:: mst-view.*
:align: center
......@@ -683,10 +734,16 @@ The graph view constructed above can be visualized as
.. doctest::
>>> be.a /= be.a.max() / 5
>>> graph_draw(u, pos=pos, vertex_fill_color=bv, output_size=(400, 400),
... output="central-edges-view.pdf")
>>> graph_draw(u, pos=pos, vertex_fill_color=bv, output="central-edges-view.pdf")
<...>
.. testcode::
:hide:
graph_draw(u, pos=pos, vertex_fill_color=bv, output_size=(400, 400),
output="central-edges-view.png")
.. figure:: central-edges-view.*
:align: center
......@@ -711,9 +768,14 @@ The resulting graph view can be visualized as
.. doctest::
>>> graph_draw(u, pos=pos, output_size=(400, 400), output="composed-filter.pdf")
>>> graph_draw(u, pos=pos, output="composed-filter.pdf")
<...>
.. testcode::
:hide:
graph_draw(u, pos=pos, output_size=(400, 400), output="composed-filter.png")
.. figure:: composed-filter.*
:align: center
......
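A minimal sketch of the kind of graph-view composition used in this part of the tutorial, recreating a filtered spanning tree from the ``tree`` map defined earlier (the vertex predicate is an illustrative choice):

.. code-block:: python

    u = GraphView(g, efilt=tree)                          # keep only spanning-tree edges
    u = GraphView(u, vfilt=lambda v: v.out_degree() > 0)  # on top of that, keep vertices with at least one remaining edge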
......@@ -37,6 +37,8 @@ Summary
load_graph
group_vector_property
ungroup_vector_property
infect_vertex_property
edge_difference
value_types
show_config
......@@ -797,7 +799,7 @@ def infect_vertex_property(g, prop, vals=None):
Returns
-------
None
None : ``None``
Examples
--------
......@@ -822,8 +824,7 @@ def edge_difference(g, prop, ediff=None):
prop : :class:`~graph_tool.PropertyMap`
Vertex property map to be used to compute the difference.
ediff : :class:`~graph_tool.PropertyMap` (optional, default: `None`)
If not provided, the difference values will be stored in this property
map.
If provided, the difference values will be stored in this property map.
Returns
-------
......
......@@ -112,12 +112,16 @@ def local_clustering(g, prop=None, undirected=True):
Examples
--------
>>> from numpy.random import seed
>>> seed(42)
.. testcode::
:hide:
np.random.seed(42)
gt.seed_rng(42)
>>> g = gt.random_graph(1000, lambda: (5,5))
>>> clust = gt.local_clustering(g)
>>> print(gt.vertex_average(g, clust))
(0.00908888888888889, 0.0004449824521439575)
(0.008622222222222222, 0.00043812507374825467)
References
----------
......@@ -170,11 +174,15 @@ def global_clustering(g):
Examples
--------
>>> from numpy.random import seed
>>> seed(42)
.. testcode::
:hide:
np.random.seed(42)
gt.seed_rng(42)
>>> g = gt.random_graph(1000, lambda: (5,5))
>>> print(gt.global_clustering(g))
(0.009114059777509717, 0.0004464454368899158)
(0.008641479099678457, 0.00043945639266115854)
References
----------
......@@ -242,18 +250,22 @@ def extended_clustering(g, props=None, max_depth=3, undirected=False):
Examples
--------
>>> from numpy.random import seed
>>> seed(42)
.. testcode::
:hide:
np.random.seed(42)
gt.seed_rng(42)
>>> g = gt.random_graph(1000, lambda: (5,5))
>>> clusts = gt.extended_clustering(g, max_depth=5)
>>> for i in range(0, 5):
... print(gt.vertex_average(g, clusts[i]))
...
(0.0058850000000000005, 0.0004726257592782405)
(0.026346666666666668, 0.0009562588213100747)
(0.11638833333333333, 0.002086419787711849)
(0.3862533333333333, 0.003020064612995335)
(0.44685499999999995, 0.003124572962377774)
(0.005646666666666667, 0.000485653786148116)
(0.023786666666666668, 0.000912204314345811)
(0.11883, 0.0019560612322840113)
(0.4067333333333333, 0.0031162452478013408)
(0.4260333333333333, 0.003097392092999815)
References
----------
......@@ -319,14 +331,18 @@ def motifs(g, k, p=1.0, motif_list=None):
Examples
--------
>>> from numpy.random import seed
>>> seed(42)
.. testcode::
:hide:
np.random.seed(42)
gt.seed_rng(42)
>>> g = gt.random_graph(1000, lambda: (5,5))
>>> motifs, counts = gt.motifs(gt.GraphView(g, directed=False), 4)
>>> print(len(motifs))
11
10
>>> print(counts)
[115104, 389090, 724, 820, 1828, 3208, 791, 4, 12, 12, 3]
[115392, 389974, 668, 761, 3056, 1698, 770, 4, 10, 7]
References
......@@ -375,15 +391,13 @@ def motifs(g, k, p=1.0, motif_list=None):
list_hist = list(zip(sub_list, hist))
# sort according to in-degree sequence
list_hist.sort(lambda x, y: cmp(sorted([v.in_degree() for v in x[0].vertices()]),
sorted([v.in_degree() for v in y[0].vertices()])))
list_hist.sort(key=lambda x: sorted([v.in_degree() for v in x[0].vertices()]))
# sort according to out-degree sequence
list_hist.sort(lambda x, y: cmp(sorted([v.out_degree() for v in x[0].vertices()]),
sorted([v.out_degree() for v in y[0].vertices()])))
list_hist.sort(key=lambda x: sorted([v.out_degree() for v in x[0].vertices()]))
# sort according to ascending number of edges
list_hist.sort(lambda x, y: cmp(x[0].num_edges(), y[0].num_edges()))
list_hist.sort(key=lambda x: x[0].num_edges())
sub_list = [x[0] for x in list_hist]
hist = [x[1] for x in list_hist]
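# Note: the three consecutive key-based sorts above rely on Python's sort
# being stable -- the last sort supplies the primary ordering (number of
# edges) and the earlier ones only break ties, reproducing the behaviour of
# the old cmp-based version.  A small standalone illustration with made-up
# tuples (not library data):
#
#     items = [(2, "b"), (1, "a"), (2, "a"), (1, "b")]
#     items.sort(key=lambda x: x[1])   # secondary key
#     items.sort(key=lambda x: x[0])   # primary key
#     assert items == [(1, "a"), (1, "b"), (2, "a"), (2, "b")]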
......@@ -487,9 +501,10 @@ def motif_significance(g, k, n_shuffles=100, p=1.0, motif_list=None,
>>> g = gt.random_graph(100, lambda: (3,3))
>>> motifs, zscores = gt.motif_significance(g, 3)
>>> print(len(motifs))
11
12
>>> print(zscores)
[0.014875553792545083, 0.016154998074953769, 0.002455801898331304, -1.9579019397305546, 0.83542298414538518, 0.84715258999068244, -0.93385230436820643, -0.11, -0.1, -0.31, -0.14]
[-0.26668839861225141, -0.37627454937420612, 0.63246648717833609, 1.9856284046854835, -0.60389512777130483, -0.35502673716019983, -1.2236332765203428, 0.87, -0.11, -0.47, -0.2, -0.01]
"""
s_ms, counts = motifs(g, k, p, motif_list)
......@@ -534,19 +549,13 @@ def motif_significance(g, k, n_shuffles=100, p=1.0, motif_list=None,
list_hist = list(zip(s_ms, s_counts, s_dev))
# sort according to in-degree sequence
list_hist.sort(lambda x, y: cmp(sorted([v.in_degree()\
for v in x[0].vertices()]),
sorted([v.in_degree()\
for v in y[0].vertices()])))
list_hist.sort(key=lambda x: sorted([v.in_degree() for v in x[0].vertices()]))
# sort according to out-degree sequence
list_hist.sort(lambda x, y: cmp(sorted([v.out_degree()\
for v in x[0].vertices()]),
sorted([v.out_degree()\
for v in y[0].vertices()])))
list_hist.sort(key=lambda x: sorted([v.out_degree() for v in x[0].vertices()]))
# sort according to ascending number of edges
list_hist.sort(lambda x, y: cmp(x[0].num_edges(), y[0].num_edges()))
list_hist.sort(key=lambda x: x[0].num_edges())
s_ms, s_counts, s_dev = list(zip(*list_hist))
......@@ -556,4 +565,3 @@ def motif_significance(g, k, n_shuffles=100, p=1.0, motif_list=None,
return s_ms, zscore, counts, s_counts, s_dev
else:
return s_ms, zscore
......@@ -823,7 +823,7 @@ def mcmc_sweep(state, beta=1., sequential=True, verbose=False, vertices=None):
Carnegie-Mellon University, Pittsburgh, PA 15213, U.S.A., :doi:`10.1016/0378-8733(83)90021-7`
.. [faust-blockmodels-1992] Katherine Faust, and Stanley
Wasserman. "Blockmodels: Interpretation and Evaluation." Social Networks
14, no. 1–2 (1992): 5–61. :doi:`10.1016/0378-8733(92)90013-W`
14, no. 1-2 (1992): 5-61. :doi:`10.1016/0378-8733(92)90013-W`
.. [karrer-stochastic-2011] Brian Karrer, and M. E. J. Newman. "Stochastic
Blockmodels and Community Structure in Networks." Physical Review E 83,
no. 1 (2011): 016107. :doi:`10.1103/PhysRevE.83.016107`.
......@@ -1146,7 +1146,7 @@ def minimize_blockmodel_dl(g, deg_corr=True, nsweeps=100, adaptive_convergence=T
Carnegie-Mellon University, Pittsburgh, PA 15213, U.S.A., :doi:`10.1016/0378-8733(83)90021-7`
.. [faust-blockmodels-1992] Katherine Faust, and Stanley
Wasserman. "Blockmodels: Interpretation and Evaluation." Social Networks
14, no. 1–2 (1992): 5–61. :doi:`10.1016/0378-8733(92)90013-W`
14, no. 1-2 (1992): 5-61. :doi:`10.1016/0378-8733(92)90013-W`
.. [karrer-stochastic-2011] Brian Karrer, and M. E. J. Newman. "Stochastic
Blockmodels and Community Structure in Networks." Physical Review E 83,
no. 1 (2011): 016107. :doi:`10.1103/PhysRevE.83.016107`.
......
......@@ -97,12 +97,17 @@ def assortativity(g, deg):
Examples
--------
>>> from numpy.random import randint, random, seed
>>> seed(42)
.. testcode::
:hide:
np.random.seed(42)
gt.seed_rng(42)
from pylab import *
>>> def sample_k(max):
... accept = False
... while not accept:
... k = randint(1,max+1)
... k = np.random.randint(1,max+1)
... accept = random() < 1.0/k
... return k
...
......@@ -110,7 +115,7 @@ def assortativity(g, deg):
... lambda i,k: 1.0 / (1 + abs(i - k)), directed=False,
... mix_time=100)
>>> gt.assortativity(g, "out")
(0.14145218664992676, 0.005077209994557802)
(0.14282704866231305, 0.005109451062660124)
References
----------
......@@ -168,25 +173,30 @@ def scalar_assortativity(g, deg):
Examples
--------
>>> from numpy.random import randint, random, seed
>>> seed(42)
.. testcode::
:hide:
np.random.seed(42)
gt.seed_rng(42)
from pylab import *
>>> def sample_k(max):
... accept = False
... while not accept:
... k = randint(1,max+1)
... k = np.random.randint(1,max+1)
... accept = random() < 1.0/k
... return k
...
>>> g = gt.random_graph(1000, lambda: sample_k(40), lambda i,k: abs(i-k),
... directed=False, mix_time=100)
>>> gt.scalar_assortativity(g, "out")
(-0.46972665544654923, 0.010035656615797507)
(-0.43719843848745943, 0.010593923895499584)
>>> g = gt.random_graph(1000, lambda: sample_k(40),
... lambda i, k: 1.0 / (1 + abs(i - k)),
... directed=False, mix_time=100)
>>> gt.scalar_assortativity(g, "out")
(0.6120658464996896, 0.011388445161055338)
(0.6018887530895891, 0.011474042583027698)
References
----------
......@@ -255,13 +265,17 @@ def corr_hist(g, deg_source, deg_target, bins=[[0, 1], [0, 1]], weight=None,
Examples
--------
>>> from numpy.random import randint, random, seed
>>> from pylab import *
>>> seed(42)
.. testcode::
:hide:
np.random.seed(42)
gt.seed_rng(42)
from pylab import *
>>> def sample_k(max):
... accept = False
... while not accept:
... k = randint(1,max+1)
... k = np.random.randint(1,max+1)
... accept = random() < 1.0/k
... return k
...
......@@ -280,6 +294,11 @@ def corr_hist(g, deg_source, deg_target, bins=[[0, 1], [0, 1]], weight=None,
<...>
>>> savefig("corr.pdf")
.. testcode::
:hide:
savefig("corr.png")
.. figure:: corr.*
:align: center
......@@ -342,9 +361,13 @@ def combined_corr_hist(g, deg1, deg2, bins=[[0, 1], [0, 1]], float_count=True):
Examples
--------
>>> from numpy.random import randint, random, seed
>>> from pylab import *
>>> seed(42)
.. testcode::
:hide:
np.random.seed(42)
gt.seed_rng(42)
from pylab import *
>>> def sample_k(max):
... accept = False
... while not accept:
......@@ -366,6 +389,11 @@ def combined_corr_hist(g, deg1, deg2, bins=[[0, 1], [0, 1]], float_count=True):
<...>
>>> savefig("combined_corr.pdf")
.. testcode::
:hide:
savefig("combined_corr.pdf")
.. figure:: combined_corr.*
:align: center
......@@ -433,9 +461,13 @@ def avg_neighbour_corr(g, deg_source, deg_target, bins=[0, 1], weight=None):
Examples
--------
>>> from numpy.random import randint, random, seed
>>> from pylab import *
>>> seed(42)
.. testcode::
:hide:
np.random.seed(42)
gt.seed_rng(42)
from pylab import *
>>> def sample_k(max):
... accept = False
... while not accept:
......@@ -456,6 +488,11 @@ def avg_neighbour_corr(g, deg_source, deg_target, bins=[0, 1], weight=None):
<...>
>>> savefig("avg_corr.pdf")
.. testcode::
:hide:
savefig("avg_corr.png")
.. figure:: avg_corr.*
:align: center
......@@ -511,9 +548,13 @@ def avg_combined_corr(g, deg1, deg2, bins=[0, 1]):
Examples
--------
>>> from numpy.random import randint, random, seed
>>> from pylab import *
>>> seed(42)
.. testcode::
:hide:
np.random.seed(42)
gt.seed_rng(42)
from pylab import *
>>> def sample_k(max):
... accept = False
... while not accept:
......@@ -533,6 +574,11 @@ def avg_combined_corr(g, deg1, deg2, bins=[0, 1]):
<...>
>>> savefig("combined_avg_corr.pdf")
.. testcode::
:hide:
savefig("combined_avg_corr.png")
.. figure:: combined_avg_corr.*
:align: center
......
......@@ -114,13 +114,17 @@ def random_layout(g, shape=None, pos=None, dim=2):
Examples
--------
>>> from numpy.random import seed
>>> seed(42)
.. testcode::
:hide:
np.random.seed(42)
gt.seed_rng(42)
>>> g = gt.random_graph(100, lambda: (3, 3))
>>> shape = [[50, 100], [1, 2], 4]
>>> pos = gt.random_layout(g, shape=shape, dim=3)
>>> pos[g.vertex(0)].a
array([ 86.59969709, 1.31435598, 0.64651486])
array([ 68.72700594, 1.03142919, 2.56812658])
"""
......@@ -197,13 +201,22 @@ def fruchterman_reingold_layout(g, weight=None, a=None, r=1., scale=None,
Examples
--------
>>> from numpy.random import seed, zipf
>>> seed(42)
.. testcode::
:hide:
np.random.seed(42)
gt.seed_rng(42)
>>> g = gt.price_network(300)
>>> pos = gt.fruchterman_reingold_layout(g, n_iter=1000)
>>> gt.graph_draw(g, pos=pos, output="graph-draw-fr.pdf")
<...>