Commit ab4b82b5 by Tiago Peixoto

### Update docstring tests

parent e39910a5
Pipeline #449 failed, in 820 minutes and 26 seconds
```diff
@@ -245,9 +245,9 @@ Therefore, we can compute the posterior odds ratio between both models as:
 .. testoutput:: food-web
    :options: +NORMALIZE_WHITESPACE

-   ln Λ: -70.145685...
+   ln Λ: -24.246389...

-A value of :math:`\Lambda \approx \mathrm{e}^{-70} \approx 10^{-30}` in
+A value of :math:`\Lambda \approx \mathrm{e}^{-24} \approx 10^{-10}` in
 favor the exponential model indicates that the log-normal model does not
 provide a better fit for this particular data. Based on this, we conclude
 that the exponential model should be preferred in this case.
```
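The conversion from the natural-log odds ratio to the order of magnitude quoted in this hunk can be checked in a couple of lines (the `ln Λ` value below is the updated one from the test output; the decimal exponent is just ln Λ / ln 10):

```python
import math

# Updated log odds ratio from the food-web test output above
ln_lambda = -24.246389

# Express Λ = e^{ln Λ} as a power of ten: log10(Λ) = ln Λ / ln 10
log10_lambda = ln_lambda / math.log(10)
print(round(log10_lambda, 2))  # -10.53, i.e. Λ ≈ 10^{-10} as stated
```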
```diff
@@ -134,7 +134,7 @@ illustrate its use with the neural network of the C. elegans
 .. testsetup:: celegans

-   gt.seed_rng(47)
+   gt.seed_rng(51)

 .. testcode:: celegans
@@ -186,9 +186,9 @@ which shows the number of nodes and groups in all levels:
 .. testoutput:: celegans

-   l: 0, N: 297, B: 17
-   l: 1, N: 17, B: 9
-   l: 2, N: 9, B: 3
+   l: 0, N: 297, B: 16
+   l: 1, N: 16, B: 8
+   l: 2, N: 8, B: 3
    l: 3, N: 3, B: 1

 The hierarchical levels themselves are represented by individual
@@ -203,10 +203,10 @@ The hierarchical levels themselves are represented by individual
 .. testoutput:: celegans

-   , at 0x...>
-   , at 0x...>
-   , at 0x...>
-   , at 0x...>
+   , at 0x...>
+   , at 0x...>
+   , at 0x...>
+   , at 0x...>

 This means that we can inspect the hierarchical partition just as before:
@@ -221,6 +221,6 @@ This means that we can inspect the hierarchical partition just as before:
 .. testoutput:: celegans

    7 0 2 1 0
```
```diff
@@ -154,8 +154,8 @@ evidence efficiently, as we show below, using
 .. testoutput:: model-evidence

-   Model evidence for deg_corr = True: -569.590426... (mean field), -817.788531... (Bethe)
-   Model evidence for deg_corr = False: -587.028530... (mean field), -736.990655... (Bethe)
+   Model evidence for deg_corr = True: -579.300446... (mean field), -832.245049... (Bethe)
+   Model evidence for deg_corr = False: -586.652245... (mean field), -737.721423... (Bethe)

 If we consider the more accurate approximation, the outcome shows a
 preference for the non-degree-corrected model.
@@ -219,8 +219,8 @@ approach for the same network, using the nested model.
 .. testoutput:: model-evidence

-   Model evidence for deg_corr = True: -551.228195... (mean field), -740.460493... (Bethe)
-   Model evidence for deg_corr = False: -544.660366... (mean field), -649.135026... (Bethe)
+   Model evidence for deg_corr = True: -555.768070... (mean field), -731.501041... (Bethe)
+   Model evidence for deg_corr = False: -544.346500... (mean field), -630.951518... (Bethe)

 The results are similar: if we consider the most accurate approximation, the
 non-degree-corrected model possesses the largest evidence. Note also
```
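Picking the preferred model from outputs like these is simply an argmax over the log-evidences; a minimal sketch using the updated Bethe values from the nested-model hunk above:

```python
# ln-evidence (Bethe approximation) per deg_corr value, from the updated output
evidence = {True: -731.501041, False: -630.951518}

# The larger (less negative) log-evidence wins
best = max(evidence, key=evidence.get)
print(best)  # False: the non-degree-corrected model is preferred
```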
```diff
@@ -25,8 +25,8 @@ we have
 .. testoutput:: model-selection
    :options: +NORMALIZE_WHITESPACE

-   Non-degree-corrected DL: 8456.994339...
-   Degree-corrected DL: 8233.850036...
+   Non-degree-corrected DL: 8524.911216...
+   Degree-corrected DL: 8274.075603...

 Since it yields the smallest description length, the degree-corrected fit
 should be preferred. The statistical significance of the choice can
@@ -52,12 +52,12 @@ fits. In our particular case, we have
 .. testoutput:: model-selection
    :options: +NORMALIZE_WHITESPACE

-   ln Λ: -223.144303...
+   ln Λ: -250.835612...

 The precise threshold that should be used to decide when to reject a
 hypothesis is subjective and context-dependent, but the value above implies that the
-particular degree-corrected fit is around :math:`\mathrm{e}^{233} \approx 10^{96}`
+particular degree-corrected fit is around :math:`\mathrm{e}^{251} \approx 10^{109}`
 times more likely than the non-degree corrected one, and hence it can be
 safely concluded that it provides a substantially better fit.
@@ -79,11 +79,11 @@ example, for the American football network above, we have:
 .. testoutput:: model-selection
    :options: +NORMALIZE_WHITESPACE

-   Non-degree-corrected DL: 1734.814739...
+   Non-degree-corrected DL: 1733.525685...
    Degree-corrected DL: 1780.576716...
-   ln Λ: -45.761977...
+   ln Λ: -47.051031...

-Hence, with a posterior odds ratio of :math:`\Lambda \approx \mathrm{e}^{-45} \approx 10^{-19}` in
-favor of the non-degree-corrected model, it seems like the
+Hence, with a posterior odds ratio of :math:`\Lambda \approx \mathrm{e}^{-47} \approx 10^{-20}` in
+favor of the non-degree-corrected model, we conclude that the
 degree-corrected variant is an unnecessarily complex description for this network.
```
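Up to rounding, the `ln Λ` reported in this hunk is just the difference between the two description lengths, and the e-to-10 exponent conversion follows from dividing by ln 10. A quick arithmetic check with the updated values:

```python
import math

dl_ndc = 8524.911216  # non-degree-corrected DL (updated value above)
dl_dc = 8274.075603   # degree-corrected DL (updated value above)

# ln Λ is the (negative of the) description-length difference in favour
# of the degree-corrected model
ln_lambda = dl_dc - dl_ndc
print(round(ln_lambda, 6))               # -250.835613
print(round(-ln_lambda / math.log(10)))  # 109, i.e. e^{251} ≈ 10^{109}
```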
```diff
@@ -185,9 +185,9 @@ Which yields the following output:
 .. testoutput:: measured

-   Posterior probability of edge (11, 36): 0.801980...
-   Posterior probability of non-edge (15, 73): 0.097309...
-   Estimated average local clustering: 0.572154 ± 0.004853...
+   Posterior probability of edge (11, 36): 0.812881...
+   Posterior probability of non-edge (15, 73): 0.160516...
+   Estimated average local clustering: 0.57309 ± 0.005985...

 We have a successful reconstruction, where both ambiguous adjacency matrix
 entries are correctly recovered. The value for the average
@@ -306,9 +306,9 @@ Which yields:
 .. testoutput:: measured

-   Posterior probability of edge (11, 36): 0.790179...
-   Posterior probability of non-edge (15, 73): 0.109010...
-   Estimated average local clustering: 0.572504 ± 0.005453...
+   Posterior probability of edge (11, 36): 0.693369...
+   Posterior probability of non-edge (15, 73): 0.170517...
+   Estimated average local clustering: 0.570545 ± 0.006892...

 The results are very similar to the ones obtained with the uniform model in
 this case, but can be quite different in situations where a large
@@ -434,9 +434,9 @@ The above yields the output:
 .. testoutput:: uncertain

-   Posterior probability of edge (11, 36): 0.950495...
-   Posterior probability of non-edge (15, 73): 0.067406...
-   Estimated average local clustering: 0.552333 ± 0.019183...
+   Posterior probability of edge (11, 36): 0.881188...
+   Posterior probability of non-edge (15, 73): 0.043004...
+   Estimated average local clustering: 0.557825 ± 0.014038...

 The reconstruction is accurate, despite the two ambiguous entries having the
 same measurement probability. The reconstructed network is visualized below.
```
```diff
@@ -54,8 +54,8 @@ random partition into 20 groups
 .. testoutput:: model-averaging

-   Change in description length: -365.317522...
-   Number of accepted vertex moves: 38213
+   Change in description length: -353.848032...
+   Number of accepted vertex moves: 37490

 .. note::
@@ -78,8 +78,8 @@ random partition into 20 groups
 .. testoutput:: model-averaging

-   Change in description length: 1.660677...
-   Number of accepted vertex moves: 40461
+   Change in description length: 31.622518...
+   Number of accepted vertex moves: 43152

 Although the above is sufficient to implement model averaging, there is a
 convenience function called
@@ -100,51 +100,51 @@ will output:
 .. testoutput:: model-averaging
    :options: +NORMALIZE_WHITESPACE

-   niter: 1  count: 0  breaks: 0  min_S: 706.26857  max_S: 708.14483  S: 708.14483  ΔS: 1.87626  moves: 418
-   niter: 2  count: 0  breaks: 0  min_S: 699.23453  max_S: 708.14483  S: 699.23453  ΔS: -8.91030  moves: 409
-   niter: 3  count: 0  breaks: 0  min_S: 699.23453  max_S: 715.33531  S: 715.33531  ΔS: 16.1008  moves: 414
-   niter: 4  count: 0  breaks: 0  min_S: 699.23453  max_S: 723.13301  S: 723.13301  ΔS: 7.79770  moves: 391
-   niter: 5  count: 1  breaks: 0  min_S: 699.23453  max_S: 723.13301  S: 702.93354  ΔS: -20.1995  moves: 411
-   niter: 6  count: 2  breaks: 0  min_S: 699.23453  max_S: 723.13301  S: 706.39029  ΔS: 3.45675  moves: 389
-   niter: 7  count: 3  breaks: 0  min_S: 699.23453  max_S: 723.13301  S: 706.80859  ΔS: 0.418293  moves: 404
-   niter: 8  count: 4  breaks: 0  min_S: 699.23453  max_S: 723.13301  S: 707.61960  ΔS: 0.811010  moves: 417
-   niter: 9  count: 5  breaks: 0  min_S: 699.23453  max_S: 723.13301  S: 706.46577  ΔS: -1.15383  moves: 392
-   niter: 10  count: 6  breaks: 0  min_S: 699.23453  max_S: 723.13301  S: 714.34671  ΔS: 7.88094  moves: 410
-   niter: 11  count: 7  breaks: 0  min_S: 699.23453  max_S: 723.13301  S: 706.43194  ΔS: -7.91477  moves: 383
-   niter: 12  count: 8  breaks: 0  min_S: 699.23453  max_S: 723.13301  S: 705.19434  ΔS: -1.23760  moves: 405
-   niter: 13  count: 9  breaks: 0  min_S: 699.23453  max_S: 723.13301  S: 702.21395  ΔS: -2.98039  moves: 423
-   niter: 14  count: 0  breaks: 1  min_S: 715.54878  max_S: 715.54878  S: 715.54878  ΔS: 13.3348  moves: 400
-   niter: 15  count: 0  breaks: 1  min_S: 715.54878  max_S: 716.65842  S: 716.65842  ΔS: 1.10964  moves: 413
-   niter: 16  count: 0  breaks: 1  min_S: 701.19994  max_S: 716.65842  S: 701.19994  ΔS: -15.4585  moves: 382
-   niter: 17  count: 1  breaks: 1  min_S: 701.19994  max_S: 716.65842  S: 715.56997  ΔS: 14.3700  moves: 394
-   niter: 18  count: 0  breaks: 1  min_S: 701.19994  max_S: 719.25577  S: 719.25577  ΔS: 3.68580  moves: 404
-   niter: 19  count: 0  breaks: 1  min_S: 701.19994  max_S: 723.78811  S: 723.78811  ΔS: 4.53233  moves: 413
-   niter: 20  count: 1  breaks: 1  min_S: 701.19994  max_S: 723.78811  S: 709.77340  ΔS: -14.0147  moves: 387
-   niter: 21  count: 2  breaks: 1  min_S: 701.19994  max_S: 723.78811  S: 714.14891  ΔS: 4.37551  moves: 419
-   niter: 22  count: 3  breaks: 1  min_S: 701.19994  max_S: 723.78811  S: 722.05875  ΔS: 7.90984  moves: 399
-   niter: 23  count: 4  breaks: 1  min_S: 701.19994  max_S: 723.78811  S: 714.32503  ΔS: -7.73371  moves: 422
-   niter: 24  count: 5  breaks: 1  min_S: 701.19994  max_S: 723.78811  S: 708.53927  ΔS: -5.78576  moves: 392
-   niter: 25  count: 6  breaks: 1  min_S: 701.19994  max_S: 723.78811  S: 714.05889  ΔS: 5.51962  moves: 404
-   niter: 26  count: 7  breaks: 1  min_S: 701.19994  max_S: 723.78811  S: 713.93196  ΔS: -0.126937  moves: 414
-   niter: 27  count: 8  breaks: 1  min_S: 701.19994  max_S: 723.78811  S: 709.49863  ΔS: -4.43333  moves: 410
-   niter: 28  count: 9  breaks: 1  min_S: 701.19994  max_S: 723.78811  S: 707.42167  ΔS: -2.07696  moves: 397
-   niter: 29  count: 0  breaks: 1  min_S: 699.89982  max_S: 723.78811  S: 699.89982  ΔS: -7.52185  moves: 388
-   niter: 30  count: 0  breaks: 1  min_S: 698.57305  max_S: 723.78811  S: 698.57305  ΔS: -1.32677  moves: 391
-   niter: 31  count: 1  breaks: 1  min_S: 698.57305  max_S: 723.78811  S: 706.02629  ΔS: 7.45324  moves: 412
-   niter: 32  count: 2  breaks: 1  min_S: 698.57305  max_S: 723.78811  S: 701.97778  ΔS: -4.04852  moves: 421
-   niter: 33  count: 3  breaks: 1  min_S: 698.57305  max_S: 723.78811  S: 707.50134  ΔS: 5.52356  moves: 410
-   niter: 34  count: 4  breaks: 1  min_S: 698.57305  max_S: 723.78811  S: 708.56686  ΔS: 1.06552  moves: 424
-   niter: 35  count: 0  breaks: 1  min_S: 698.57305  max_S: 724.07361  S: 724.07361  ΔS: 15.5067  moves: 399
-   niter: 36  count: 1  breaks: 1  min_S: 698.57305  max_S: 724.07361  S: 723.51969  ΔS: -0.553915  moves: 384
-   niter: 37  count: 2  breaks: 1  min_S: 698.57305  max_S: 724.07361  S: 702.36708  ΔS: -21.1526  moves: 406
-   niter: 38  count: 3  breaks: 1  min_S: 698.57305  max_S: 724.07361  S: 707.60129  ΔS: 5.23420  moves: 405
-   niter: 39  count: 4  breaks: 1  min_S: 698.57305  max_S: 724.07361  S: 709.67542  ΔS: 2.07413  moves: 400
-   niter: 40  count: 5  breaks: 1  min_S: 698.57305  max_S: 724.07361  S: 714.52753  ΔS: 4.85212  moves: 398
-   niter: 41  count: 6  breaks: 1  min_S: 698.57305  max_S: 724.07361  S: 707.86563  ΔS: -6.66190  moves: 409
-   niter: 42  count: 7  breaks: 1  min_S: 698.57305  max_S: 724.07361  S: 718.80926  ΔS: 10.9436  moves: 400
-   niter: 43  count: 8  breaks: 1  min_S: 698.57305  max_S: 724.07361  S: 716.37312  ΔS: -2.43615  moves: 378
-   niter: 44  count: 9  breaks: 1  min_S: 698.57305  max_S: 724.07361  S: 713.76944  ΔS: -2.60368  moves: 399
-   niter: 45  count: 10  breaks: 2  min_S: 698.57305  max_S: 724.07361  S: 715.29009  ΔS: 1.52066  moves: 421
+   niter: 1  count: 0  breaks: 0  min_S: 703.94152  max_S: 730.97213  S: 703.94152  ΔS: -27.0306  moves: 431
+   niter: 2  count: 1  breaks: 0  min_S: 703.94152  max_S: 730.97213  S: 708.61840  ΔS: 4.67688  moves: 413
+   niter: 3  count: 2  breaks: 0  min_S: 703.94152  max_S: 730.97213  S: 704.60994  ΔS: -4.00847  moves: 416
+   niter: 4  count: 0  breaks: 0  min_S: 700.85336  max_S: 730.97213  S: 700.85336  ΔS: -3.75658  moves: 391
+   niter: 5  count: 1  breaks: 0  min_S: 700.85336  max_S: 730.97213  S: 713.22553  ΔS: 12.3722  moves: 387
+   niter: 6  count: 2  breaks: 0  min_S: 700.85336  max_S: 730.97213  S: 703.57357  ΔS: -9.65196  moves: 434
+   niter: 7  count: 3  breaks: 0  min_S: 700.85336  max_S: 730.97213  S: 715.02440  ΔS: 11.4508  moves: 439
+   niter: 8  count: 0  breaks: 0  min_S: 700.68857  max_S: 730.97213  S: 700.68857  ΔS: -14.3358  moves: 427
+   niter: 9  count: 1  breaks: 0  min_S: 700.68857  max_S: 730.97213  S: 717.95725  ΔS: 17.2687  moves: 409
+   niter: 10  count: 2  breaks: 0  min_S: 700.68857  max_S: 730.97213  S: 720.02079  ΔS: 2.06354  moves: 435
+   niter: 11  count: 3  breaks: 0  min_S: 700.68857  max_S: 730.97213  S: 718.15880  ΔS: -1.86199  moves: 399
+   niter: 12  count: 4  breaks: 0  min_S: 700.68857  max_S: 730.97213  S: 708.06732  ΔS: -10.0915  moves: 436
+   niter: 13  count: 5  breaks: 0  min_S: 700.68857  max_S: 730.97213  S: 712.76007  ΔS: 4.69274  moves: 432
+   niter: 14  count: 6  breaks: 0  min_S: 700.68857  max_S: 730.97213  S: 705.60582  ΔS: -7.15425  moves: 409
+   niter: 15  count: 7  breaks: 0  min_S: 700.68857  max_S: 730.97213  S: 704.37333  ΔS: -1.23249  moves: 434
+   niter: 16  count: 8  breaks: 0  min_S: 700.68857  max_S: 730.97213  S: 717.54492  ΔS: 13.1716  moves: 426
+   niter: 17  count: 9  breaks: 0  min_S: 700.68857  max_S: 730.97213  S: 715.05767  ΔS: -2.48725  moves: 449
+   niter: 18  count: 0  breaks: 1  min_S: 715.77940  max_S: 715.77940  S: 715.77940  ΔS: 0.721731  moves: 448
+   niter: 19  count: 0  breaks: 1  min_S: 708.38072  max_S: 715.77940  S: 708.38072  ΔS: -7.39868  moves: 447
+   niter: 20  count: 0  breaks: 1  min_S: 705.63447  max_S: 715.77940  S: 705.63447  ΔS: -2.74625  moves: 441
+   niter: 21  count: 1  breaks: 1  min_S: 705.63447  max_S: 715.77940  S: 707.01766  ΔS: 1.38319  moves: 434
+   niter: 22  count: 2  breaks: 1  min_S: 705.63447  max_S: 715.77940  S: 708.21127  ΔS: 1.19361  moves: 447
+   niter: 23  count: 0  breaks: 1  min_S: 703.12325  max_S: 715.77940  S: 703.12325  ΔS: -5.08802  moves: 454
+   niter: 24  count: 0  breaks: 1  min_S: 703.05106  max_S: 715.77940  S: 703.05106  ΔS: -0.0721911  moves: 433
+   niter: 25  count: 1  breaks: 1  min_S: 703.05106  max_S: 715.77940  S: 704.77370  ΔS: 1.72264  moves: 423
+   niter: 26  count: 0  breaks: 1  min_S: 701.61368  max_S: 715.77940  S: 701.61368  ΔS: -3.16003  moves: 441
+   niter: 27  count: 0  breaks: 1  min_S: 701.61368  max_S: 721.54373  S: 721.54373  ΔS: 19.9301  moves: 434
+   niter: 28  count: 1  breaks: 1  min_S: 701.61368  max_S: 721.54373  S: 703.33612  ΔS: -18.2076  moves: 439
+   niter: 29  count: 2  breaks: 1  min_S: 701.61368  max_S: 721.54373  S: 710.79425  ΔS: 7.45813  moves: 437
+   niter: 30  count: 3  breaks: 1  min_S: 701.61368  max_S: 721.54373  S: 706.35044  ΔS: -4.44381  moves: 429
+   niter: 31  count: 4  breaks: 1  min_S: 701.61368  max_S: 721.54373  S: 713.56014  ΔS: 7.20970  moves: 463
+   niter: 32  count: 5  breaks: 1  min_S: 701.61368  max_S: 721.54373  S: 720.16436  ΔS: 6.60422  moves: 445
+   niter: 33  count: 6  breaks: 1  min_S: 701.61368  max_S: 721.54373  S: 714.76845  ΔS: -5.39591  moves: 404
+   niter: 34  count: 7  breaks: 1  min_S: 701.61368  max_S: 721.54373  S: 703.21572  ΔS: -11.5527  moves: 410
+   niter: 35  count: 0  breaks: 1  min_S: 701.53898  max_S: 721.54373  S: 701.53898  ΔS: -1.67675  moves: 434
+   niter: 36  count: 1  breaks: 1  min_S: 701.53898  max_S: 721.54373  S: 708.14043  ΔS: 6.60146  moves: 433
+   niter: 37  count: 2  breaks: 1  min_S: 701.53898  max_S: 721.54373  S: 704.07209  ΔS: -4.06835  moves: 410
+   niter: 38  count: 3  breaks: 1  min_S: 701.53898  max_S: 721.54373  S: 704.76811  ΔS: 0.696023  moves: 413
+   niter: 39  count: 4  breaks: 1  min_S: 701.53898  max_S: 721.54373  S: 703.54823  ΔS: -1.21988  moves: 398
+   niter: 40  count: 5  breaks: 1  min_S: 701.53898  max_S: 721.54373  S: 713.59891  ΔS: 10.0507  moves: 388
+   niter: 41  count: 6  breaks: 1  min_S: 701.53898  max_S: 721.54373  S: 704.40168  ΔS: -9.19724  moves: 403
+   niter: 42  count: 7  breaks: 1  min_S: 701.53898  max_S: 721.54373  S: 707.57723  ΔS: 3.17556  moves: 400
+   niter: 43  count: 8  breaks: 1  min_S: 701.53898  max_S: 721.54373  S: 704.09679  ΔS: -3.48044  moves: 423
+   niter: 44  count: 9  breaks: 1  min_S: 701.53898  max_S: 721.54373  S: 704.64514  ΔS: 0.548354  moves: 419
+   niter: 45  count: 10  breaks: 2  min_S: 701.53898  max_S: 721.54373  S: 715.92329  ΔS: 11.2781  moves: 411

 Note that the value of ``wait`` above was made purposefully low so that
 the output would not be overly long. The most appropriate value requires
@@ -275,8 +275,8 @@ network as above.
 .. testoutput:: nested-model-averaging

-   Change in description length: 2.371018...
-   Number of accepted vertex moves: 56087
+   Change in description length: 20.223115...
+   Number of accepted vertex moves: 58320

 Similarly to the non-nested case, we can use
 :func:`~graph_tool.inference.mcmc.mcmc_equilibrate` to do most of the boring
```
```diff
@@ -1837,9 +1837,9 @@ class Graph(object):
         --------
         >>> g = gt.random_graph(6, lambda: 1, directed=False)
         >>> g.get_edges()
-        array([[2, 1, 2],
-               [3, 4, 0],
-               [5, 0, 1]], dtype=uint64)
+        array([[0, 3, 2],
+               [1, 4, 1],
+               [2, 5, 0]], dtype=uint64)
         """
         edges = libcore.get_edge_list(self.__graph)
         E = edges.shape[0] // 3
```
```diff
@@ -112,16 +112,10 @@ def local_clustering(g, prop=None, undirected=True):
     Examples
     --------
-    .. testcode::
-       :hide:
-
-       np.random.seed(42)
-       gt.seed_rng(42)
-
-    >>> g = gt.random_graph(1000, lambda: (5,5))
+    >>> g = gt.collection.data["karate"]
     >>> clust = gt.local_clustering(g)
     >>> print(gt.vertex_average(g, clust))
-    (0.008177777777777779, 0.00042080229075093...)
+    (0.5706384782..., 0.05869813676...)

     References
     ----------
@@ -174,15 +168,9 @@ def global_clustering(g):
     Examples
     --------
-    .. testcode::
-       :hide:
-
-       np.random.seed(42)
-       gt.seed_rng(42)
-
-    >>> g = gt.random_graph(1000, lambda: (5,5))
+    >>> g = gt.collection.data["karate"]
     >>> print(gt.global_clustering(g))
-    (0.008177777777777779, 0.0004212235142651...)
+    (0.2556818181..., 0.06314746595...)

     References
     ----------
@@ -252,22 +240,16 @@ def extended_clustering(g, props=None, max_depth=3, undirected=False):
     Examples
     --------
-    .. testcode::
-       :hide:
-
-       np.random.seed(42)
-       gt.seed_rng(42)
-
-    >>> g = gt.random_graph(1000, lambda: (5,5))
+    >>> g = gt.collection.data["karate"]
     >>> clusts = gt.extended_clustering(g, max_depth=5)
     >>> for i in range(0, 5):
     ...    print(gt.vertex_average(g, clusts[i]))
     ...
-    (0.0050483333333333335, 0.0004393940240073...)
-    (0.024593787878787878, 0.0009963004021144...)
-    (0.11238924242424242, 0.001909615401971...)
-    (0.40252272727272725, 0.003113987400030...)
-    (0.43629378787878786, 0.003144159256565...)
+    (0.5706384782076..., 0.05869813676256...)
+    (0.3260389360735..., 0.04810773205917...)
+    (0.0530678759917..., 0.01513061504691...)
+    (0.0061658977316..., 0.00310690511463...)
+    (0.0002162629757..., 0.00021305890271...)

     References
     ----------
```
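For reference, the local clustering coefficient averaged in these doctests is c_v = (number of edges among the neighbours of v) / (k_v(k_v−1)/2). A pure-Python sketch of that definition on a toy graph (illustrative only; graph-tool's `local_clustering` is implemented in C++):

```python
from itertools import combinations

def local_clustering(adj):
    """adj: dict mapping node -> set of neighbours (undirected graph)."""
    out = {}
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            out[v] = 0.0
            continue
        # Count edges among the neighbours of v
        links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
        out[v] = links / (k * (k - 1) / 2)
    return out

# A 4-cycle with one chord (0, 2)
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1, 3}, 3: {0, 2}}
print(local_clustering(adj))  # {0: 0.666..., 1: 1.0, 2: 0.666..., 3: 1.0}
```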
```diff
@@ -349,9 +331,9 @@ def motifs(g, k, p=1.0, motif_list=None, return_maps=False):
     >>> g = gt.random_graph(1000, lambda: (5,5))
     >>> motifs, counts = gt.motifs(gt.GraphView(g, directed=False), 4)
     >>> print(len(motifs))
-    18
+    11
     >>> print(counts)
-    [115557, 390005, 627, 700, 1681, 2815, 820, 12, 27, 44, 15, 7, 12, 4, 6, 1, 2, 1]
+    [116386, 392916, 443, 507, 2574, 1124, 741, 5, 5, 8, 2]

     References
     ----------
@@ -520,9 +502,9 @@ def motif_significance(g, k, n_shuffles=100, p=1.0, motif_list=None,
     >>> g = gt.random_graph(100, lambda: (3,3))
     >>> motifs, zscores = gt.motif_significance(g, 3)
     >>> print(len(motifs))
-    11
+    12
     >>> print(zscores)
-    [0.22728646681107012, 0.21409572051644973, 0.007022040788902111, 0.5872141967123348, -0.37770179603294357, -0.3484733504783734, 0.8861811801325502, -0.08, -0.2, -0.38, -0.2]
+    [2.59252643351441, 2.5966529814390387, 2.3459237708258587, -1.0829180621127024, -1.3368754665984663, -2.33027728409781, -3.055817397993647, -0.1, -0.15, -0.19, -0.4, -0.01]

     References
     ----------
```
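The z-scores in the `motif_significance` doctest measure how far each motif count in the observed graph deviates from its mean over shuffled (randomized) versions, in units of the null-model standard deviation. A sketch of that statistic, with hypothetical counts (not taken from the test above):

```python
import statistics

def motif_zscore(count_obs, counts_null):
    """z-score of an observed motif count against counts from shuffled graphs."""
    mu = statistics.mean(counts_null)
    sigma = statistics.pstdev(counts_null)
    return (count_obs - mu) / sigma if sigma > 0 else 0.0

# Hypothetical counts for one motif across 5 shuffled graphs
print(round(motif_zscore(30, [20, 22, 19, 21, 18]), 3))  # 7.071
```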
```diff
@@ -2067,7 +2067,7 @@ class BlockState(object):
         ...     ret = state.mcmc_sweep(niter=10)
         ...     pe = state.collect_edge_marginals(pe)
         >>> gt.bethe_entropy(g, pe)[0]
-        -0.901611...
+        12.204791...
         """
         if p is None:
@@ -2124,7 +2124,7 @@ class BlockState(object):
         ...     ret = state.mcmc_sweep(niter=10)
         ...     pv = state.collect_vertex_marginals(pv)
         >>> gt.mf_entropy(g, pv)
-        26.887021...
+        16.904653...
         >>> gt.graph_draw(g, pos=g.vp["pos"], vertex_shape="pie",
         ...               vertex_pie_fractions=pv, output="polbooks_blocks_soft_B4.pdf")
         <...>
@@ -2193,8 +2193,7 @@ class BlockState(object):
         ...     ret = state.mcmc_sweep(niter=10)
         ...     ph = state.collect_partition_histogram(ph)
         >>> gt.microstate_entropy(ph)
-        129.330077...
+        137.024741...
         """
         if h is None:
```
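The mean-field entropy tested here is the standard expression S = −Σ_v Σ_r π_v(r) ln π_v(r), where π_v(r) is the normalized marginal probability that vertex v belongs to group r. A self-contained sketch of that formula from per-vertex membership counts (illustrative; graph-tool computes it from the collected vertex marginals):

```python
import math

def mf_entropy(marginals):
    """marginals: list of per-vertex group-membership count vectors."""
    S = 0.0
    for counts in marginals:
        n = sum(counts)
        for c in counts:
            if c > 0:
                p = c / n           # normalized marginal π_v(r)
                S -= p * math.log(p)
    return S

# Two vertices: one certain of its group, one split 50/50 between two groups
print(mf_entropy([[10, 0], [5, 5]]))  # ln 2 ≈ 0.6931...
```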
```diff
@@ -374,7 +374,7 @@ def incidence(g, vindex=None, eindex=None):
     >>> print(m.todense())
     [[-1. -1.  0. ...  0.  0.  0.]
      [ 0.  0.  0. ...  0.  0.  0.]
-     [ 0.  0.  0. ...  0.  0.  0.]
+     [ 1.  0.  0. ...  0.  0.  0.]
      ...
      [ 0.  0. -1. ...  0.  0.  0.]
      [ 0.  0.  0. ...  0.  0.  0.]
```
```diff
@@ -113,8 +113,8 @@ def vertex_hist(g, deg, bins=[0, 1], float_count=True):
     >>> from numpy.random import poisson
     >>> g = gt.random_graph(1000, lambda: (poisson(5), poisson(5)))
     >>> print(gt.vertex_hist(g, "out"))
-    [array([  7.,  33.,  91., 145., 165., 164., 152., 115.,  62.,  29.,  28.,   6.,   1.,   1.,   0.,   1.]), array([ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11, 12, 13, 14, 15, 16],
+    [array([  5.,  32.,  85., 148., 152., 182., 160., 116.,  53.,  25.,  23.,  13.,   3.,   2.,   1.]), array([ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11, 12, 13, 14, 15],
           dtype=uint64)]
     """
@@ -178,7 +178,7 @@ def edge_hist(g, eprop, bins=[0, 1], float_count=True):
     >>> eprop = g.new_edge_property("double")
     >>> eprop.get_array()[:] = random(g.num_edges())
     >>> print(gt.edge_hist(g, eprop, linspace(0, 1, 11)))
-    [array([501., 441., 478., 480., 506., 494., 507., 535., 499., 559.]), array([0. , 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1. ])]
+    [array([485., 538., 502., 505., 474., 497., 544., 465., 492., 498.]), array([0. , 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1. ])]
     """
@@ -232,7 +232,7 @@ def vertex_average(g, deg):
     >>> from numpy.random import poisson
     >>> g = gt.random_graph(1000, lambda: (poisson(5), poisson(5)))
     >>> print(gt.vertex_average(g, "in"))
-    (4.96, 0.0679882342762334)
+    (4.986, 0.07323799560337517)
     """
@@ -294,7 +294,7 @@ def edge_average(g, eprop):
     >>> eprop = g.new_edge_property("double")
     >>> eprop.get_array()[:] = random(g.num_edges())
     >>> print(gt.edge_average(g, eprop))
-    (0.49888156584192045, 0.004096739923418754)
+    (0.5027850372071281, 0.004073940886690715)
     """
```
```diff
@@ -427,10 +427,10 @@ def distance_histogram(g, weight=None, bins=[0, 1], samples=None,
     >>> g = gt.random_graph(100, lambda: (3, 3))
     >>> hist = gt.distance_histogram(g)
     >>> print(hist)
-    [array([   0.,  300.,  880., 2269., 3974., 2358.,  119.]), array([0, 1, 2, 3, 4, 5, 6, 7], dtype=uint64)]
+    [array([   0.,  300.,  862., 2195., 3850., 2518.,  175.]), array([0, 1, 2, 3, 4, 5, 6, 7], dtype=uint64)]
     >>> hist = gt.distance_histogram(g, samples=10)
     >>> print(hist)
-    [array([  0.,  30.,  87., 223., 394., 239.,  17.]), array([0, 1, 2, 3, 4, 5, 6, 7], dtype=uint64)]
+    [array([  0.,  30.,  86., 213., 378., 262.,  21.]), array([0, 1, 2, 3, 4, 5, 6, 7], dtype=uint64)]
     """
     if samples is not None:
```
```diff
@@ -187,9 +187,9 @@ def similarity(g1, g2, eweight1=None, eweight2=None, label1=None, label2=None,
     >>> gt.similarity(u, g)
     1.0
     >>> gt.random_rewire(u)
-    22
+    17
     >>> gt.similarity(u, g)
-    0.04666666666666667
+    0.05
     """
@@ -890,11 +890,11 @@ def dominator_tree(g, root, dom_map=None):
     >>> root = [v for v in g.vertices() if v.in_degree() == 0]
     >>> dom = gt.dominator_tree(g, root[0])
     >>> print(dom.a)
-    [ 0  0  0  0  0  0  0 74  0  0  0 97  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0 97  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0 64 67  0  0 67  0  0 74  0  0  0  0 23  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  7  0  0]
+    [ 0  0  0  0  0  0 36  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0 66  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0 31  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0]

     References
     ----------
@@ -941,8 +941,8 @@ def topological_sort(g):
     >>> g.set_edge_filter(tree)
     >>> sort = gt.topological_sort(g)
     >>> print(sort)
-    [28 26 29 27 23 22 18 17 16 20 21 15 12 11 10 25 14  9  8  7  5  3  2 24  4  6  1  0 19 13]
+    [28 26 29 25 24 22 21 19 17 16 14 13 10 27  8  5  4 20 12 15  9  2 18  1  0  6 23 11  7  3]

     References
     ----------
```
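A topological sort, as tested in this hunk, orders the vertices of a DAG so that every edge points from an earlier to a later vertex. A minimal sketch using Kahn's algorithm (illustrative only, unrelated to graph-tool's internal implementation):

```python
from collections import deque

def topological_sort(n, edges):
    """Kahn's algorithm: order n vertices so every edge (u, v) has u before v."""
    adj = [[] for _ in range(n)]
    indeg = [0] * n
    for u, v in edges:
        adj[u].append(v)
        indeg[v] += 1
    q = deque(v for v in range(n) if indeg[v] == 0)
    order = []
    while q:
        u = q.popleft()
        order.append(u)
        for w in adj[u]:
            indeg[w] -= 1
            if indeg[w] == 0:
                q.append(w)
    return order if len(order) == n else None  # None if a cycle exists

print(topological_sort(4, [(0, 1), (0, 2), (1, 3), (2, 3)]))  # [0, 1, 2, 3]
```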
```diff
@@ -1050,18 +1050,18 @@ def label_components(g, vprop=None, directed=None, attractors=False):
     >>> g = gt.random_graph(100, lambda: (poisson(2), poisson(2)))
     >>> comp, hist, is_attractor = gt.label_components(g, attractors=True)
     >>> print(comp.a)
-    [ 9  9  9  9 10  1  9 11 12  9  9  9  9  9  9 13  9  9  9  0  9  9 16  9  9  3  9  9  4 17  9  9 18  9  9 19 20  9  9  9 14  5  9  9  6  9  9  9 21  9  9  9  9  9  9  9  9  9  9  9  9  9  9  2  9  8  9 22 15  9  9  9  9  9 23 25  9  9 26 27 28 29 30  9  9  9  9  9  9 31  9  9  9  9  9 32  9  9  7 24]
+    [ 1 17  4 18 17 19 17 17 17 17 17 15 17 17 20 17 17 17 14 17  0 21 17 17 22 23 16 24 17 17 17 17 25 10 17 17 17  2 27 17  6 13 17 17 17  5 12 17 17 17 26  9 17 17 17  7 17 28 29 17 17  3 30 17 17 17 17 17 17 17 17 17 17 17 17 17 31 17 17 17 17 17 17 17  8 32 17 11 17 17 17 17 33 17 17 17 17 17 17 17]
     >>> print(hist)
-    [ 1  1  1  1  1  1  1  1  1 68  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1]
+    [ 1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1 67  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1]
     >>> print(is_attractor)
-    [ True  True  True  True  True  True  True  True  True False  True False False False False False False False False False False False False False False False False False  True False  True False False]
+    [ True False  True  True False  True  True False  True False  True  True  True  True False False  True False False False False False False False  True False False False False False  True False False False]
     """
     if vprop is None:
```
```diff
@@ -1122,12 +1122,12 @@ def label_largest_component(g, directed=None):
     >>> g = gt.random_graph(100, lambda: poisson(1), directed=False)
     >>> l = gt.label_largest_component(g)
     >>> print(l.a)
-    [0 0 0 0 1 0 0 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 0 1 0 0 1 0 0 0 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 1 0 0 0 0 0 0 0 0 0 1 0 0]
+    [1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0 0 0 1 0 0 1 1 1 0 0]
     >>> u = gt.GraphView(g, vfilt=l)   # extract the largest component as a graph
     >>> print(u.num_vertices())
-    18
+    14
     """
     label = g.new_vertex_property("bool")
@@ -1173,9 +1173,9 @@ def label_out_component(g, root, label=None):
     >>> g = gt.random_graph(100, lambda: poisson(2.2), directed=False)
     >>> l = gt.label_out_component(g, g.vertex(2))
     >>> print(l.a)
-    [1 1 1 1 1 1 0 1 1 1 0 0 1 1 1 1 1 1 0 1 1 1 0 1 1 1 1 1 1 1 0 0 0 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 0 1 1 1 1 0 0 1 0 1 1 1 0 1 1 1 0 0 1 1 1 1 1 1 1 1 1 0 1 1 1 0 1 1 1 0 1 1 1 0]
+    [1 1 1 1 1 0 0 1 1 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 0 1 1 1 0 1 0 1 1 0 0 1 1 1 1 1 1 0 0 1 0 1 1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 1 1 1 0]

     The in-component can be obtained by reversing the graph.
```
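Component labelling of the kind tested here can be sketched in a few lines with a BFS (a toy stand-in for `gt.label_components` on undirected graphs; isolated vertices get their own label):

```python
from collections import deque

def label_components(n, edges):
    """Label connected components of an undirected graph with n vertices."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    label = [-1] * n
    c = 0
    for s in range(n):
        if label[s] != -1:
            continue
        # BFS flood-fill from each unvisited vertex
        label[s] = c
        q = deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if label[w] == -1:
                    label[w] = c
                    q.append(w)
        c += 1
    return label

print(label_components(6, [(0, 1), (1, 2), (3, 4)]))  # [0, 0, 0, 1, 1, 2]
```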
```diff
@@ -1260,19 +1260,19 @@ def label_biconnected_components(g, eprop=None, vprop=None):
     >>> g = gt.random_graph(100, lambda: poisson(2), directed=False)
     >>> comp, art, hist = gt.label_biconnected_components(g)
     >>> print(comp.a)
-    [26 26 26 26 26 26 26 26 19 25 26 26 23 26 26 26 26  6 26 24 18 26 26 13 26 26 26 26 26 26 26 26 26 26 26 16 29 26 26 26 26 26 26 15 26 26 26 26 26  0 26 26 12  2 26 26 26 26 26 26 26 26  9  3 26 28 26 26  8 26  4 26 26 26 14 26 26 26 26 30 11 26 26 26 20 26 26 27 26 33 26 22 17  7  5 32 21 26  1 10 31]
+    [45 45 45 45 45 21 45 27 14 45 45 45 45 45 17 45 45 45 45 45 45 45 19 20 45 45 44 45 42 37 45 45 45 45 45 45 38 24 28 45 45 45 45 29 45  6 34 45  2 45 45  9 45 36 33 30 45 45 45 11 23 45 47 45 45 45 45 25  1 48 12 39 18  7 31 40 45 45 13  5  4 45 16 45  8 45  0 45 26 22 10 46  3 50 41 49 43 32 45 35 15]
     >>> print(art.a)
-    [1 0 1 1 0 0 0 0 0 0 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 1 1 0 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 1 1 0 0 1 0 0 1 1 0 0 0 1 0 0 1 0 0 0]
+    [1 0 0 1 0 1 0 0 1 1 0 0 0 1 0 1 1 1 0 0 1 0 0 0 0 0 0 0 0 1 1 0 0 1 0 0 0 1 0 0 0 1 1 0 0 1 1 0 0 1 1 0 0 1 1 1 0 1 0 0 0 0 1 1 0 1 1 0 0 0 0 0 0 1 0 0 1 0 0 1 0 1 1 0 0 1 1 0 0 1 0 0 0 0 0 0 1 1 0 0]
     >>> print(hist)
-    [ 1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1 68  1  1  1  1  1  1  1]
+     1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1 51  1  1  1  1  1]
     """
     if vprop is None:
@@ -1652,49 +1652,49 @@ def shortest_distance(g, source=None, target=None, weights=None,
     >>> g = gt.random_graph(100, lambda: (poisson(3), poisson(3)))
     >>> dist = gt.shortest_distance(g, source=g.vertex(0))
```