motifs at large scale (about 1 million nodes and 10 million edges)
I have a problem running motif counting at large scale because it takes a lot of memory and crashes:
- motif_size = 3 (it takes about 10 GB of RAM and then hangs)
- motif_size = 4 (it crashes at about 30 GB). I use the function result = gt.clustering.motifs(ggt, motif_list=motifs_veriations, k=4, return_maps=True)
Do you have any ideas that could help?