Using lambda function in combine_vertices dictionary of cluster_graph causes memory leak #390

@Caeph

Description
Hello,
When using a lambda function in the combine_vertices attribute of the cluster_graph method (see the code below), I discovered a memory leak: after deleting the condensed graph, a section of memory remains unreachable. If no lambda function is used, the leak does not occur.

```python
strongly_connected = graph.clusters(mode="strong")
condensed = strongly_connected.cluster_graph(
    combine_vertices=dict(
        name=lambda x: x,
        min_count="min",
        orig_id_graph=lambda x: x,
    )
)
```

I needed to compute a graph condensation of a large graph (~10^6 vertices) many times (10,000+ runs), and doing so caused a memory overflow on a machine with 32 GB of RAM.
From the documentation, I was under the impression that this functionality is supported. I am able to work around the problem by using an external dictionary; however, I thought you should know about the leak.
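The external-dictionary workaround can be sketched roughly as follows. This is a minimal sketch, not the exact code from the report: `combine_by_cluster` and the sample `membership`/`names` data are hypothetical, and the grouping is done in plain Python after the fact instead of passing lambdas through `combine_vertices`:

```python
from collections import defaultdict

def combine_by_cluster(membership, values):
    """Group per-vertex attribute values by cluster id, mimicking what
    combine_vertices=dict(name=lambda x: x) would produce, but without
    handing a lambda to cluster_graph."""
    grouped = defaultdict(list)
    for cluster_id, value in zip(membership, values):
        grouped[cluster_id].append(value)
    # One list of original values per condensed vertex, ordered by cluster id
    return [grouped[c] for c in sorted(grouped)]

# Example: 5 vertices collapsed into 3 strongly connected components
membership = [0, 0, 1, 2, 2]
names = ["a", "b", "c", "d", "e"]
print(combine_by_cluster(membership, names))  # → [['a', 'b'], ['c'], ['d', 'e']]
```

After calling `cluster_graph()` with only string combiners (or none), the resulting lists can be assigned back onto the condensed graph's vertices, e.g. `condensed.vs["name"] = combine_by_cluster(strongly_connected.membership, graph.vs["name"])`.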

Thank you for an otherwise great package.

Version information:
I installed python-igraph 0.9.1 via pip and I am using Python 3.8.
