Hello,
I am an ontology developer based in Dallas, TX and have been using the Owlready2 library for ontology reasoning.
I am trying to reason over large ontologies (10M+ triples, 1M+ individuals), mainly using SWRL rules, with the goal of materializing inferable object properties between already existing individuals.
I am able to perform this task successfully over small ontologies (<=15K triples, 1.5K individuals) by bumping the maximum Java heap up to 4 GB; however, anything much larger than that causes a Java OutOfMemoryError, and I'd rather not have to keep fighting a memory cap.
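For context, here is a minimal sketch of what I'm currently running (the file name and output path are placeholders, not my real ontology):

```python
# Sketch of my current Owlready2 workflow: raise the reasoner's Java
# heap, then run Pellet (which handles SWRL rules) to materialize
# inferred object-property assertions.
import owlready2
from owlready2 import get_ontology, sync_reasoner_pellet

# Default heap is 2000 MB; 4000 MB is enough for my ~15K-triple tests,
# but fails with OutOfMemoryError on the larger ontologies.
owlready2.reasoning.JAVA_MEMORY = 4000

# Placeholder path — my real ontology is much larger.
onto = get_ontology("file://my_ontology.owl").load()

with onto:
    # infer_property_values=True asks Pellet to add inferred
    # object-property assertions between existing individuals.
    sync_reasoner_pellet(infer_property_values=True)

onto.save(file="my_ontology_inferred.owl")
```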
Does anyone know of a better way to use Owlready2 (or a different workflow around it) to conduct reasoning at this scale?
Any help is greatly appreciated!