I am an ontology developer based in Dallas, TX and have been using the Owlready2 library for ontology reasoning.
I am trying to reason over large ontologies (10M+ triples, 1M+ individuals) mainly using SWRL rules with the goal of populating the graph with inferable object properties between already existing individuals.
I am able to perform this task successfully over small ontologies (<=15K triples, 1.5K individuals) by bumping up the maximum Java heap to 4GB; however, any larger number of triples causes a Java OutOfMemoryError, and I'd rather not have to worry about a memory cap at all.
Does anyone know of a better way to use Owlready2 to conduct such large-scale reasoning?
Ontology reasoning is a very complex task, and the complexity increases exponentially with the number of entities. I fear there is currently no tool able to perform reasoning over huge ontologies, at least not in a satisfying time.
The common solution is to use SPARQL queries, which are much faster but less expressive and more limited, especially for classifying classes. Since your ontology seems to have a huge number of individuals (and not classes), using SPARQL queries should be doable.