Possible issues and solutions for loading a specific OWL file

Possible issues and solutions for loading a specific OWL file

wanghewen
Hi,

I'm a researcher working on information extraction and tuple matching using NLP. Currently I'm running into problems when loading the RDF ontology file below:

onto = get_ontology("https://www.omg.org/spec/EDMC-FIBO/BE/About-BE.rdf").load()

Problem: The ontologies listed in owl:imports are not imported.
Reason: The trailing slash / is removed, so the IRI with a trailing slash and the IRI without it are treated as two different entities, i.e. https://www.omg.org/spec/EDMC-FIBO/BE/About-BE and https://www.omg.org/spec/EDMC-FIBO/BE/About-BE/ become two different entities (see the short sketch after the solution below).
Solution: I've changed line 494 in namespace.py (for the development version) from
if new_base_iri.endswith("#") or new_base_iri.endswith("/"):
to
if new_base_iri.endswith("#") :
and it works fine, though I'm not sure whether it will cause other problems.
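
For illustration, here is a minimal sketch of the mismatch (the rstrip call is only a stand-in for what the old check effectively does to the base IRI, not the actual Owlready2 code):

# If the trailing slash is stripped when the ontology is stored, the IRI
# recorded for it no longer matches the IRI referenced in owl:imports,
# so the import cannot be resolved.
imported_iri = "https://www.omg.org/spec/EDMC-FIBO/BE/About-BE/"
stored_iri = imported_iri.rstrip("/")  # stand-in for the old normalization
print(imported_iri == stored_iri)      # False -- treated as two different entities

As a symptom check, onto.imported_ontologies should come back (almost) empty after loading when the imports are not resolved.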

Another problem:
It cannot handle an empty ontology file (containing only a \n): http://www.omg.org/techprocess/ab/SpecificationMetadata/
Solution: I've changed line 165 in driver.py (for the development version) from
if not line.startswith("#"):
to
if not line.startswith("#") and not line.startswith("\n"):
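For context, the idea behind that line (as far as I understand it) is that driver.py peeks at the first meaningful line of the file to guess its format. A minimal sketch of that idea, with a helper name that is mine and not the actual function in driver.py:

def first_meaningful_line(lines):
    # Skip comment lines and, with the proposed change, blank lines as well.
    for line in lines:
        if not line.startswith("#") and not line.startswith("\n"):
            return line
    return None

# A file containing only "\n" no longer yields a bogus "\n" as its first line.
print(first_meaningful_line(["\n"]))  # prints None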

In fact, I'd like to open a pull request, but I find the unit tests provided in the test folder hard to pass, and I cannot find any development documentation to check that my modifications are correct. I'd definitely like to make some contributions if possible.

Let me know if these modifications are correct and won't cause problems.

Thank you.

Best regards,
Hewen

Re: Possible issues and solutions for loading a specific OWL file

Jiba
Administrator
Hi,

Thank you for this feedback.

If I remember correctly, for the first problem, the solution you propose removes a fix I added a few days ago for another ontology... I need more time to analyze this problem.

Your second solution is OK; I'm adding it to the development version.

What kind of problems did you encounter with the unit tests? The main test file is regtest.py in the test/ directory.

Best regards,
Jiba

Re: Possible issues and solutions for loading a specific OWL file

wanghewen
Hi,

I thought it was test_parser.py... Now it works fine. It would be great if you could add the OWL file you mentioned to the unit tests so we can check for possible issues.

Moreover, I did some refactoring of the source code folder structure to allow python setup.py develop and code coverage testing. May I know whether it would be possible for you to grant fork access and allow me to submit a pull request?

Thank you.

Best regards,
Hewen

Re: Possible issues and solutions for loading a specific OWL file

Jiba
Administrator
Hi,

I looked deeper into your first problem, and you were actually right. I was thinking of a previous fix, but it was not in this place.

I've integrated your fix into the development version and added a test case, Test.test_ontology_18, with the file test_ontoslash.owl.
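In spirit, it checks that an ontology involving IRIs with trailing slashes loads correctly and that its imports are resolved; a simplified sketch, not the exact code from regtest.py, and the file path is a placeholder:

import unittest
from owlready2 import get_ontology

class TrailingSlashTest(unittest.TestCase):
    def test_ontoslash_imports(self):
        # Load a small local ontology whose IRIs end with a trailing "/".
        onto = get_ontology("file:///path/to/test_ontoslash.owl").load()
        # With the fix, the imported ontologies are resolved instead of skipped.
        self.assertTrue(len(onto.imported_ontologies) > 0)

if __name__ == "__main__":
    unittest.main()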


What did you change in the folder structure? I'm not very enthusiastic about moving code and changing the structure, because I'm used to it and I have dozens of other projects that use a similar general structure.

Best regards,
Jean-Baptiste Lamy
MCF, LIMICS, Université Paris 13

Re: Possible issues and solutions for loading a specific OWL file

wanghewen
Hi,

Basically, what I have done is move the source code files into a separate folder so that setup.py works well in development mode. In principle it will not affect your existing imports. I have cherry-picked my changes and uploaded the modified repo to my Bitbucket; you can accept my invitation and take a look at the following link. It's under the branch hewen.

https://bitbucket.org/wanghewen/owlready2/src/hewen/
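
Roughly, the restructuring amounts to moving the modules into an owlready2/ subfolder and declaring the package in setup.py in the standard way. A simplified sketch of the relevant part (the folder name and version are placeholders, not the actual diff):

from setuptools import setup

setup(
    name="owlready2",
    version="0.0.dev0",  # placeholder; the real version stays as in the repo
    # With the sources in an owlready2/ subfolder, the package is declared the
    # standard way, which lets "python setup.py develop" and coverage tools
    # resolve imports against the working tree.
    packages=["owlready2"],
)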

Thank you.

Best regards,
Hewen