
Reasoning an ontology using OWL API

I have used OWL API 4.1.3 to load my ontology, which is not big. Since I need to use inferred information, I also ran a reasoner over it using the HermiT 1.3.8.413 library. The following code shows how I did it.

import java.io.File;
import java.util.ArrayList;
import java.util.List;

import org.semanticweb.HermiT.Configuration;
import org.semanticweb.HermiT.ReasonerFactory;
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.IRI;
import org.semanticweb.owlapi.model.OWLAxiom;
import org.semanticweb.owlapi.model.OWLDataFactory;
import org.semanticweb.owlapi.model.OWLOntology;
import org.semanticweb.owlapi.model.OWLOntologyCreationException;
import org.semanticweb.owlapi.model.OWLOntologyManager;
import org.semanticweb.owlapi.model.OWLOntologyStorageException;
import org.semanticweb.owlapi.reasoner.ConsoleProgressMonitor;
import org.semanticweb.owlapi.reasoner.OWLReasoner;
import org.semanticweb.owlapi.util.InferredAxiomGenerator;
import org.semanticweb.owlapi.util.InferredClassAssertionAxiomGenerator;
import org.semanticweb.owlapi.util.InferredOntologyGenerator;
import org.semanticweb.owlapi.util.InferredSubDataPropertyAxiomGenerator;
import org.semanticweb.owlapi.util.InferredSubObjectPropertyAxiomGenerator;

public class ReasonRDF {

    public static void main(String[] args) throws OWLOntologyCreationException, OWLOntologyStorageException {
        readRDF("C:/Users/workspace/Ontology_matching/NVDB_Matching_v18_H_4_1_CONVERTYING/results/NewInstantiated/owl/OSM1.owl");
    }

    public static void readRDF(String address) throws OWLOntologyCreationException, OWLOntologyStorageException {
        OWLOntologyManager manager = OWLManager.createOWLOntologyManager();
        File file = new File(address);
        OWLOntology ont = manager.loadOntologyFromOntologyDocument(IRI.create(file));
        System.out.println("Ontology Loaded...");

        System.out.println("Logical IRI   : " + ont.getOntologyID());
        System.out.println("Format        : " + manager.getOntologyFormat(ont));
        System.out.println("Runtime memory: " + Runtime.getRuntime().totalMemory());

        // Configure HermiT with a console progress monitor
        ReasonerFactory reasonerFactory = new ReasonerFactory();
        ConsoleProgressMonitor progressMonitor = new ConsoleProgressMonitor();
        Configuration config = new Configuration();
        config.ignoreUnsupportedDatatypes = true;
        config.reasonerProgressMonitor = progressMonitor;
        OWLReasoner reasoner = reasonerFactory.createReasoner(ont, config);

        long t0 = System.nanoTime();

        System.out.println("Starting to add axiom generators");
        OWLDataFactory datafactory = manager.getOWLDataFactory();
        List<InferredAxiomGenerator<? extends OWLAxiom>> inferredAxioms = new ArrayList<InferredAxiomGenerator<? extends OWLAxiom>>();
        //inferredAxioms.add(new InferredSubClassAxiomGenerator());
        inferredAxioms.add(new InferredClassAssertionAxiomGenerator());
        //inferredAxioms.add(new InferredDataPropertyCharacteristicAxiomGenerator());
        //inferredAxioms.add(new InferredObjectPropertyCharacteristicAxiomGenerator());
        //inferredAxioms.add(new InferredEquivalentClassAxiomGenerator());
        //inferredAxioms.add(new InferredPropertyAssertionGenerator());
        //inferredAxioms.add(new InferredInverseObjectPropertiesAxiomGenerator());
        inferredAxioms.add(new InferredSubDataPropertyAxiomGenerator());
        inferredAxioms.add(new InferredSubObjectPropertyAxiomGenerator());
        System.out.println("Finished adding axiom generators");

        //List<InferredIndividualAxiomGenerator<? extends OWLIndividualAxiom>> individualAxioms = new ArrayList<InferredIndividualAxiomGenerator<? extends OWLIndividualAxiom>>();
        //inferredAxioms.addAll(individualAxioms);

        // New ontology that will hold the inferred axioms
        OWLOntology infOnt = manager.createOntology(IRI.create(ont.getOntologyID().getOntologyIRI().get() + "_inferred"));

        // Use the generators and the reasoner to infer axioms;
        // the actual reasoning work happens inside fillOntology()
        System.out.println("Starting to infer");
        InferredOntologyGenerator iog = new InferredOntologyGenerator(reasoner, inferredAxioms);
        //InferredOntologyGenerator iog = new InferredOntologyGenerator(reasoner);

        System.out.println("Storing the results");
        iog.fillOntology(datafactory, infOnt);
        System.out.println("Results are stored");
        long elapsedTime = System.nanoTime() - t0;
        System.out.println("Elapsed time (ns): " + elapsedTime);

        // Save the inferred ontology
        manager.saveOntology(infOnt, IRI.create("file:///C:/Users/ontologies/NVDB4_test.rdf"));
    }
}

It does not throw any error, but it takes forever to store the inferred ontology in a new file. In fact, it does not complete the job even after two days. My IDE is Eclipse EE, and I have given the application 6 to 12 GB of memory to run. I can't find any problem with my code or my ontology.

Could someone suggest an optimization, a better way of implementing this, or perhaps another API?

Here is my ontology, in case someone wants to test it.

The size of an ontology is only loosely related to the complexity of reasoning on it; some small ontologies are much harder for reasoners than other, very large ones. (Of course, there is also the possibility of a bug.)

Is it possible for you to share the ontology contents?

Edit: Having tried the ontology, it looks like size does not matter that much; the ontology is proving quite hard to reason with.

I have tried disabling the SWRL rules and skipping the class assertion generation, and still hit a roadblock. The number and topology of the object properties is enough to stress HermiT hard.

I have tried version 1.3.8.500, in case any issues in the OWL API had been fixed in newer versions; the only significant result I got is that the code is not memory bound. 3 GB of RAM assigned to the JVM seems to be more than enough.

Disjointness-related reasoning seems to be taking a large amount of time; this is not unexpected. Consider whether you can remove the disjointness axioms from your ontology and still meet your requirements.
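If you want to try that, the disjointness-related axioms can be stripped programmatically before the reasoner is created. A minimal sketch, assuming OWL API 4.x (the helper name and the exact set of axiom types to drop are my choice; adjust to your needs):

```java
import java.util.HashSet;
import java.util.Set;

import org.semanticweb.owlapi.model.AxiomType;
import org.semanticweb.owlapi.model.OWLAxiom;
import org.semanticweb.owlapi.model.OWLOntology;
import org.semanticweb.owlapi.model.OWLOntologyManager;

public class DisjointnessStripper {

    // Removes disjointness-related axioms from the ontology in place
    // and returns the number of axioms that were removed.
    public static int stripDisjointness(OWLOntologyManager manager, OWLOntology ont) {
        Set<OWLAxiom> toRemove = new HashSet<OWLAxiom>();
        toRemove.addAll(ont.getAxioms(AxiomType.DISJOINT_CLASSES));
        toRemove.addAll(ont.getAxioms(AxiomType.DISJOINT_UNION));
        toRemove.addAll(ont.getAxioms(AxiomType.DISJOINT_OBJECT_PROPERTIES));
        toRemove.addAll(ont.getAxioms(AxiomType.DISJOINT_DATA_PROPERTIES));
        toRemove.addAll(ont.getAxioms(AxiomType.DIFFERENT_INDIVIDUALS));
        manager.removeAxioms(ont, toRemove);
        return toRemove.size();
    }
}
```

Calling this on `ont` right after loading (and before `createReasoner`) would let you measure how much of the runtime is attributable to disjointness checking; note that any inferences that depended on those axioms will of course be lost.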

Also consider whether it is meaningful to separate the individuals by partitioning the ABox: if there are individuals that you are sure are not related, it might be good to split the assertions across multiple ontologies. Large numbers of unrelated individuals might cause the reasoner to attempt reasoning paths that will never provide useful inferences.
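The grouping step of that partitioning idea can be sketched independently of the OWL API: treat individuals as nodes and property assertions between them as edges, then compute connected components; each component's assertions can then go into its own ontology. A minimal, library-free sketch (names are mine; the caller is expected to add both directions of each assertion so the graph is undirected):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class AboxPartitioner {

    // Groups individuals into connected components of the assertion graph.
    // 'edges' maps each individual to the individuals it is directly related to
    // (symmetric: if a -> b is present, b -> a should be present too).
    public static List<Set<String>> components(Map<String, Set<String>> edges) {
        Set<String> seen = new HashSet<String>();
        List<Set<String>> result = new ArrayList<Set<String>>();
        for (String start : edges.keySet()) {
            if (seen.contains(start)) {
                continue;
            }
            // Depth-first traversal collecting one component
            Set<String> component = new HashSet<String>();
            Deque<String> stack = new ArrayDeque<String>();
            stack.push(start);
            while (!stack.isEmpty()) {
                String node = stack.pop();
                if (!seen.add(node)) {
                    continue;
                }
                component.add(node);
                for (String next : edges.getOrDefault(node, new HashSet<String>())) {
                    stack.push(next);
                }
            }
            result.add(component);
        }
        return result;
    }
}
```

With the OWL API you would build `edges` from the object property assertion axioms, then copy each component's class and property assertions (plus the shared TBox) into a separate ontology and reason over each one independently.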
