
SPARQL Query With Inferencing

I have some RDF & RDFS files and I want to use the Jena SPARQL implementation to query them. My code looks like:

 // model of my RDF file
 Model model = ModelFactory.createMemModelMaker().createDefaultModel();
 model.read(inputStream1, null);
 // model of my ontology (WordNet) file
 Model onto = ModelFactory.createOntologyModel( OntModelSpec.RDFS_MEM_RDFS_INF );
 onto.read( inputStream2, null );
    String queryString =
                        "PREFIX rdf:<http://www.w3.org/1999/02/22-rdf-syntax-ns#> "
                        + "PREFIX wn:<http://www.webkb.org/theKB_terms.rdf/wn#> "
                        + "SELECT ?person "
                        + "WHERE {"
                        + "  ?person    rdf:type   wn:Person  . "
                        + "      }";

    Query query = QueryFactory.create(queryString);
    QueryExecution qe = QueryExecutionFactory.create(query, ????);
    ResultSet results = qe.execSelect();
    ResultSetFormatter.out(System.out, results, query);
    qe.close();

And I have a WordNet ontology in an RDF file, and I want to use this ontology in my query to do inferencing automatically (when I query for Person, the query should also return e.g. Man, Woman). So how can I link the ontology to my query? Please help me.

Update: now I have two models; which one should I run my query against?

 QueryExecution qe = QueryExecutionFactory.create(query, ????);

Thanks in advance.

The key is to recognise that, in Jena, Model is one of the central abstractions. An inferencing model is just a Model, in which some of the triples are present because they are entailed by inference rules rather than read in from the source document. Thus you only need to change the first line of your example, where you create the model initially.

While you can create inference models directly, it's often easiest just to create an OntModel with the required degree of inference support:

Model model = ModelFactory.createOntologyModel( OntModelSpec.RDFS_MEM_RDFS_INF );

If you want a different reasoner, or OWL support, you can select a different OntModelSpec constant. Be aware that large and/or complex models can make for slow queries.
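For instance, a few of the built-in specifications can be compared side by side. This is a minimal sketch, assuming a current Apache Jena release on the classpath (the class name SpecDemo is illustrative; older Jena releases used the com.hp.hpl.jena package prefix instead of org.apache.jena):

```java
import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.ModelFactory;

public class SpecDemo {
    public static void main(String[] args) {
        // RDFS rule reasoner over in-memory storage (as used above)
        OntModel rdfsInf = ModelFactory.createOntologyModel(OntModelSpec.RDFS_MEM_RDFS_INF);

        // OWL "micro" rule reasoner: RDFS entailments plus a cheap subset of OWL
        OntModel owlMicro = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM_MICRO_RULE_INF);

        // No reasoner at all: queries see only the asserted triples
        OntModel noInf = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM);

        System.out.println("models created: " + (rdfsInf != null)
                + " " + (owlMicro != null) + " " + (noInf != null));
    }
}
```

When query speed matters, the micro or mini OWL rule reasoners are usually a better starting point than the full OWL reasoner.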

Update (following edit of original question)

To reason over two models, you want the union. You can do this through OntModel's sub-model facility. I would change your example as follows (note: I haven't tested this code, but it should work):

String rdfFile = "... your RDF file location ...";
Model source = FileManager.get().loadModel( rdfFile );

String ontFile = "... your ontology file location ...";
Model ont = FileManager.get().loadModel( ontFile );

OntModel m = ModelFactory.createOntologyModel( OntModelSpec.RDFS_MEM_RDFS_INF, ont );
m.addSubModel( source );

String queryString =
                    "PREFIX rdf:<http://www.w3.org/1999/02/22-rdf-syntax-ns#> "
                    + "PREFIX wn:<http://www.webkb.org/theKB_terms.rdf/wn#> "
                    + "SELECT ?person "
                    + "WHERE {"
                    + "  ?person    rdf:type   wn:Person  . "
                    + "      }";

Query query = QueryFactory.create(queryString);
QueryExecution qe = QueryExecutionFactory.create(query, m);
ResultSet results = qe.execSelect();
ResultSetFormatter.out(System.out, results, query);
qe.close();
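To see the inference at work end to end, here is a self-contained sketch that swaps the WordNet ontology for a toy two-triple schema (the ex: namespace, the class names Man/Person, and the class name InferenceDemo are all illustrative; current Apache Jena package names are assumed). The instance data only asserts that ex:bob is a Man, yet the query for Person finds him, because the RDFS reasoner applies the rdfs:subClassOf entailment:

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QueryFactory;
import org.apache.jena.query.ResultSet;
import org.apache.jena.rdf.model.ModelFactory;

public class InferenceDemo {
    // Tiny stand-in ontology: ex:Man is a subclass of ex:Person
    static final String ONT =
        "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .\n"
        + "@prefix ex: <http://example.org/> .\n"
        + "ex:Man rdfs:subClassOf ex:Person .\n";

    // Instance data: ex:bob is only asserted to be a Man
    static final String DATA =
        "@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .\n"
        + "@prefix ex: <http://example.org/> .\n"
        + "ex:bob rdf:type ex:Man .\n";

    static List<String> queryPersons() {
        OntModel m = ModelFactory.createOntologyModel(OntModelSpec.RDFS_MEM_RDFS_INF);
        m.read(new StringReader(ONT), null, "TTL");
        m.read(new StringReader(DATA), null, "TTL");

        String q = "PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> "
            + "PREFIX ex: <http://example.org/> "
            + "SELECT ?p WHERE { ?p rdf:type ex:Person }";
        List<String> uris = new ArrayList<>();
        try (QueryExecution qe = QueryExecutionFactory.create(QueryFactory.create(q), m)) {
            ResultSet rs = qe.execSelect();
            while (rs.hasNext()) {
                // ?p binds to ex:bob only via the rdfs:subClassOf entailment
                uris.add(rs.next().getResource("p").getURI());
            }
        }
        return uris;
    }

    public static void main(String[] args) {
        System.out.println(queryPersons());
    }
}
```

The same pattern applies with the real WordNet ontology: load it as the base model (or a sub-model) of the OntModel, and every SPARQL query run against that model sees the entailed triples as if they had been asserted.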
