
Jena Fuseki API: add new data to an existing dataset [java]

I was trying to upload an RDF/OWL file to my SPARQL endpoint (provided by Fuseki). Right now I'm able to upload a single file, but if I repeat the action, the new dataset overrides the old one. I'm looking for a way to merge the data already in the dataset with the contents of the newly uploaded RDF file. Can anyone help me? Thanks.

Here is the code used to upload to / query the endpoint (I am not the author):

// Written in 2015 by Thilo Planz 
// To the extent possible under law, I have dedicated all copyright and related and neighboring rights 
// to this software to the public domain worldwide. This software is distributed without any warranty. 
// http://creativecommons.org/publicdomain/zero/1.0/

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.ByteArrayOutputStream;

import org.apache.jena.query.DatasetAccessor;
import org.apache.jena.query.DatasetAccessorFactory;
import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QuerySolution;
import org.apache.jena.query.ResultSet;
import org.apache.jena.query.ResultSetFormatter;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.RDFNode;


class FusekiExample {

    public static void uploadRDF(File rdf, String serviceURI)
            throws IOException {

        // parse the file
        Model m = ModelFactory.createDefaultModel();
        try (FileInputStream in = new FileInputStream(rdf)) {
            m.read(in, null, "RDF/XML");
        }

        // upload the resulting model
        DatasetAccessor accessor = DatasetAccessorFactory.createHTTP(serviceURI);
        accessor.putModel(m);

    }

    public static void execSelectAndPrint(String serviceURI, String query) {
        QueryExecution q = QueryExecutionFactory.sparqlService(serviceURI,
                query);
        ResultSet results = q.execSelect();

        // write to a ByteArrayOutputStream
        ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
        //convert to JSON format
        ResultSetFormatter.outputAsJSON(outputStream, results);
        //turn json to string
        String json = new String(outputStream.toByteArray());
        //print json string
        System.out.println(json);

    }

    public static void execSelectAndProcess(String serviceURI, String query) {
        QueryExecution q = QueryExecutionFactory.sparqlService(serviceURI,
                query);
        ResultSet results = q.execSelect();

        while (results.hasNext()) {
            QuerySolution soln = results.nextSolution();
            // assumes that you have an "?x" in your query
            RDFNode x = soln.get("x");
            System.out.println(x);
        }
    }

    public static void main(String argv[]) throws IOException {
        // uploadRDF(new File("test.rdf"), );

        uploadRDF(new File("test.rdf"), "http://localhost:3030/MyEndpoint/data");


    }
}

Use accessor.add(m) instead of putModel(m). As you can see in the Javadoc, putModel replaces the existing data, whereas add merges the new statements into the graph.
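Applied to the question's code, the fix is a one-line change in the upload method. A minimal sketch (the method and class names here are illustrative, and it assumes the same Jena version and a running Fuseki instance as in the original code):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

import org.apache.jena.query.DatasetAccessor;
import org.apache.jena.query.DatasetAccessorFactory;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;

class FusekiMergeExample {

    // Same as uploadRDF above, except that the parsed model is merged
    // into the default graph instead of replacing it.
    public static void mergeRDF(File rdf, String serviceURI) throws IOException {
        // parse the file into an in-memory model
        Model m = ModelFactory.createDefaultModel();
        try (FileInputStream in = new FileInputStream(rdf)) {
            m.read(in, null, "RDF/XML");
        }

        DatasetAccessor accessor = DatasetAccessorFactory.createHTTP(serviceURI);
        // add() appends the model's statements to the existing graph
        // (HTTP POST); putModel() would overwrite it (HTTP PUT).
        accessor.add(m);
    }

    public static void main(String[] argv) throws IOException {
        // Each call now accumulates data instead of replacing it.
        mergeRDF(new File("test.rdf"), "http://localhost:3030/MyEndpoint/data");
    }
}
```

Note that add merges at the statement level: triples already present in the graph are not duplicated, since an RDF graph is a set of triples.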

Disclaimer: the technical posts on this site are licensed under CC BY-SA 4.0. If you republish them, please credit this site or the original source. For any questions, contact: yoyou2525@163.com.
