
Bulk loading JSON object as document into elasticsearch

Is there a way to bulk load the data below into elasticsearch without modifying the original content? I POST each object as a single document. At the moment I'm using Python to parse out the individual objects and POST them one at a time.

{
   {"name": "A"},
   {"name": "B"},
   {"name": "C"},
   {"name": "D"},
}

Doing this type of processing in production, from REST servers into elasticsearch, is taking a lot of time.

Is there a single POST/curl command that can upload the file above all at once, with elasticsearch parsing it and turning each object into its own document?

We're using elasticsearch 1.3.2.

Yes, you can use the bulk API via curl through the _bulk endpoint, but it does not do custom parsing: the request body must already be in the bulk format. If that is an option, whatever process creates the file can format it to the ES specification. See here:

http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/docs-bulk.html
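A minimal sketch of what the bulk format looks like for the data in the question (the index name `myindex` and type `mytype` are placeholders, not from the original post): each document is preceded by an action line, and every line, including the last, must end with a newline.

```shell
# Build a file in ES bulk format: one action line, then one source line, per document.
cat > bulk.json <<'EOF'
{"index": {"_index": "myindex", "_type": "mytype"}}
{"name": "A"}
{"index": {"_index": "myindex", "_type": "mytype"}}
{"name": "B"}
{"index": {"_index": "myindex", "_type": "mytype"}}
{"name": "C"}
{"index": {"_index": "myindex", "_type": "mytype"}}
{"name": "D"}
EOF

# With a running cluster, a single request then indexes all four documents:
# curl -s -XPOST 'http://localhost:9200/_bulk' --data-binary @bulk.json
```

Note the `--data-binary` flag: plain `-d` would strip the newlines that the bulk endpoint relies on to separate lines.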

There is also bulk support in Python via a helper. See here: http://elasticsearch-py.readthedocs.org/en/master/helpers.html
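A sketch of how that helper could replace the one-POST-per-document loop: build one action dict per object and hand the whole list to `helpers.bulk()` in a single call (the index and type names here are hypothetical, and the connection lines are commented out since they need a running cluster).

```python
# Turn a list of plain objects into the action dicts that
# elasticsearch-py's helpers.bulk() expects.
docs = [{"name": "A"}, {"name": "B"}, {"name": "C"}, {"name": "D"}]

actions = [
    {"_index": "myindex", "_type": "mytype", "_source": doc}
    for doc in docs
]

# With a running cluster, one call sends everything as a single bulk request:
# from elasticsearch import Elasticsearch, helpers
# es = Elasticsearch(["http://localhost:9200"])
# helpers.bulk(es, actions)

print(len(actions))  # number of documents queued for the bulk request
```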
