elasticsearch bulk insert JSON file
I have the following JSON file.
I have used awk to get rid of empty spaces, trailing whitespace, and newlines:
awk -v ORS= -v OFS= '{$1=$1}1' data.json
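For example, with a hypothetical pretty-printed input (not the asker's actual file, which isn't shown), the awk one-liner collapses everything onto a single line:

```shell
# Hypothetical pretty-printed input
printf '{\n  "name": "test",\n  "qty": 1\n}\n' > data.json

# ORS= drops the newline after each record, OFS= joins fields with
# nothing; $1=$1 forces awk to rebuild each record using the empty OFS.
awk -v ORS= -v OFS= '{$1=$1}1' data.json
# → {"name":"test","qty":1}
```

Note that this removes every newline in the file, including any `\n` that the bulk API relies on to separate the action line from the document line.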
I have added a create request at the top of my data.json, followed by `\n` and the rest of my data:
{"create": {"_index":"socteam", "_type":"products"}}
When I issue the bulk submit request, I get the following error:
curl -XPUT http://localhost:9200/_bulk
{
  "took": 1,
  "errors": true,
  "items": [
    {
      "create": {
        "_index": "socteam",
        "_type": "products",
        "_id": "AVQuGPff-1Y7OIPIJaLX",
        "status": 400,
        "error": {
          "type": "mapper_parsing_exception",
          "reason": "failed to parse",
          "caused_by": {
            "type": "not_x_content_exception",
            "reason": "Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"
          }
        }
      }
    }
  ]
}
Any idea what this error means? I haven't created any mapping; I'm using vanilla Elasticsearch.
According to this doc, you have to specify the index and type in the URL:
curl -XPUT 'localhost:9200/socteam/products/_bulk?pretty' --data-binary "@data.json"
It works for both PUT and POST methods.
And your data.json file should have a structure like:
{"index":{"_id":"1"}}
{"name": "John Doe" }
{"index":{"_id":"2"}}
{"name": "Jane Doe" }
Maybe there is another way to import the data, but this is the one I know. Hope it helps.