
Error while creating an index on Elasticsearch with custom analyzer

I am trying to create an index with a custom default analyzer. I already checked similar questions on this error, but they didn't solve the issue.

Here is my schema:

PUT /emails
{
   "mappings": {
      "email": {
         "analyzer": "lkw",
         "properties": {
            "createdOn": {
               "type": "date",
               "store": true,
               "format": "strict_date_optional_time||epoch_millis"
            },
            "data": {
               "type": "object",
               "dynamic": "true"
            },
            "from": {
               "type": "string",
               "store": true
            },
            "id": {
               "type": "string",
               "store": true
            },
            "sentOn": {
               "type": "date",
               "store": true,
               "format": "strict_date_optional_time||epoch_millis"
            },
            "sesId": {
               "type": "string",
               "store": true
            },
            "subject": {
               "type": "string",
               "store": true,
               "analyzer": "standard"
            },
            "templates": {
               "properties": {
                  "html": {
                     "type": "string",
                     "store": true
                  },
                  "plainText": {
                     "type": "string",
                     "store": true
                  }
               }
            },
            "to": {
               "type": "string",
               "store": true
            },
            "type": {
               "type": "string",
               "store": true
            }
         }
      },
      "event": {
         "_parent": {
            "type": "email"
         },
         "analyzer": "lkw",
         "properties": {
            "id": {
               "type": "string",
               "store": true
            },
            "origin": {
               "type": "string",
               "store": true
            },
            "time": {
               "type": "date",
               "store": true,
               "format": "strict_date_optional_time||epoch_millis"
            },
            "type": {
               "type": "string",
               "store": true
            },
            "userAgent": {
               "type": "string",
               "store": true
            }
         }
      }
   },
   "settings": {
      "analysis": {
         "analyzer": {
            "lkw": {
               "tokenizer": "keyword",
               "filter": [
                  "lowercase"
               ],
               "type": "custom"
            }
         }
      }
   }
}

When I execute the command above, I get this error:

{
   "error": {
      "root_cause": [
         {
            "type": "mapper_parsing_exception",
            "reason": "Root mapping definition has unsupported parameters:  [analyzer : lkw]"
         }
      ],
      "type": "mapper_parsing_exception",
      "reason": "Failed to parse mapping [event]: Root mapping definition has unsupported parameters:  [analyzer : lkw]",
      "caused_by": {
         "type": "mapper_parsing_exception",
         "reason": "Root mapping definition has unsupported parameters:  [analyzer : lkw]"
      }
   },
   "status": 400
}

The analyzer setting is not supported at the root of a type mapping, which is exactly what the error message is complaining about. Since you have only a few string fields, I suggest you simply specify your lkw analyzer where you need it, just like you did for the standard one:

PUT /emails
{
   "mappings": {
      "email": {
         "properties": {
            "createdOn": {
               "type": "date",
               "store": true,
               "format": "strict_date_optional_time||epoch_millis"
            },
            "data": {
               "type": "object",
               "dynamic": "true"
            },
            "from": {
               "type": "string",
               "store": true,
               "analyzer": "lkw"
            },
            "id": {
               "type": "string",
               "store": true,
               "analyzer": "lkw"
            },
            "sentOn": {
               "type": "date",
               "store": true,
               "format": "strict_date_optional_time||epoch_millis"
            },
            "sesId": {
               "type": "string",
               "store": true,
               "analyzer": "lkw"
            },
            "subject": {
               "type": "string",
               "store": true,
               "analyzer": "standard"
            },
            "templates": {
               "properties": {
                  "html": {
                     "type": "string",
                     "store": true,
                     "analyzer": "lkw"
                  },
                  "plainText": {
                     "type": "string",
                     "store": true,
                     "analyzer": "lkw"
                  }
               }
            },
            "to": {
               "type": "string",
               "store": true,
               "analyzer": "lkw"
            },
            "type": {
               "type": "string",
               "store": true,
               "analyzer": "lkw"
            }
         }
      },
      "event": {
         "_parent": {
            "type": "email"
         },
         "properties": {
            "id": {
               "type": "string",
               "store": true,
               "analyzer": "lkw"
            },
            "origin": {
               "type": "string",
               "store": true,
               "analyzer": "lkw"
            },
            "time": {
               "type": "date",
               "store": true,
               "format": "strict_date_optional_time||epoch_millis"
            },
            "type": {
               "type": "string",
               "store": true,
               "analyzer": "lkw"
            },
            "userAgent": {
               "type": "string",
               "store": true,
               "analyzer": "lkw"
            }
         }
      }
   },
   "settings": {
      "analysis": {
         "analyzer": {
            "lkw": {
               "tokenizer": "keyword",
               "filter": [
                  "lowercase"
               ],
               "type": "custom"
            }
         }
      }
   }
}
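
Once the index is created, a quick way to confirm that the lkw analyzer behaves as intended is to run some text through the _analyze API. This is just a minimal check and not part of the original answer; the sample text is an arbitrary illustrative value, and depending on your Elasticsearch version you may need to pass analyzer and text as query-string parameters instead of a JSON body:

POST /emails/_analyze
{
   "analyzer": "lkw",
   "text": "John.Doe@Example.COM"
}

Because lkw combines the keyword tokenizer with the lowercase filter, the response should contain a single lowercased token, john.doe@example.com, whereas the standard analyzer used on subject would split the same text into multiple tokens. If an index-wide default really is what you are after, note that an analyzer defined under the name default in settings.analysis.analyzer is picked up as the index default, so the per-field analyzer entries could then be dropped.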
