
Extend Elasticsearch's standard Analyzer with additional characters to tokenize on

I basically want the functionality of the built-in standard analyzer, but with tokenization on underscores as well.

Currently the standard analyzer keeps brown_fox_has as a single token, but I want [brown, fox, has] instead. The simple analyzer loses some functionality compared to the standard one, so I want to keep the standard behaviour as much as possible.

The docs only show how to add filters and other non-tokenizer changes, but I want to keep all of the standard tokenizer's behaviour while additionally splitting on underscores.

I could create a character filter that maps _ to - and the standard tokenizer would do the job for me, but is there a better way?

from elasticsearch import Elasticsearch

es = Elasticsearch()

es.indices.create(index="mine", body={
    "settings": {
        "analysis": {
            "analyzer": {
                "default": {
                    "type": "custom",
                    # "tokenize_on_chars": ["_"],  # I want this to work with the standard tokenizer, without using char_group
                    "tokenizer": "standard",
                    "filter": ["lowercase"]
                }
            }
        }
    }
})
res = es.indices.analyze(index="mine", body={
    "field": "text",
    "text": "the quick brown_fox_has to be split"
})
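For reference, the char_group route I want to avoid would look something like this; the tokenizer name is mine, and the point is that I would have to enumerate the split classes myself, losing the standard tokenizer's Unicode-aware word segmentation:

es.indices.create(index="mine_char_group", body={
    "settings": {
        "analysis": {
            "tokenizer": {
                # split on whitespace and punctuation; underscore is Unicode
                # connector punctuation, so "punctuation" already covers it
                "underscore_splitter": {
                    "type": "char_group",
                    "tokenize_on_chars": ["whitespace", "punctuation"]
                }
            },
            "analyzer": {
                "default": {
                    "type": "custom",
                    "tokenizer": "underscore_splitter",
                    "filter": ["lowercase"]
                }
            }
        }
    }
})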

Use a mapping char filter and define it along with the standard tokenizer:

POST /_analyze

{
  "tokenizer": "standard",
  "char_filter": [
    {
      "type": "mapping",
      "mappings": [
        "_ => \\u0020"
      ]
    }
  ],
  "text": "the quick brown_fox_has to be split"
}

The mapping char filter replaces each underscore with a whitespace character (\u0020) before tokenization, so the standard tokenizer then splits brown_fox_has into brown, fox and has. Note that char_filter in the _analyze API takes an array of filter names or inline definitions.
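To use this in the index itself rather than in an ad-hoc _analyze call, the same char filter can be attached to the custom analyzer from the question. A sketch in the question's elasticsearch-py style; the underscore_to_space name is illustrative:

es.indices.create(index="mine", body={
    "settings": {
        "analysis": {
            "char_filter": {
                # runs before tokenization: every underscore becomes a space,
                # so the standard tokenizer splits brown_fox_has apart
                "underscore_to_space": {
                    "type": "mapping",
                    "mappings": ["_ => \\u0020"]
                }
            },
            "analyzer": {
                "default": {
                    "type": "custom",
                    "char_filter": ["underscore_to_space"],
                    "tokenizer": "standard",
                    "filter": ["lowercase"]
                }
            }
        }
    }
})

With this in place, the analyze call from the question should return the tokens the, quick, brown, fox, has, to, be, split, while everything else behaves exactly like the standard analyzer.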
