Elasticsearch update mapping using JSON

I want to change the type of a field from 'string' to 'date' (format: 'epoch_second', to be specific). Since there is no way to change the type of an existing field in place, I need to create a new index, for which I mostly want to reuse the mapping of my existing index. Here is what I am using:

curl -XGET 'http://localhost:9200/sam/saga/_mapping?pretty' > saga.json

to dump the mapping of the current index into a JSON file, whose content is this:

{
  "sam" : {
    "mappings" : {
      "saga" : {
        "properties" : {
          "name" : {
            "type" : "long"
          }
        }
      }
    }
  }
}

then I replace

         "name" : {
           "type" : "long"
         }

with

         "name" : {
           "type" : "date"
         }

and save the new file as saga2.json. Then I run this:

curl -XPUT 'http://localhost:9200/sam/_mapping/saga2' -d @saga2.json

However, when I check the mapping of the new index, all types have changed to "string" now.

I even have this problem using Elasticsearch's own example.
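
For reference, what I am ultimately after is a fresh index whose mapping already has the corrected field type. Here is a minimal sketch of creating such an index up front (sam2 is just a placeholder name, and epoch_second is the format I mentioned above):

# create the new index with the corrected mapping (sam2 is a placeholder name)
curl -XPUT 'http://localhost:9200/sam2' -d '{
  "mappings" : {
    "saga" : {
      "properties" : {
        "name" : {
          "type" : "date",
          "format" : "epoch_second"
        }
      }
    }
  }
}'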

Does anyone know what is wrong?

You need to make one more change in your saga2.json file, namely the mapping type name: saga -> saga2 (and now that saga2 already exists with the string mappings, you probably need to rename it all to saga3):

{
  "sam" : {
    "mappings" : {
      "saga2" : {                  <--- here
        "properties" : {
          "name" : {
            "type" : "date"        <--- and here
          }
        }
      }
    }
  }
}

Only then can you run this:

curl -XPUT 'http://localhost:9200/sam/_mapping/saga2' -d @saga2.json
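
You can then verify that the new type has the date mapping:

curl -XGET 'http://localhost:9200/sam/_mapping/saga2?pretty'

If what you really need is a separate index rather than a second type inside sam, a rough sketch (assuming Elasticsearch 5.x or later and the placeholder index name sam2 created as above) would be to copy the documents across with the _reindex API:

# copy all documents from sam into the already-created sam2
curl -XPOST 'http://localhost:9200/_reindex' -d '{
  "source" : { "index" : "sam" },
  "dest"   : { "index" : "sam2" }
}'

On older versions the same copy is usually done with a scan/scroll search plus bulk indexing.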
