
Mapping geo_point data when importing data to AWS Elasticsearch

I have a set of data inside dynamodb that I am importing to AWS Elasticsearch using this tutorial: https://medium.com/@vladyslavhoncharenko/how-to-index-new-and-existing-amazon-dynamodb-content-with-amazon-elasticsearch-service-30c1bbc91365

I need to change the mapping of a part of that data to geo_point.

I have tried creating the mapping before importing the data with:

PUT user
{
  "mappings": {
    "_doc": {
      "properties": {
        "grower_location": {
            "type": "geo_point"
        }
      }
    }
  }
}

When I do this, the data doesn't import, and I don't receive an error.

If I import the data first, I am able to search it, but the grower_location: { lat: #, lon: # } object has its lat/lon values mapped as integers, so I am unable to run a geo_distance query.
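For reference, this is the kind of query I want to run (a sketch; the distance and origin point are placeholders):

GET user/_search
{
  "query": {
    "bool": {
      "filter": {
        "geo_distance": {
          "distance": "10km",
          "grower_location": {
            "lat": 40.0,
            "lon": -70.0
          }
        }
      }
    }
  }
}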

Please help.

I was able to fix this by importing the data once with the Python script from the tutorial.

Then running

GET user/_mappings

Copying the auto-generated mappings, then deleting the index:

DELETE user/

Then recreating the index with the copied mapping, changing the type of the location field to geo_point:

PUT user/
{
   "mappings": {
     "user_type": {
       "properties": {
         ...
         "grower_location": {
           "type": "geo_point"
         }
         ...
       }
     }
   }
}

Then re-importing the data using the Python script from the tutorial. Everything is imported and ready for geo_distance searches!
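The manual copy-and-edit step can also be scripted. Below is a minimal sketch (the field name grower_location comes from the question; the type name and helper are illustrative, and the index-name wrapper that GET user/_mappings adds around the response is omitted) that rewrites the auto-detected field to geo_point before the index is recreated:

```python
import copy
import json

def to_geo_point(mappings, type_name, field):
    """Return a copy of an auto-generated mapping with one field
    rewritten as geo_point, dropping the auto-detected sub-properties."""
    patched = copy.deepcopy(mappings)
    props = patched["mappings"][type_name]["properties"]
    # Replace whatever Elasticsearch inferred (e.g. long for lat/lon)
    # with an explicit geo_point mapping.
    props[field] = {"type": "geo_point"}
    return patched

# Simplified shape of an auto-generated mapping where lat/lon
# were detected as integers (long).
auto = {
    "mappings": {
        "user_type": {
            "properties": {
                "grower_location": {
                    "properties": {
                        "lat": {"type": "long"},
                        "lon": {"type": "long"},
                    }
                }
            }
        }
    }
}

fixed = to_geo_point(auto, "user_type", "grower_location")
# This becomes the body of the PUT user/ request.
print(json.dumps(fixed, indent=2))
```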
