
Elasticsearch 5.0 batch document ingestion with pipeline

I'm upgrading Elasticsearch 2.1 to 5.0. For 2.1 I used a document ingestion plugin that worked most excellently with batch ingest.

For 5.0 I've installed the ingest-attachment plugin.

I've created a pipeline:

{
  "attachment": {
    "description": "Attachment ingestion",
    "processors": [
      {
        "attachment": {
          "field": "data"
        }
      }
    ]
  }
}
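
For reference, here is roughly how that pipeline can be registered from Node.js. This is a minimal sketch assuming the official elasticsearch JS client (the same one used in the answer below) and a node at localhost:9200; note that the pipeline id ('attachment') goes in the request, not in the body.

const elasticsearch = require('elasticsearch');
const client = new elasticsearch.Client({ host: 'localhost:9200' });

// Register the ingest pipeline; the id is given separately from the body.
client.ingest.putPipeline({
  id: 'attachment',
  body: {
    description: 'Attachment ingestion',
    processors: [
      {
        attachment: {
          field: 'data'
        }
      }
    ]
  }
}, (error, response) => {
  if (error) {
    return console.error(error.message);
  }
  console.log('pipeline created', response);
});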

The problem is that with the previous plugin I was ingesting using bulk, but I can't find anything in the documentation about how to do a bulk ingest while using a pipeline.

You're right @val, it was a case of RTM! The bulk API accepts a pipeline parameter:

return new Promise((resolve, reject) => {
  client.bulk({
    body: documentArray,
    pipeline: 'attachment', // this worked: bulk accepts the pipeline name directly
  }, (error, response) => {
    if (error) {
      return reject(error.message);
    }

    resolve(response);
  });
});
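
For anyone else hitting this: the bulk body is the usual alternating action/source array, and the attachment processor expects its field ('data' here) to hold the file content as a base64-encoded string. A hypothetical documentArray might look like this (the index and type names below are made up for illustration):

const fs = require('fs');

// Hypothetical bulk body: each action line is followed by the document to index.
// The 'data' field must be base64 encoded for the attachment processor to parse it.
const documentArray = [
  { index: { _index: 'files', _type: 'file', _id: '1' } },
  { data: fs.readFileSync('report.pdf').toString('base64') },
  { index: { _index: 'files', _type: 'file', _id: '2' } },
  { data: fs.readFileSync('notes.docx').toString('base64') }
];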
