
How to index a CSV document in Elasticsearch?

I'm trying to upload some CSV files into Elasticsearch. I don't want to mess it up, so I'm writing for some guidance. Can someone help with a video, tutorial, or documentation on how to index a document in Elasticsearch? I've read the official documentation, but as a beginner I feel a bit lost. It would be fine if you recommend a video tutorial or describe some steps for me. Hope you are all doing well! Thank you for your time!

The best way is to use Logstash, the official and very fast ingestion pipeline for the Elastic Stack; you can download it from here.

First of all, create a configuration file like the example below and save it as logstashExample.conf in the bin directory of Logstash. Assuming the Elasticsearch server and the Kibana console are up and running, run the configuration file with this command: "./logstash -f logstashExample.conf".
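Before and after running Logstash, it can help to sanity-check the cluster from the command line. The commands below assume the default Elasticsearch endpoint http://localhost:9200 and the index name bitcoin-prices used in the example configuration; adjust both if your setup differs.

```shell
# Check that Elasticsearch is reachable (default port assumed)
curl http://localhost:9200

# From the Logstash bin directory, run the pipeline
./logstash -f logstashExample.conf

# Afterwards, verify that documents were indexed
curl "http://localhost:9200/bitcoin-prices/_count?pretty"
```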

I've also added a suitable example of a Logstash configuration file for this case. Please change the index name in the output section and the file path in the input section to match your needs; you can also disable filtering by removing the csv filter from the example below.

input {
  file {
    path => "/home/timo/bitcoin-data/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    # Date,Open,High,Low,Close,Volume (BTC),Volume (Currency),Weighted Price
    columns => ["Date","Open","High","Low","Close","Volume (BTC)","Volume (Currency)","Weighted Price"]
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "bitcoin-prices"
  }
  stdout {}
}
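For illustration, here is a minimal Python sketch of what the csv filter and elasticsearch output do under the hood: each CSV row becomes a JSON document, paired with an action line in the newline-delimited format Elasticsearch's bulk API expects. The column names and index name mirror the configuration above; the sample data values are hypothetical.

```python
import csv
import io
import json

# Sample rows shaped like the bitcoin-data CSV above (hypothetical values).
raw = """Date,Open,High,Low,Close,Volume (BTC),Volume (Currency),Weighted Price
2017-01-01,966.34,1005.00,960.53,998.33,6850.59,6764742.06,987.47
"""

def csv_to_bulk_ndjson(text, index_name):
    """Turn CSV text into Elasticsearch bulk-API lines (action line + document line per row)."""
    lines = []
    for row in csv.DictReader(io.StringIO(text)):
        lines.append(json.dumps({"index": {"_index": index_name}}))  # action line
        lines.append(json.dumps(row))                                # document line
    return "\n".join(lines) + "\n"

body = csv_to_bulk_ndjson(raw, "bitcoin-prices")
print(body)
```

The resulting body could be POSTed to http://localhost:9200/_bulk with the Content-Type header application/x-ndjson; Logstash does the equivalent for you, plus batching, retries, and file tailing.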

