Logstash doesn't load csv data to elastic search

I am not able to load data into an Elasticsearch node on localhost with Logstash. I want Logstash to read csv files and load that data into Elasticsearch. But nothing works: I can only read data that I add to Elasticsearch manually, and Logstash appears to do nothing.

My Logstash config is this:

input {
    file {
        path => [ "C:\Users\Michele\Downloads\logstash-1.5.3\logstash-1.5.3\Users\*.csv" ]

        start_position => "beginning"
    }
}

filter {
    csv {
        columns => ["timestamp", "impianto", "tipo_misura", "valore","unita_misura"]
        separator => ","
    }
}

output {
    elasticsearch {
        action => "index"
        host => "localhost"
        cluster => "elasticsearch"
        node_name => "NCC-1701-A"
        index => "myindex"
        index_type => "pompe"
        workers => 1
    }
}

My csv file is:

2015-08-03T18:46:00,Abbiategrasso,Pressione gruppo 1,55.5,m
2015-08-03T18:46:10,Abbiategrasso,Pressione gruppo 1,44.4,m
2015-08-03T18:46:20,Abbiategrasso,Pressione gruppo 1,66.6,m
2015-08-03T18:46:30,Abbiategrasso,Pressione gruppo 1,33.3,m
2015-08-03T18:46:40,Abbiategrasso,Pressione gruppo 1,22.2,m
2015-08-03T18:46:50,Abbiategrasso,Pressione gruppo 1,77.7,m
2015-08-03T18:47:00,Abbiategrasso,Pressione gruppo 1,11.1,m
2015-08-03T18:47:10,Abbiategrasso,Pressione gruppo 1,44.4,m
2015-08-03T18:47:20,Abbiategrasso,Pressione gruppo 1,55.5,m
2015-08-03T18:47:30,Abbiategrasso,Pressione gruppo 1,33.3,m
2015-08-03T18:47:40,Abbiategrasso,Pressione gruppo 1,22.2,m
2015-08-03T18:47:50,Abbiategrasso,Pressione gruppo 1,66.6,m
2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,33.3,m
2015-08-03T18:48:10,Abbiategrasso,Pressione gruppo 1,77.7,m
2015-08-03T18:48:20,Abbiategrasso,Pressione gruppo 1,22.2,m
2015-08-03T18:48:30,Abbiategrasso,Pressione gruppo 1,88.8,m
2015-08-03T18:48:40,Abbiategrasso,Pressione gruppo 1,55.5,m
2015-08-03T18:48:50,Abbiategrasso,Pressione gruppo 1,33.3,m
2015-08-03T18:49:00,Abbiategrasso,Pressione gruppo 1,55.5,m

Nothing new appears in the "myindex" index, and I don't know why...
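A quick way to check whether any documents actually reached the index is a count query against Elasticsearch (default port assumed; adjust the index name to yours):

curl 'http://localhost:9200/myindex/_count?pretty'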

EDIT

I changed the path in the Logstash config file from "*.csv", which doesn't pick up any files, to "abbiategrasso.csv".

The --debug output from my Logstash run is now:

failed action with response of 400, dropping action: ["index", {:_id=>nil,
:_index=>"abbiategrasso", :_type=>"pompe", :_routing=>nil}, #<LogStash::Event:0x
1cea7b7 @metadata_accessors=#<LogStash::Util::Accessors:0x1e577ee @store={"path"
=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiate
grasso.csv", "retry_count"=>0}, @lut={"[path]"=>[{"path"=>"C:\\Users\\Michele\\D
ownloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso.csv", "retry_coun
t"=>0}, "path"]}>, @cancelled=false, @data={"message"=>["2015-08-03T18:48:00,Abb
iategrasso,Pressione gruppo 1,66.6,m\r"], "@version"=>"1", "@timestamp"=>"2015-0
9-07T17:28:32.501Z", "host"=>"Michele-HP", "path"=>"C:\\Users\\Michele\\Download
s\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso.csv", "timestamp"=>"2015
-08-03T18:48:00", "impianto"=>"Abbiategrasso", "tipo_misura"=>"Pressione gruppo
1", "valore"=>"66.6", "unita_misura"=>"m"}, @metadata={"path"=>"C:\\Users\\Miche
le\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso.csv", "retry
_count"=>0}, @accessors=#<LogStash::Util::Accessors:0x2ff785 @store={"message"=>
["2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,66.6,m\r"], "@version"=>"
1", "@timestamp"=>"2015-09-07T17:28:32.501Z", "host"=>"Michele-HP", "path"=>"C:\
\Users\\Michele\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso
.csv", "timestamp"=>"2015-08-03T18:48:00", "impianto"=>"Abbiategrasso", "tipo_mi
sura"=>"Pressione gruppo 1", "valore"=>"66.6", "unita_misura"=>"m"}, @lut={"host
"=>[{"message"=>["2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,66.6,m\r"
], "@version"=>"1", "@timestamp"=>"2015-09-07T17:28:32.501Z", "host"=>"Michele-H
P", "path"=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\User
s\\abbiategrasso.csv", "timestamp"=>"2015-08-03T18:48:00", "impianto"=>"Abbiateg
rasso", "tipo_misura"=>"Pressione gruppo 1", "valore"=>"66.6", "unita_misura"=>"
m"}, "host"], "path"=>[{"message"=>["2015-08-03T18:48:00,Abbiategrasso,Pressione
 gruppo 1,66.6,m\r"], "@version"=>"1", "@timestamp"=>"2015-09-07T17:28:32.501Z",
 "host"=>"Michele-HP", "path"=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\l
ogstash-1.5.3\\Users\\abbiategrasso.csv", "timestamp"=>"2015-08-03T18:48:00", "i
mpianto"=>"Abbiategrasso", "tipo_misura"=>"Pressione gruppo 1", "valore"=>"66.6"
, "unita_misura"=>"m"}, "path"], "message"=>[{"message"=>["2015-08-03T18:48:00,A
bbiategrasso,Pressione gruppo 1,66.6,m\r"], "@version"=>"1", "@timestamp"=>"2015
-09-07T17:28:32.501Z", "host"=>"Michele-HP", "path"=>"C:\\Users\\Michele\\Downlo
ads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso.csv", "timestamp"=>"20
15-08-03T18:48:00", "impianto"=>"Abbiategrasso", "tipo_misura"=>"Pressione grupp
o 1", "valore"=>"66.6", "unita_misura"=>"m"}, "message"], "timestamp"=>[{"messag
e"=>["2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,66.6,m\r"], "@version
"=>"1", "@timestamp"=>"2015-09-07T17:28:32.501Z", "host"=>"Michele-HP", "path"=>
"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategr
asso.csv", "timestamp"=>"2015-08-03T18:48:00", "impianto"=>"Abbiategrasso", "tip
o_misura"=>"Pressione gruppo 1", "valore"=>"66.6", "unita_misura"=>"m"}, "timest
amp"], "impianto"=>[{"message"=>["2015-08-03T18:48:00,Abbiategrasso,Pressione gr
uppo 1,66.6,m\r"], "@version"=>"1", "@timestamp"=>"2015-09-07T17:28:32.501Z", "h
ost"=>"Michele-HP", "path"=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\logs
tash-1.5.3\\Users\\abbiategrasso.csv", "timestamp"=>"2015-08-03T18:48:00", "impi
anto"=>"Abbiategrasso", "tipo_misura"=>"Pressione gruppo 1", "valore"=>"66.6", "
unita_misura"=>"m"}, "impianto"], "tipo_misura"=>[{"message"=>["2015-08-03T18:48
:00,Abbiategrasso,Pressione gruppo 1,66.6,m\r"], "@version"=>"1", "@timestamp"=>
"2015-09-07T17:28:32.501Z", "host"=>"Michele-HP", "path"=>"C:\\Users\\Michele\\D
ownloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso.csv", "timestamp"
=>"2015-08-03T18:48:00", "impianto"=>"Abbiategrasso", "tipo_misura"=>"Pressione
gruppo 1", "valore"=>"66.6", "unita_misura"=>"m"}, "tipo_misura"], "valore"=>[{"
message"=>["2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,66.6,m\r"], "@v
ersion"=>"1", "@timestamp"=>"2015-09-07T17:28:32.501Z", "host"=>"Michele-HP", "p
ath"=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abb
iategrasso.csv", "timestamp"=>"2015-08-03T18:48:00", "impianto"=>"Abbiategrasso"
, "tipo_misura"=>"Pressione gruppo 1", "valore"=>"66.6", "unita_misura"=>"m"}, "
valore"], "unita_misura"=>[{"message"=>["2015-08-03T18:48:00,Abbiategrasso,Press
ione gruppo 1,66.6,m\r"], "@version"=>"1", "@timestamp"=>"2015-09-07T17:28:32.50
1Z", "host"=>"Michele-HP", "path"=>"C:\\Users\\Michele\\Downloads\\logstash-1.5.
3\\logstash-1.5.3\\Users\\abbiategrasso.csv", "timestamp"=>"2015-08-03T18:48:00"
, "impianto"=>"Abbiategrasso", "tipo_misura"=>"Pressione gruppo 1", "valore"=>"6
6.6", "unita_misura"=>"m"}, "unita_misura"]}>>] {:level=>:warn, :file=>"/Users/M
ichele/Downloads/logstash-1.5.3/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logs
tash-output-elasticsearch-1.0.5-java/lib/logstash/outputs/elasticsearch.rb", :li
ne=>"531", :method=>"submit"}←[0m

Logstash fails to upload the data...
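The 400 response means Elasticsearch rejected the index request. To see what the existing index already contains (for example a mapping created by earlier manual inserts), you can query the mapping, using the index name from the debug output above:

curl 'http://localhost:9200/abbiategrasso/_mapping?pretty'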

ACHIEVED

Remove the current data from your index if you have loaded data manually, for example with Marvel's Sense console: use a DELETE request on the index. Next time, Logstash will be able to create a new index that meets its requirements!
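For reference, that delete is a single request (index name taken from my config, default port assumed):

curl -XDELETE 'http://localhost:9200/abbiategrasso'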

Also, for the file path in the Logstash config, don't use "*.csv": in my case it didn't work and didn't pick up all the files in the folder, it only worked with an explicitly named file :(

My config now is:

input {
    file {
        path => [ "C:\Users\Michele\Downloads\logstash-1.5.3\logstash-1.5.3\Users\abbiategrasso4.csv" ]

        start_position => "beginning"
    }
}

filter {
    csv {
        columns => ["timestamp", "impianto", "tipo_misura", "valore","unita_misura"]
        separator => ","
    }

    mutate {
        convert => { "valore" => "float" }
    }
}

output {
    elasticsearch {
        action => "index"
        host => "localhost"
        cluster => "elasticsearch"
        node_name => "NCC-1701-A"
        index => "abbiategrasso"
        document_type => "pompe"
        workers => 1
    }

    stdout { codec => rubydebug }
}

It works now, I'll try Kibana :)

While you are troubleshooting, I would start by printing to standard output:

output { stdout { codec => rubydebug } }

This way you can see whether the issue is with Logstash itself or with your connection to Elasticsearch.
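With that output, each event is printed to the console as a readable hash. Based on the fields visible in your debug dump above, an event from this pipeline should look roughly like:

{
       "message" => "2015-08-03T18:48:00,Abbiategrasso,Pressione gruppo 1,66.6,m\r",
      "@version" => "1",
    "@timestamp" => "2015-09-07T17:28:32.501Z",
          "host" => "Michele-HP",
          "path" => "C:\\Users\\Michele\\Downloads\\logstash-1.5.3\\logstash-1.5.3\\Users\\abbiategrasso.csv",
     "timestamp" => "2015-08-03T18:48:00",
      "impianto" => "Abbiategrasso",
   "tipo_misura" => "Pressione gruppo 1",
        "valore" => "66.6",
  "unita_misura" => "m"
}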

I've also found it highly useful to play around with the sincedb when doing file inputs. It's possible that while you were troubleshooting you already read the data in, and the sincedb was updated. Add the following line to your file {} block while troubleshooting, but don't forget to remove it when you go into actual usage, or you will re-ingest data that you don't intend to:

sincedb_path => "/dev/null"
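As a minimal sketch, your file {} block would then look like this. Note that /dev/null only exists on Linux/OS X; since you're on Windows, "NUL" should serve the same purpose there:

input {
    file {
        path => [ "C:\Users\Michele\Downloads\logstash-1.5.3\logstash-1.5.3\Users\abbiategrasso.csv" ]
        start_position => "beginning"
        sincedb_path => "NUL"      # use "/dev/null" on Linux/OS X
    }
}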

The configuration of your elasticsearch output plugin has an issue: index_type should be document_type instead:

output {
    elasticsearch {
        action => "index"
        host => "localhost"
        cluster => "elasticsearch"
        node_name => "NCC-1701-A"
        index => "myindex"
        document_type => "pompe"        <----- fix this
        workers => 1
    }
}

Also, in addition to what bravosierra99 said, you should start Logstash with --debug so that you get some TRACE-level information on what's happening under the hood. If the above fix still doesn't work, I'd suggest you update your question with the debug log output you get from Logstash.
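On Logstash 1.5 that is something along these lines (the config file name is a placeholder for yours; on Windows use bin\logstash.bat):

bin/logstash agent -f your-config.conf --debug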
