Input data from CSV file to Logstash
I have a CSV file with the following header:
"PacketId","MACAddress","Date","PacketLength","SourceIP","SourcePort","DestIP","DestPort"
I want to index this data into Elasticsearch using Logstash, but I am unable to write a filter for it.
filter {
  grok {
    match => { "message" => "%{IP:SourceIP}" }
  }
}
The filter above gives a nice extraction of the SourceIP field, but how do I write a grok pattern that extracts all of the fields?
Given the following CSV file:
1,00-14-22-01-23-45,13/09/2015,32,128.248.1.43,9980,128.248.23.13,9880
1,01-74-02-84-13-98,14/09/2015,64,128.248.1.94,9280,128.248.13.84,9380
You have to set up your Logstash configuration like this:
input {
  file {
    path => "/path/of/your/csv/test.csv"
    sincedb_path => "/path/of/your/csv/test.idx"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["PacketId","MACAddress","Date","PacketLength","SourceIP","SourcePort","DestIP","DestPort"]
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
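By default the csv filter leaves every column as a string. If you also want numeric types and a real event timestamp, you could extend the filter with the csv filter's convert option and a date filter. This is a sketch, not part of the original answer; it assumes the Date column always uses the dd/MM/yyyy format seen in the sample rows:

filter {
  csv {
    separator => ","
    columns => ["PacketId","MACAddress","Date","PacketLength","SourceIP","SourcePort","DestIP","DestPort"]
    # convert numeric columns from string to integer
    convert => {
      "PacketLength" => "integer"
      "SourcePort"   => "integer"
      "DestPort"     => "integer"
    }
  }
  # parse the Date column into @timestamp (format assumed from the sample data)
  date {
    match => ["Date", "dd/MM/yyyy"]
  }
}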
You will get the following output:
{
"message" => [
[0] "1,00-14-22-01-23-45,13/09/2015,32,128.248.1.43,9980,128.248.23.13,9880"
],
"@version" => "1",
"@timestamp" => "2015-09-14T20:11:28.976Z",
"host" => "MyHost.local",
"path" => "/path/of/your/csv/test.csv",
"PacketId" => "1",
"MACAddress" => "00-14-22-01-23-45",
"Date" => "13/09/2015",
"PacketLength" => "32",
"SourceIP" => "128.248.1.43",
"SourcePort" => "9980",
"DestIP" => "128.248.23.13",
"DestPort" => "9880"
}
{
"message" => [
[0] "1,01-74-02-84-13-98,14/09/2015,64,128.248.1.94,9280,128.248.13.84,9380"
],
"@version" => "1",
"@timestamp" => "2015-09-14T20:11:28.978Z",
"host" => "MyHost.local",
"path" => "/path/of/your/csv/test.csv",
"PacketId" => "1",
"MACAddress" => "01-74-02-84-13-98",
"Date" => "14/09/2015",
"PacketLength" => "64",
"SourceIP" => "128.248.1.94",
"SourcePort" => "9280",
"DestIP" => "128.248.13.84",
"DestPort" => "9380"
}
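Since the goal is to index the data into Elasticsearch, once the parsed events look right you could swap the stdout output for an elasticsearch output (or run both). The host and index name below are placeholders, not values from the original post:

output {
  elasticsearch {
    hosts => ["localhost:9200"]  # placeholder address, adjust to your cluster
    index => "packets"           # hypothetical index name
  }
  # keep stdout while debugging
  stdout {
    codec => rubydebug
  }
}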
Regards, Alain
You need to use the csv filter here, not grok.