How to use the JSON filter plugin to separate a JSON message into fields in Logstash?
This is my Logstash conf file:
input {
  http_poller {
    urls => {
      urlname => "http://ivivaanywhere.ivivacloud.com/api/Asset/Asset/All?apikey=SC:demo:64a9aa122143a5db&max=10&last=0"
    }
    request_timeout => 60
    schedule => { every => "20s" }
    codec => "line"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => "http://127.0.0.1:9200"
    index => "apilogs1"
  }
  stdout { codec => rubydebug }
}
I need to split the "message" field into separate JSON fields so they can be displayed in Kibana. The JSON message looks something like this:
[{"AssetID":"12341234","AssetCategoryKey":"50","Description":"Test AC Asset","OperationalStatus":"Operational","OperationalStatusChangeComment":"","InstalledLocationKey":"5","Make":"","Model":"","SerialNumber":"","BarCode":"","InstalledDate":"","CommissionedDate":"","Ownership":"","IsMobile":"0","ParentAssetKey":"","PurchasedDate":"","CurrentAmount":"","CurrentDepreciationAmount":"","UpdatedTime":"","PurchasedAmount":"","SalvageValue":"","DisposalDate":"","WarrantyExpiry":"","WarrantyStatus":"0","ClassKey":"","Specification":"","OwnerKey":"0","OwnerType":"","AssigneeAddedDate":"","AssigneeKey":"","AssigneeType":"","IsSold":"0","IsBackup":"0","CurrentLocationKey":"","Manufacturer_VendorKey":"","Supplier_VendorKey":"","EndofUsefullLifeDate":"","Hidden":"0","CreatedDateTime":"20200430:124909","CreatedUserKey":"141","ModifiedDateTime":"","ModifiedUserKey":"","IsLocked":"0","LockedUserKey":"","LockedDateTime":"","AssetKey":"389","ObjectKey":"389","__key__":"389","ObjectID":"12341234","InstalledLocationName":"Singapore.Office","AssetCategoryID":"Access
Change the input codec to "json":
input {
  http_poller {
    urls => {
      urlname => "http://ivivaanywhere.ivivacloud.com/api/Asset/Asset/All?apikey=SC:demo:64a9aa122143a5db&max=10&last=0"
    }
    request_timeout => 60
    schedule => { every => "20s" }
    codec => "json"
  }
}
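Since the API returns a JSON array, another option (a sketch, not from the original answer) is to keep a plain codec on the input and parse in the filter stage instead: parse the body with the `json` filter into a target field, then use the `split` filter so each array element becomes its own event. The field name `asset` below is just an illustrative choice, not something the API defines:

```conf
filter {
  # Parse the raw HTTP body; "asset" is a hypothetical target field name
  json {
    source => "message"
    target => "asset"
  }
  # The parsed value is an array, so emit one event per element
  split {
    field => "asset"
  }
}
```

With this approach each document indexed into Elasticsearch corresponds to one asset object, with its fields nested under `asset`.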