I'm having trouble using Logstash to ingest the following raw data into Elasticsearch. The raw data is abstracted below; I was hoping the JSON plugin would work, but it currently does not. I've looked at other posts about JSON parsing to no avail.
{
  "offset": "stuff",
  "results": [
    {
      "key": "value",
      "key1": null,
      "key2": null,
      "key3": "true",
      "key4": "value4",
      "key4": [],
      "key5": "value5",
      "key6": "value6",
      "key7": "value7",
      "key8": "value8",
      "key9": "value9",
      "key10": null,
      "key11": null,
      "key12": "value12",
      "key13": "value13",
      "key14": [],
      "key15": "value15",
      "key16": "value16",
      "key17": "value17",
      "key18": "value18",
      "key19": "value19"
    },
    {
      "key20": "value20",
      "key21": null,
      "key22": null,
      "key23": "value23",
      "key24": "value24",
      <etc.>
My current conf file:
input {
  file {
    codec => multiline {
      pattern => '^\{'
      negate => true
      what => previous
    }
    #type => "json"
    path => <my path>
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}
#filter {
#  json {
#    source => message
#    remove_field => message
#  }
#}
filter {
  mutate {
    replace => [ "message", "%{message}}" ]
    gsub => [ 'message', '\n', '' ]
  }
  if [message] =~ /^{.*}$/ {
    json { source => message }
  }
}
output {
  #stdout { codec => rubydebug }
  stdout { codec => json }
}
I get a long error that I can't read, since it's full of escaped fragments like " \\"key10\\": null,\\r \\"key11\\": \\"value11\\",\\r and so on.
Does anyone know what I'm doing wrong, or how to read the error more easily? The JSON itself is valid, but maybe I'm using the regex for the multiline codec wrong.
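For what it's worth, the gsub-plus-regex logic in the filter above can be simulated outside Logstash. The sketch below (in Ruby, using a hypothetical shortened document standing in for one reassembled multiline event) shows that once the newlines are stripped, the message matches ^{.*}$ and parses as JSON:

```ruby
require 'json'

# Hypothetical stand-in for one multiline event, shortened from the sample above
message = "{\n\"offset\": \"stuff\",\n\"results\": [\n{\n\"key\": \"value\"\n}\n]\n}"

# gsub => [ 'message', '\n', '' ] strips the embedded newlines
message = message.gsub("\n", "")

# same conditional the conf uses: if [message] =~ /^{.*}$/
if message =~ /^\{.*\}$/
  event = JSON.parse(message)
  puts event["offset"]   # prints "stuff"
end
```

If a real event fails this check, the multiline codec most likely split or joined the document differently than expected.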
Can you use a different input plugin instead of file? Parsing a JSON file with the multiline codec can be problematic; if possible, use an input plugin with a JSON codec.
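For example (the path is a placeholder, and this assumes each event is a complete JSON document on a single line, so the source file would have to be written that way):

```conf
input {
  file {
    path => "/path/to/data.json"   # placeholder path
    codec => "json"                # expects one JSON document per line
    start_position => "beginning"
  }
}
```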
In the file input, you can set a real sincedb_path that Logstash can write to, instead of /dev/null.
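For instance (the path here is just an example; any file the Logstash user can write works):

```conf
sincedb_path => "/var/lib/logstash/sincedb_stuff"
```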
In the line where you replace message, you have one curly bracket } too many:
replace => [ "message", "%{message}}" ]
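With the extra bracket removed, the line would read:

```conf
replace => [ "message", "%{message}" ]
```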
I would write the output to elasticsearch instead of stdout. Of course, for testing you don't have to, but when you write the output to elasticsearch you can see the index being created and use Kibana to check whether the content is to your liking.
output {
  elasticsearch {
    hosts => "localhost"
    index => "stuff-%{+xxxx.ww}"
  }
}
I use these curl commands to read back from elasticsearch:
curl -s -XGET 'http://localhost:9200/_cat/indices?v&pretty'
and
curl -s -XGET 'http://localhost:9200/stuff*/_search?pretty=true'