
Logstash not reading from file input

I'm running this implementation of the ELK stack, which is pretty straightforward and easy to configure.

I can push TCP input through the stack using netcat like so:

nc localhost 5000 < /Users/me/path/to/logs/appOne.log
nc localhost 5000 < /Users/me/path/to/logs/appOneStackTrace.log
nc localhost 5000 < /Users/me/path/to/logs/appTwo.log
nc localhost 5000 < /Users/me/path/to/logs/appTwoStackTrace.log

But I cannot get Logstash to read from the file paths I specify in the config:

input {
    tcp {
        port => 5000
    }

    file {
        path => [
            "/Users/me/path/to/logs/appOne.log",
            "/Users/me/path/to/logs/appOneStackTrace.log",
            "/Users/me/path/to/logs/appTwo.log",
            "/Users/me/path/to/logs/appTwoStackTrace.log"
        ]
        type => "log"
        start_position => "beginning"
    }
}

output {
    elasticsearch {
        hosts => "elasticsearch:9200"
    }
}

Here is the startup output from the stack regarding the Logstash inputs:

logstash_1       | [2019-01-28T17:44:33,206][INFO ][logstash.inputs.tcp      ] Starting tcp input listener {:address=>"0.0.0.0:5000", :ssl_enable=>"false"}
logstash_1       | [2019-01-28T17:44:34,037][INFO ][logstash.inputs.file     ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_a1605b28f1bc77daf785a8805c32f578", :path=>["/Users/me/path/to/logs/appOne.log", "/Users/me/path/to/logs/appOneStackTrace.log", "/Users/me/path/to/logs/appTwo.log", "/Users/me/path/to/logs/appTwoStackTrace.log"]}

There is no indication the pipeline has any issues starting.

I've also checked that the log files have been updated since the TCP input was ingested, and they have. The last Logstash-specific log line from the ELK stack comes from either startup or the TCP input.

Here is my entire Logstash start-up logging in case that's helpful:

logstash_1       | Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
logstash_1       | [2019-01-29T13:32:19,391][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
logstash_1       | [2019-01-29T13:32:19,415][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.5.4"}
logstash_1       | [2019-01-29T13:32:23,989][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
logstash_1       | [2019-01-29T13:32:24,648][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
logstash_1       | [2019-01-29T13:32:24,908][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
logstash_1       | [2019-01-29T13:32:25,046][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
logstash_1       | [2019-01-29T13:32:25,051][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
logstash_1       | [2019-01-29T13:32:25,108][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch:9200"]}
logstash_1       | [2019-01-29T13:32:25,229][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
logstash_1       | [2019-01-29T13:32:25,276][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
logstash_1       | [2019-01-29T13:32:25,327][INFO ][logstash.inputs.tcp      ] Starting tcp input listener {:address=>"0.0.0.0:5000", :ssl_enable=>"false"}
logstash_1       | [2019-01-29T13:32:25,924][INFO ][logstash.inputs.file     ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_143c07d174c46eeab78b902edb3b1289", :path=>["/Users/me/path/to/logs/appOne.log", "/Users/me/path/to/logs/appOneStackTrace.log", "/Users/me/path/to/logs/appTwo.log", "/Users/me/path/to/logs/appTwoStackTrace.log"]}
logstash_1       | [2019-01-29T13:32:25,976][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x4d1515ce run>"}
logstash_1       | [2019-01-29T13:32:26,088][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
logstash_1       | [2019-01-29T13:32:26,106][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
logstash_1       | [2019-01-29T13:32:26,432][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

I found the issue: I needed to map the log files from the host into the container (I'm a Docker noob). The local paths I was specifying in the Logstash config were fine for TCP input, which I was sending from the host, but they did not exist inside the container without volume mapping.
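A quick way to spot this kind of mismatch (a sketch only; the directory paths are the placeholders from this question, and the `docker compose exec` line assumes the service is named `logstash` as in docker-elk):

```shell
# Sketch: check that the host-side log directories actually exist
# (these are the placeholder paths from the question).
for dir in /Users/me/path/to/appOne/logs /Users/me/path/to/appTwo/logs; do
  if [ -d "$dir" ]; then
    echo "ok: $dir"
  else
    echo "missing: $dir"
  fi
done

# After adding the volume mappings and restarting, confirm the files are
# visible from *inside* the container, e.g.:
#   docker compose exec logstash ls /usr/share/appOneLogs
```

If the second check lists nothing, Logstash is watching empty (or nonexistent) paths, and the file input will sit idle without logging an error.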

First, I created the container's internal log directories in the Dockerfile for Logstash:

RUN mkdir /usr/share/appOneLogs
RUN mkdir /usr/share/appTwoLogs
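(Minor aside, a sketch only: both directories can also be created in a single image layer, and `-p` keeps the build from failing if they already exist.)

```dockerfile
RUN mkdir -p /usr/share/appOneLogs /usr/share/appTwoLogs
```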

Then I volume-mapped my host's log directories into them in the docker-elk/docker-compose.yml file where Logstash is configured:

logstash:
    build:
      context: logstash/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro
      - ./logstash/pipeline:/usr/share/logstash/pipeline:ro
      - /Users/me/path/to/appOne/logs:/usr/share/appOneLogs # this bit
      - /Users/me/path/to/appTwo/logs:/usr/share/appTwoLogs # and this bit
    ports:
      - "5000:5000"
    ...

Finally, I replaced the paths in logstash/pipeline/logstash.config with the directories created in the Dockerfile:

file {
    path => [
        "/usr/share/appOneLogs",
        "/usr/share/appTwoLogs"
    ]
}

Also of note, I removed start_position => "beginning" from the file input definition, as it overrides the default behavior of treating files like live streams and starting at the end.
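Related to this: the file input records how far it has read in a sincedb file (visible in the startup logs above), so restarting Logstash does not re-read content it has already seen, even with start_position => "beginning". While iterating on a pipeline, a common sketch is to disable that bookkeeping; the glob patterns here are assumptions, not from the original config:

```
file {
    path => [
        "/usr/share/appOneLogs/*.log",
        "/usr/share/appTwoLogs/*.log"
    ]
    start_position => "beginning"
    sincedb_path => "/dev/null"  # testing only: forget read positions across restarts
}
```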

