
How to set field in Logstash as “not_analyzed” using Logstash config file

I have an Elasticsearch index which I am using to index a set of documents.

These documents are originally in CSV format, and I am parsing them using Logstash.

My Logstash config file is:

    input {
        file {
            path => "/csv_files_for_logstash/app1lg.csv"
            type => "core2"
            start_position => "beginning"
        }
    }


    filter {
        csv {
            separator => ","
            columns => ["Date","Package Name","App Version Code","Current Device Installs","Daily Device Installs","Daily Device Uninstalls","Daily Device Upgrades","Current User Installs","Total User Installs","Daily User Installs","Daily User Uninstalls"]
        }
        mutate { convert => ["App Version Code", "string"] }
        mutate { convert => ["Current Device Installs", "float"] }
        mutate { convert => ["Daily Device Installs", "float"] }
        mutate { convert => ["Daily Device Uninstalls", "float"] }
        mutate { convert => ["Current User Installs", "float"] }
        mutate { convert => ["Total User Installs", "float"] }
        mutate { convert => ["Daily User Installs", "float"] }
        mutate { convert => ["Daily User Uninstalls", "float"] }
        ruby {
            # Reformat the raw code, e.g. "123456789" => "4.56(789)"
            code => '
                b = event["App Version Code"]
                string2 = ""
                for counter in (3..(b.size - 1))
                    if counter == 4
                        string2 += "." + b[counter]
                    elsif counter == 6
                        string2 += "(" + b[counter]
                    elsif counter == 8
                        string2 += b[counter] + ")"
                    else
                        string2 += b[counter]
                    end
                end
                event["App Version Code"] = string2
            '
        }
    }
    output {
        elasticsearch {
            embedded => true
            action => "index"
            host => "es"
            index => "fivetry"
            workers => 1
        }
        stdout {
            codec => rubydebug {}
        }
    }

Now my field value (App Version Code) looks like "123456789" in the CSV, and I parse it into "4.56(789)" with the Ruby code.
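For reference, the transformation the ruby filter performs can be sketched as a standalone method (the method name is mine; it assumes a nine-digit input as in the example):

```ruby
# Walk characters 3..8 of the raw nine-digit code and splice in
# ".", "(" and ")" to produce a version string like "4.56(789)".
def format_version_code(raw)
  out = ""
  (3..(raw.size - 1)).each do |i|
    if i == 4
      out += "." + raw[i]
    elsif i == 6
      out += "(" + raw[i]
    elsif i == 8
      out += raw[i] + ")"
    else
      out += raw[i]
    end
  end
  out
end

puts format_version_code("123456789")  # prints "4.56(789)"
```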

Elasticsearch breaks this term into separate tokens because the field is analyzed.

I know the field can be made not_analyzed by creating a mapping, but I don't know how to do that. So, is there any way to set this field as not_analyzed using just my Logstash config file?

Also, in Kibana there is no .raw field with which I could use the exact string.

Thanks and Regards,

You cannot set the mapping via your Logstash configuration. Mapping is not related to Logstash but only to Elasticsearch.

You will need to pre-map these fields in Elasticsearch before inserting the documents. You can either create the index and then use the mapping API to set the mapping, or you can use index templates, which let you define the mapping without creating the index first.
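As a sketch, an index template for this case might look like the following (this assumes Elasticsearch 1.x/2.x, where the not_analyzed setting exists, and uses the index name fivetry and type core2 from the config above; the template pattern is an assumption):

```json
{
  "template": "fivetry*",
  "mappings": {
    "core2": {
      "properties": {
        "App Version Code": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}
```

Registered via the index template API, this mapping is applied automatically to any new index whose name matches the pattern, so the field will no longer be tokenized.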

Logstash provides a default template that it applies to new indices. You could edit this file, but that's not a good idea (it will be overwritten on upgrade, etc.).

The elasticsearch{} output allows you to specify your own template to use instead of the default one.
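A sketch of that (the file path and template name here are assumptions; the template file would contain JSON like the index template above):

```text
    elasticsearch {
        host => "es"
        index => "fivetry"
        template => "/etc/logstash/fivetry-template.json"
        template_name => "fivetry"
        template_overwrite => true
    }
```

With template_overwrite set, Logstash replaces any existing template of that name at startup, so mapping changes in the file take effect on the next new index.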
