
Can't get Logstash to create/populate an index in Elasticsearch

Here's what I'm using:

CentOS 6.5 x64
Logstash 1.5.0
Elasticsearch 1.5.2
Marvel

All newly installed.

Here's my Logstash config file:

input {
  file {
    path => "/home/datafeeds/public_html/myfeeds/datafeeds/completed/*.csv"
  }
}

filter {
  csv {
    columns => ["z_md5","z_product_name_slug","z_country_code","z_product_name","z_expiry_timestamp","z_expiry_human","z_product_name_gb","z_description_gb","z_product_name_and_description_gb_md5","z_product_name_fr","z_description_fr","z_product_name_and_description_fr_md5","z_product_name_de","z_description_de","z_product_name_and_description_de_md5","z_product_name_it","z_description_it","z_product_name_and_description_it_md5","z_product_name_es","z_description_es","z_product_name_and_description_es_md5","feed_ID","feed_product_ID","feed_merchant_product_id","feed_merchant_category","feed_deep_link","feed_merchant_image_url","feed_search_price","feed_description","feed_product_name","feed_merchant_deep_link","feed_image_url","feed_merchant_name","feed_merchant_id","feed_category_name","feed_category_id","feed_delivery_cost","feed_currency","feed_store_price","feed_display_price","feed_data_feed_id","feed_rrp_price","feed_specifications","feed_condition","feed_promotional_text","feed_warranty","feed_merchant_thumb_url","feed_thumb_url","feed_brand_name","feed_brand_id","feed_delivery_time","feed_valid_from","feed_valid_to","feed_web_offer","feed_pre_order","feed_in_stock","feed_stock_quantity","feed_is_for_sale","feed_product_type","feed_commission_group","feed_upc","feed_ean","feed_mpn","feed_isbn","feed_model_number","feed_parent_product_id","feed_language","feed_last_updated","feed_dimensions","feed_colour","feed_keyword","feed_custom_1","feed_custom_2","feed_custom_3","feed_custom_4","feed_custom_5","feed_saving","feed_delivery_weight","feed_delivery_restrictions","feed_reviews","feed_average_rating","feed_number_stars","feed_number_available","feed_rating","feed_alternate_image","feed_large_image","feed_basket_link"]
    separator => ","
  }
}

output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "products_test"
    workers => 1
  }
}
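To see whether events are making it through the pipeline at all, it can help to temporarily add a stdout output next to the elasticsearch one (a sketch; rubydebug is Logstash's standard pretty-printing codec):

```
output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "products_test"
    workers => 1
  }
  stdout {
    codec => rubydebug   # print each event to the console for debugging
  }
}
```

If events show up on the console but not in Elasticsearch, the problem is on the output side; if nothing is printed at all, the input or filter stage isn't producing events in the first place.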

I've double-checked and there is a CSV file in:

/home/datafeeds/public_html/myfeeds/datafeeds/completed/

I've chmod 777 the file just to make sure it is readable.

I've tested my config file to make sure it's OK:

$ sudo /opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/products.conf
Configuration OK

Elasticsearch and Logstash are running, but no data is passed to Elasticsearch and no index is created.

I've tried creating the index manually, but no data is populated.

I've tried rebooting the server and stopping/restarting Logstash and Elasticsearch several times.

This is the output from starting Logstash in verbose mode:

sudo /opt/logstash/bin/logstash --verbose -f /etc/logstash/conf.d/products.conf
Using version 0.1.x input plugin 'file'. This plugin isn't well supported by the community and likely has no maintainer. {:level=>:info}
Using version 0.1.x codec plugin 'plain'. This plugin isn't well supported by the community and likely has no maintainer. {:level=>:info}
Using version 0.1.x filter plugin 'csv'. This plugin isn't well supported by the community and likely has no maintainer. {:level=>:info}
Using version 0.1.x output plugin 'elasticsearch'. This plugin isn't well supported by the community and likely has no maintainer. {:level=>:info}
Registering file input {:path=>["/home/datafeeds/public_html/shop.z/datafeeds/completed/feed_mydata_data.csv"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"/root/.sincedb_fffe878b6c3eb92e792a8cceb1f1f170", :path=>["/home/datafeeds/public_html/shop.z/datafeeds/completed/feed_mydata_data.csv"], :level=>:info}
Jun 06, 2015 1:35:08 PM org.elasticsearch.node.internal.InternalNode <init>
INFO: [logstash-datafeeds-server.server.com-5042-9782] version[1.5.1],   pid[5042], build[5e38401/2015-04-09T13:41:35Z]
Jun 06, 2015 1:35:08 PM org.elasticsearch.node.internal.InternalNode <init>
INFO: [logstash-datafeeds-server.server.com-5042-9782] initializing ...
Jun 06, 2015 1:35:08 PM org.elasticsearch.plugins.PluginsService <init>
INFO: [logstash-datafeeds-server.server.com-5042-9782] loaded [], sites []
Jun 06, 2015 1:35:10 PM org.elasticsearch.node.internal.InternalNode <init>
INFO: [logstash-datafeeds-server.server.com-5042-9782] initialized
Jun 06, 2015 1:35:10 PM org.elasticsearch.node.internal.InternalNode start
INFO: [logstash-datafeeds-server.server.com-5042-9782] starting ...
Jun 06, 2015 1:35:10 PM org.elasticsearch.transport.TransportService doStart
INFO: [logstash-datafeeds-server.server.com-5042-9782] bound_address  {inet[/0:0:0:0:0:0:0:0:9302]}, publish_address {inet[/<ip>:9302]}
Jun 06, 2015 1:35:10 PM org.elasticsearch.discovery.DiscoveryService doStart
INFO: [logstash-datafeeds-server.server.com-5042-9782]  elasticsearch/dvhITquLQBef1eA0pJVuFw
Jun 06, 2015 1:35:13 PM org.elasticsearch.cluster.service.InternalClusterService$UpdateTask run
INFO: [logstash-datafeeds-server.server.com-5042-9782] detected_master [Devil-Slayer][GeNlEqTkQu-UlSOzS7mStw][datafeeds-server.server.com][inet[/<ip>:9301]], added {[Devil-Slayer][GeNlEqTkQu-UlSOzS7mStw][datafeeds-server.server.com][inet[/<ip>:9301]],[logstash-datafeeds-server.server.com-4935-9782][LtNpHWk6SMGJLDRCXGQKFg][datafeeds-server.server.com][inet[/<ip>:9300]]{data=false, client=true},}, reason: zen-disco-receive(from master [[Devil-Slayer][GeNlEqTkQu-UlSOzS7mStw][datafeeds-server.server.com][inet[/<ip>:9301]]])
Jun 06, 2015 1:35:13 PM org.elasticsearch.node.internal.InternalNode start
INFO: [logstash-datafeeds-server.server.com-5042-9782] started
Automatic template management enabled {:manage_template=>"true", :level=>:info}
Using mapping template {:template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>#<Java::JavaUtil::ArrayList:0x6fe472ee>, "properties"=>{"@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"location"=>{"type"=>"geo_point"}}}}}}}, :level=>:info}
New Elasticsearch output {:cluster=>nil, :host=>["127.0.0.1"], :port=>"9300-9305", :embedded=>false, :protocol=>"node", :level=>:info}
Pipeline started {:level=>:info}
Logstash startup completed

How do I get Logstash to populate Elasticsearch?

Am I missing something?

When Logstash newly discovers a file, it starts reading at the end of it. You need to add start_position if you want it to start at the beginning:

file {
  path => "/home/datafeeds/public_html/myfeeds/datafeeds/completed/*.csv"
  start_position => "beginning"
}
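One more thing worth knowing: Logstash records how far it has read each file in a sincedb file (visible in the verbose output above as /root/.sincedb_fffe878b6c3eb92e792a8cceb1f1f170). start_position only applies to files Logstash has never seen before, so if the file was already discovered on an earlier run, the recorded position wins. For testing, sincedb_path can be pointed at /dev/null so the file is re-read from the beginning on every run (a sketch):

```
file {
  path => "/home/datafeeds/public_html/myfeeds/datafeeds/completed/*.csv"
  start_position => "beginning"
  sincedb_path => "/dev/null"   # testing only: forget read positions between runs
}
```

Remove the sincedb_path override once things work, or Logstash will re-ingest the same files on every restart.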
