
Can't get Logstash to create/populate an index in Elasticsearch

Here's what I'm using:

CentOS 6.5 x64
Logstash 1.5.0
Elasticsearch 1.5.2
Marvel

All newly installed.

Here's my Logstash config file:

input {
  file {
    path => "/home/datafeeds/public_html/myfeeds/datafeeds/completed/*.csv"
  }
}

filter {
  csv {
    columns => ["z_md5","z_product_name_slug","z_country_code","z_product_name","z_expiry_timestamp","z_expiry_human","z_product_name_gb","z_description_gb","z_product_name_and_description_gb_md5","z_product_name_fr","z_description_fr","z_product_name_and_description_fr_md5","z_product_name_de","z_description_de","z_product_name_and_description_de_md5","z_product_name_it","z_description_it","z_product_name_and_description_it_md5","z_product_name_es","z_description_es","z_product_name_and_description_es_md5","feed_ID","feed_product_ID","feed_merchant_product_id","feed_merchant_category","feed_deep_link","feed_merchant_image_url","feed_search_price","feed_description","feed_product_name","feed_merchant_deep_link","feed_image_url","feed_merchant_name","feed_merchant_id","feed_category_name","feed_category_id","feed_delivery_cost","feed_currency","feed_store_price","feed_display_price","feed_data_feed_id","feed_rrp_price","feed_specifications","feed_condition","feed_promotional_text","feed_warranty","feed_merchant_thumb_url","feed_thumb_url","feed_brand_name","feed_brand_id","feed_delivery_time","feed_valid_from","feed_valid_to","feed_web_offer","feed_pre_order","feed_in_stock","feed_stock_quantity","feed_is_for_sale","feed_product_type","feed_commission_group","feed_upc","feed_ean","feed_mpn","feed_isbn","feed_model_number","feed_parent_product_id","feed_language","feed_last_updated","feed_dimensions","feed_colour","feed_keyword","feed_custom_1","feed_custom_2","feed_custom_3","feed_custom_4","feed_custom_5","feed_saving","feed_delivery_weight","feed_delivery_restrictions","feed_reviews","feed_average_rating","feed_number_stars","feed_number_available","feed_rating","feed_alternate_image","feed_large_image","feed_basket_link"]
    separator => ","
  }
}

output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "products_test"
    workers => 1
  }
}

I've double-checked, and there is a CSV file in:

/home/datafeeds/public_html/myfeeds/datafeeds/completed/

I've chmod 777 the file just to make sure it is readable.

I've tested my config file to make sure it's OK:

$ sudo /opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/products.conf
Configuration OK
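Worth noting: `--configtest` only validates the config's syntax; it doesn't confirm that the `path` glob actually matches any files. A quick way to check the same glob from the shell (a sketch, using the path from the config above):

```shell
# Expand the same glob the file input uses and report whether each match
# is readable by the current user. An empty match means the input has
# nothing to tail.
for f in /home/datafeeds/public_html/myfeeds/datafeeds/completed/*.csv; do
  [ -e "$f" ] || { echo "no matches for glob"; break; }
  [ -r "$f" ] && echo "readable: $f" || echo "NOT readable: $f"
done
```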

Elasticsearch and Logstash are running, but no data is passed to Elasticsearch and no index is created.

I've tried creating the index manually, but no data is populated.
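To see what's happening from the Elasticsearch side, you can query its REST API directly (a sketch, assuming the default HTTP port 9200 and the `products_test` index name from the config):

```shell
# Base URL and target index (taken from the config above).
es="http://localhost:9200"
idx="products_test"

# List all indices with their document counts; the index should show up
# here once Logstash has indexed anything.
curl -s "$es/_cat/indices?v"

# Count documents in the target index (Elasticsearch returns an error
# body if the index doesn't exist yet).
curl -s "$es/$idx/_count?pretty"
```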

I've tried rebooting the server and stopping/restarting Logstash and Elasticsearch several times.

This is the output from starting Logstash in verbose mode:

sudo /opt/logstash/bin/logstash --verbose -f /etc/logstash/conf.d/products.conf
Using version 0.1.x input plugin 'file'. This plugin isn't well supported by the community and likely has no maintainer. {:level=>:info}
Using version 0.1.x codec plugin 'plain'. This plugin isn't well supported by the community and likely has no maintainer. {:level=>:info}
Using version 0.1.x filter plugin 'csv'. This plugin isn't well supported by the community and likely has no maintainer. {:level=>:info}
Using version 0.1.x output plugin 'elasticsearch'. This plugin isn't well supported by the community and likely has no maintainer. {:level=>:info}
Registering file input {:path=>["/home/datafeeds/public_html/shop.z/datafeeds/completed/feed_mydata_data.csv"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"/root/.sincedb_fffe878b6c3eb92e792a8cceb1f1f170", :path=>["/home/datafeeds/public_html/shop.z/datafeeds/completed/feed_mydata_data.csv"], :level=>:info}
Jun 06, 2015 1:35:08 PM org.elasticsearch.node.internal.InternalNode <init>
INFO: [logstash-datafeeds-server.server.com-5042-9782] version[1.5.1],   pid[5042], build[5e38401/2015-04-09T13:41:35Z]
Jun 06, 2015 1:35:08 PM org.elasticsearch.node.internal.InternalNode <init>
INFO: [logstash-datafeeds-server.server.com-5042-9782] initializing ...
Jun 06, 2015 1:35:08 PM org.elasticsearch.plugins.PluginsService <init>
INFO: [logstash-datafeeds-server.server.com-5042-9782] loaded [], sites []
Jun 06, 2015 1:35:10 PM org.elasticsearch.node.internal.InternalNode <init>
INFO: [logstash-datafeeds-server.server.com-5042-9782] initialized
Jun 06, 2015 1:35:10 PM org.elasticsearch.node.internal.InternalNode start
INFO: [logstash-datafeeds-server.server.com-5042-9782] starting ...
Jun 06, 2015 1:35:10 PM org.elasticsearch.transport.TransportService doStart
INFO: [logstash-datafeeds-server.server.com-5042-9782] bound_address  {inet[/0:0:0:0:0:0:0:0:9302]}, publish_address {inet[/<ip>:9302]}
Jun 06, 2015 1:35:10 PM org.elasticsearch.discovery.DiscoveryService doStart
INFO: [logstash-datafeeds-server.server.com-5042-9782]  elasticsearch/dvhITquLQBef1eA0pJVuFw
Jun 06, 2015 1:35:13 PM org.elasticsearch.cluster.service.InternalClusterService$UpdateTask run
INFO: [logstash-datafeeds-server.server.com-5042-9782] detected_master [Devil-Slayer][GeNlEqTkQu-UlSOzS7mStw][datafeeds-server.server.com]  [inet[/<ip>:9301]], added {[Devil-Slayer][GeNlEqTkQu-UlSOzS7mStw] [datafeeds-server.server.com][inet[/<ip>:9301]],[logstash-datafeeds-  server.server.com-4935-9782][LtNpHWk6SMGJLDRCXGQKFg][datafeeds- server.server.com][inet[/<ip>:9300]]{data=false, client=true},},  reason: zen-disco-receive(from master [[Devil-Slayer][GeNlEqTkQu-UlSOzS7mStw] [datafeeds-server.server.com][inet[/<ip>:9301]]])
Jun 06, 2015 1:35:13 PM org.elasticsearch.node.internal.InternalNode start
 INFO: [logstash-datafeeds-server.server.com-5042-9782] started
 Automatic template management enabled {:manage_template=>"true", :level=>:info}
 Using mapping template {:template=>{"template"=>"logstash-*", "settings"=> {"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=> {"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=># <Java::JavaUtil::ArrayList:0x6fe472ee>, "properties"=>{"@version"=> {"type"=>"string", "index"=>"not_analyzed"}, "geoip"=>{"type"=>"object",  "dynamic"=>true, "properties"=>{"location"=>{"type"=>"geo_point"}}}}}}},  :level=>:info}
New Elasticsearch output {:cluster=>nil, :host=>["127.0.0.1"], :port=>"9300-9305", :embedded=>false, :protocol=>"node", :level=>:info}
Pipeline started {:level=>:info}
Logstash startup completed

How do I get Logstash to populate Elasticsearch?

Am I missing something?

When Logstash newly discovers files, it starts reading at the end of them. You need to add start_position if you want it to start at the beginning:

file {
  path => "/home/datafeeds/public_html/myfeeds/datafeeds/completed/*.csv"
  start_position => "beginning"
}
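One related caveat: the verbose log above shows a sincedb being generated under /root/.sincedb_…, which is where the file input records how far it has read each file. A file Logstash has already seen won't be re-read even after you add start_position, so while testing you can either delete that sincedb file or point the input at a throwaway path. A sketch of the latter (the /dev/null trick is for testing only, not production):

```
file {
  path => "/home/datafeeds/public_html/myfeeds/datafeeds/completed/*.csv"
  start_position => "beginning"
  # Discard read positions so the files are re-read on every restart.
  sincedb_path => "/dev/null"
}
```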
