
Incremental indexing using Logstash from SQL Server to Elasticsearch

Hi all, I'm getting the below error while running my Logstash pipeline to index records from a SQL Server database into Elasticsearch.

Java version: 13
Logstash: 7.2.0
Elasticsearch: 7.2.0
JDBC driver: mssql-jdbc-7.4.1.jre12-shaded

My config file:

input {
  jdbc {
    jdbc_driver_library => "C:\Users\Mohan\Downloads\MsoftDriver\sqljdbc_7.4\enu\mssql-jdbc-7.4.1.jre12-shaded.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://DESKTOP-VHDBS6E\SQLEXPRESS:1433;databaseName=mohan;"
    jdbc_user => "mohan"
    jdbc_password => "mohan"
    statement => "SELECT * FROM [User_Name]"
    start_position => "beginning"
  }
}

filter {
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "Employee"
  }
  stdout { codec => rubydebug }
}

I'm getting the below error:

 Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["C:/Users/Mohan/Desktop/EsearchKibana/logstash-7.2.0/logstash-core/lib/logstash/config/mixin.rb:87:in `config_init'", "C:/Users/Mohan/Desktop/EsearchKibana/logstash-7.2.0/logstash-core/lib/logstash/inputs/base.rb:60:in `initialize'", "org/logstash/plugins/PluginFactoryExt.java:258:in `plugin'", "org/logstash/plugins/PluginFactoryExt.java:120:in `buildInput'", "org/logstash/execution/JavaBasePipelineExt.java:50:in `initialize'", "C:/Users/Mohan/Desktop/EsearchKibana/logstash-7.2.0/logstash-core/lib/logstash/java_pipeline.rb:24:in `initialize'", "C:/Users/Mohan/Desktop/EsearchKibana/logstash-7.2.0/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "C:/Users/Mohan/Desktop/EsearchKibana/logstash-7.2.0/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}

Could anyone please help me to resolve this issue?

The error is because of start_position => "beginning". Please comment it out and try once.

Well, there is no such setting as "start_position" for the jdbc input plugin. Logstash can't recognize this setting and thus fails. Take a look at the docs - you won't find it there.

Also, be aware that there is a bug in the jdbc input plugin with Java version 9 onwards (this would be the next issue with your configuration, I guess). So please refer to this GitHub issue and this workaround when Logstash fails because it can't load the JDBC driver.
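Putting both points together, a sketch of the corrected input block (connection details copied from the question; the only assumption is that the driver jar has been copied into Logstash's bundled jars folder, e.g. <logstash home>\logstash-core\lib\jars\, as the workaround suggests):

input {
  jdbc {
    # Driver jar placed in <logstash home>\logstash-core\lib\jars\ (workaround for the
    # Java 9+ driver-loading bug), so no external library path is given here:
    jdbc_driver_library => ""
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://DESKTOP-VHDBS6E\SQLEXPRESS:1433;databaseName=mohan;"
    jdbc_user => "mohan"
    jdbc_password => "mohan"
    statement => "SELECT * FROM [User_Name]"
    # start_position => "beginning"   # not a jdbc input option; removed
  }
}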

Hi all, the above issue is resolved for me; I believe it was due to the wrong JDBC driver. I downloaded the JDBC client directly from the link below instead of from Microsoft and placed the file in

\logstash-7.2.0\logstash-core\lib\jars

https://www.elastic.co/downloads/past-releases/jdbc-client-7-2-0

Note: my Logstash version is 7.2.0.

Logstash config that worked for the incremental indexing approach:

input {
  jdbc {
    jdbc_driver_library => ""
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://DESKTOP-VHDBS6E\SQLEXPRESS:1433;databaseName=xxxxx;"
    jdbc_user => "xxxxx"
    jdbc_password => "xxxx"
    jdbc_paging_enabled => true
    tracking_column => "modified_date"
    use_column_value => true
    clean_run => true
    tracking_column_type => "timestamp"
    schedule => "*/2 * * * * *"
    statement => "SELECT * FROM [User_Name] WHERE modified_date > :sql_last_value"
  }
}

filter {
  mutate {
    remove_field => ["@version", "@timestamp"]
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "employee"
    document_type => "_doc"
    document_id => "%{id}"
  }
  stdout { codec => rubydebug }
}
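For anyone adapting this: the jdbc input substitutes the last value it saw in the tracking column for :sql_last_value and persists it to a metadata file between scheduled runs, which is what makes the indexing incremental. A minimal sketch of just the tracking-related settings (the metadata path below is illustrative; note also that clean_run => true, as used above, resets the stored value each time the pipeline starts):

jdbc {
  # ... driver and connection settings as above ...
  use_column_value => true
  tracking_column => "modified_date"        # column compared against :sql_last_value
  tracking_column_type => "timestamp"
  last_run_metadata_path => "C:/logstash/.jdbc_last_run"   # illustrative; defaults to $HOME/.logstash_jdbc_last_run
  clean_run => false                        # keep the stored value across restarts
  schedule => "*/2 * * * * *"
  statement => "SELECT * FROM [User_Name] WHERE modified_date > :sql_last_value"
}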

I hope this helps those who are trying to do incremental indexing into Elasticsearch using Logstash.
