I'm using Logstash as a pipeline between my Postgres database and Elasticsearch. I have two tables in my Postgres database (table user and table project). If I ingest each table on its own in separate Logstash files, it works fine, but I want to use only one Logstash file that creates two indices in Elasticsearch and loads each table's data into its own index.
My Logstash file that indexes the project table into Elasticsearch (this works fine and creates the index):
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/postgres"
    jdbc_user => "postgres"
    jdbc_password => "firas"
    jdbc_driver_library => "C:\logstash-7.5.2\drivers\postgresql-42.2.10.jre6.jar"
    jdbc_validate_connection => true
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * FROM public.project"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "project"
  }
  stdout {}
}
Use tags in your inputs and conditionals in your output to route events by those tags.
Your pipeline should look something like this:
input {
  jdbc {
    *** connection configuration for the user table ***
    tags => ["user"]
  }
  jdbc {
    *** connection configuration for the project table ***
    tags => ["project"]
  }
}
output {
  if "user" in [tags] {
    elasticsearch {
      hosts => "localhost:9200"
      index => "user"
    }
  }
  if "project" in [tags] {
    elasticsearch {
      hosts => "localhost:9200"
      index => "project"
    }
  }
}
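Applied to the config from the question, the merged pipeline could look like the sketch below. It reuses the question's connection settings for both inputs and assumes a public."user" table exists; adjust credentials, driver path, and SQL to your schema:

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/postgres"
    jdbc_user => "postgres"
    jdbc_password => "firas"
    jdbc_driver_library => "C:\logstash-7.5.2\drivers\postgresql-42.2.10.jre6.jar"
    jdbc_validate_connection => true
    jdbc_driver_class => "org.postgresql.Driver"
    # "user" is a reserved word in Postgres, hence the quoted identifier
    statement => 'SELECT * FROM public."user"'
    tags => ["user"]
  }
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/postgres"
    jdbc_user => "postgres"
    jdbc_password => "firas"
    jdbc_driver_library => "C:\logstash-7.5.2\drivers\postgresql-42.2.10.jre6.jar"
    jdbc_validate_connection => true
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * FROM public.project"
    tags => ["project"]
  }
}
output {
  # Each event carries the tag set by its input, so the conditionals
  # route user rows and project rows to their own indices.
  if "user" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "user"
    }
  }
  if "project" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "project"
    }
  }
  stdout {}
}
```

Run it with a single pipeline (e.g. `bin/logstash -f merged.conf`); both jdbc inputs execute in the same process and the output conditionals keep the two tables' documents separate.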