
How to export Elasticsearch Index as CSV file to Google Cloud Storage Using Logstash
I am using Elasticsearch, where we create day-wise indices and a huge amount of data is ingested every minute. I want to export a few fields from each day's index to Google Cloud Storage. I was able to achieve this with a JSON output, like below:
input {
    elasticsearch {
        hosts => "localhost:9200"
        index => "test"
        query => '
        {
            "_source": ["field1","field2"],
            "query": {
                "match_all": {}
            }
        }
        '
    }
}
filter {
    mutate {
        rename => {
            "field1" => "test1"
            "field2" => "test2"
        }
    }
}
output {
    google_cloud_storage {
        codec => csv {
            include_headers => true
            columns => [ "test1", "test2" ]
        }
        bucket => "bucketName"
        json_key_file => "creds.json"
        temp_directory => "/tmp"
        log_file_prefix => "logstash_gcs"
        max_file_size_kbytes => 1024
        date_pattern => "%Y-%m-%dT%H:00"
        flush_interval_secs => 600
        gzip => false
        uploader_interval_secs => 600
        include_uuid => true
        include_hostname => true
    }
}
But how can I export it as a CSV file and send it to Google Cloud Storage?
You should be able to change output_format to plain, but this setting is going to be deprecated. You should remove output_format and use the codec setting instead, which supports a csv output format:
google_cloud_storage {
    ...
    codec => csv {
        include_headers => true
        columns => [ "field1", "field2" ]
    }
}
If you want to rename your fields, you can add a filter section and mutate/rename the fields as you see fit. Make sure to also change the columns setting in the csv codec output accordingly:
filter {
    mutate {
        rename => {
            "field1" => "renamed1"
            "field2" => "renamed2"
        }
    }
}
output {
    ...
}
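Putting both parts together, here is a minimal sketch of the complete pipeline, reusing the host, index, bucket, and credentials values from the question (adjust them for your environment). Note that the csv codec ships as a separate plugin (logstash-codec-csv); if your Logstash distribution does not bundle it, it can be installed with bin/logstash-plugin install logstash-codec-csv.
input {
    elasticsearch {
        hosts => "localhost:9200"
        index => "test"
        query => '{ "_source": ["field1","field2"], "query": { "match_all": {} } }'
    }
}
filter {
    mutate {
        rename => {
            "field1" => "renamed1"
            "field2" => "renamed2"
        }
    }
}
output {
    google_cloud_storage {
        bucket => "bucketName"
        json_key_file => "creds.json"
        temp_directory => "/tmp"
        # columns must list the renamed fields, otherwise the CSV cells come out empty
        codec => csv {
            include_headers => true
            columns => [ "renamed1", "renamed2" ]
        }
    }
}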