We need to transfer CSV files from GCP Cloud Storage to an FTP server. How can we achieve this?
Can we transfer these files directly to the FTP server with BigQuery tools?
Or do we need to download them with a service and then upload them to the FTP server?
EDIT: Ah... seems I missed your Java tag. Well, if you're okay with spawning sub-processes, then the following solution should work. Otherwise, please consult the supported Java API for GCP; as of today it's this one.
If you have SSH access to your FTP server, then simply use the Google Cloud SDK:
gsutil cp gs://[BUCKET_NAME]/[OBJECT_NAME] [SAVE_TO_LOCAL_FTP_DIR]
Otherwise, the following command pipes data from GCP -> your local machine -> FTP:
gsutil cp gs://[BUCKET_NAME]/[OBJECT_NAME] - | curl -T - ftp://user:PASS@ftp-server.com/filename.ext
The first - after [OBJECT_NAME] tells gsutil to write to STDOUT instead of an ordinary file, while the second - tells curl -T to read from STDIN. For the uninitiated, the | is a bash pipe, which passes the STDOUT of the first program to the STDIN of the next.
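Since the question is tagged Java, here is a minimal sketch of spawning that same pipeline from Java via ProcessBuilder. The class name, the helper method, and the bucket/object/FTP values are placeholders of my own; it assumes gsutil and curl are installed and on the PATH, and that bash is available:

```java
import java.io.IOException;

public class GcsToFtpPipe {

    // Build the shell pipeline that streams a GCS object to an FTP server.
    // Bucket, object, and FTP URL are caller-supplied placeholders.
    static String buildPipeline(String bucket, String object, String ftpUrl) {
        return "gsutil cp gs://" + bucket + "/" + object + " - | curl -T - " + ftpUrl;
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        String cmd = buildPipeline("my-bucket", "report.csv",
                "ftp://user:PASS@ftp-server.com/report.csv");
        // Run the whole pipeline in one shell, exactly as on the command line;
        // inheritIO() forwards gsutil/curl progress output to this process.
        Process p = new ProcessBuilder("bash", "-c", cmd)
                .inheritIO()
                .start();
        int exit = p.waitFor();
        if (exit != 0) {
            throw new RuntimeException("transfer failed with exit code " + exit);
        }
    }
}
```

Because ProcessBuilder is given "bash", "-c" and the command as a single string, the pipe is interpreted by the shell rather than by Java, so no data ever touches the local disk.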