
Unable to load images from a Google Cloud Storage bucket in TensorFlow or Keras

I have a bucket on Google Cloud Storage that contains images for training a TensorFlow model. I'm using tensorflow_cloud to load the images stored in the bucket called stereo-train, and the full URL to the directory with the images is:

gs://stereo-train/data_scene_flow/training/dat

But when I use this path in the tf.keras.preprocessing.image_dataset_from_directory function, I get the following error in the Google Cloud Console log:

FileNotFoundError: [Errno 2] No such file or directory: 'gs://stereo-train/data_scene_flow/training/dat'

How can I fix this?

Code:

GCP_BUCKET = "stereo-train"

kitti_dir = os.path.join("gs://", GCP_BUCKET, "data_scene_flow")
kitti_training_dir = os.path.join(kitti_dir, "training", "dat")

ds = tf.keras.preprocessing.image_dataset_from_directory(
    kitti_training_dir,
    image_size=(375, 1242),
    batch_size=batch_size,
    shuffle=False,
    label_mode=None,
)


Even when I use the following, it doesn't work:


filenames = np.sort(np.asarray(os.listdir(kitti_training_dir))).tolist()
# Make a dataset of image tensors by reading and decoding the files.
ds = list(map(lambda x: tf.io.decode_image(tf.io.read_file(os.path.join(kitti_training_dir, x))), filenames))

Even when I use tf.io.read_file instead of the Keras function, I get the same error. How can I fix this?

If you are using Linux or macOS, you can use Cloud Storage FUSE, which lets you mount your bucket locally and use it like any other file system. Follow the installation guide and then mount your bucket somewhere on your system, e.g.:

mkdir /mnt/buckets
gcsfuse stereo-train /mnt/buckets

Then you should be able to use paths under the mount point in your code, and Keras will load the content from the bucket like any local directory.
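For example, a minimal sketch of the original call against the mount point (assuming the mount at /mnt/buckets from above, with 32 as a placeholder batch size):

```python
import os

MOUNT_POINT = "/mnt/buckets"  # where gcsfuse mounted the stereo-train bucket

# Build the same directory path as before, but through the local mount
# instead of a gs:// URL that os.listdir and similar utilities cannot open.
kitti_training_dir = os.path.join(MOUNT_POINT, "data_scene_flow", "training", "dat")
print(kitti_training_dir)  # /mnt/buckets/data_scene_flow/training/dat

# The Keras call itself is unchanged; it now sees an ordinary directory.
# (Needs TensorFlow and the mounted data, so it is shown commented out.)
# ds = tf.keras.preprocessing.image_dataset_from_directory(
#     kitti_training_dir,
#     image_size=(375, 1242),
#     batch_size=32,       # placeholder value
#     shuffle=False,
#     label_mode=None,
# )
```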
