Google Colaboratory: OSError: [Errno 5] Input/output error
I am using Google Colaboratory and mounting Google Drive. When I access a csv file, it gives me the following error:
OSError: [Errno 5] Input/output error.
This did not happen before.
How can I access the csv file as I used to?
I have tried this, but it did not work:
Input/output error while using google colab with google drive
This happened after running the following code.
for segment_id in tqdm(range(segment_num)):
    with h5py.File(os.path.join(INPUT_PATH, "train.h5"), "r") as f:
        train_answers.append(f['time_to_failure'][segment_id * segment_interval + SEGMENT_LENGTH])
The tqdm bar progressed to 37%, then gave the following error.
OSError: Unable to open file (file read failed: time = Thu May 2 14:14:09 2019 , filename = './drive/My Drive/Kaggle/LANL-Earthquake-Prediction/input/train.h5', file descriptor = 74, errno = 5, error message = 'Input/output error', buf = 0x7ffc31926d00, total read size = 8, bytes this sub-read = 8, bytes actually read = 18446744073709551615, offset = 0)
Since then, large files on Google Drive, such as train.csv (9 GB), cannot be read from Google Colaboratory. It gives the following error.
OSError: [Errno 5] Input/output error
Does anyone have the same problem?
Does anyone know how to solve this?
There are quotas set by Google which are not necessarily shown while using Colab. I have run into the same problem. Basically, once the limit is passed, you get the [Errno 5] Input/output error regardless of the file or the operation you were doing.
The problem seems to have been solved since I asked to increase the storage quota (limited to 1 TB total per week). You can access the quota page by visiting this page and clicking on quota: https://cloud.google.com/docs/quota
If you don't ask to increase the quota, you might have to wait 7-14 days until your usage is reset to 0 and you can use the full quota again.
I hope this helps!
I've encountered the same error (during too-intensive testing of transfer learning). According to Google, the reason may be too many I/O operations with small files, or shared and more intensively used resources; every reason relates to usage of Google Drive. Usually the quota should be refreshed after 1 day.
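Along those lines, the loop in the question re-opens train.h5 on every iteration, which multiplies Drive I/O operations; opening the file once cuts that down considerably. A minimal sketch of the same read pattern with a single open, written against a small throwaway HDF5 file since the real train.h5 and the names INPUT_PATH, segment_num, segment_interval, and SEGMENT_LENGTH from the post aren't available here:

```python
import os
import tempfile

import h5py  # third-party: pip install h5py
import numpy as np

# Build a tiny stand-in for train.h5 so the sketch is self-contained.
INPUT_PATH = tempfile.mkdtemp()
with h5py.File(os.path.join(INPUT_PATH, "train.h5"), "w") as f:
    f.create_dataset("time_to_failure", data=np.arange(100, dtype=np.float64))

# Dummy values standing in for the question's loop parameters.
segment_num = 5
segment_interval = 10
SEGMENT_LENGTH = 3

train_answers = []
# Open the file once instead of re-opening it on every loop iteration:
# each h5py.File() call on a Drive-mounted path triggers fresh Drive I/O.
with h5py.File(os.path.join(INPUT_PATH, "train.h5"), "r") as f:
    tt = f["time_to_failure"]
    for segment_id in range(segment_num):
        train_answers.append(float(tt[segment_id * segment_interval + SEGMENT_LENGTH]))

print(train_answers)  # -> [3.0, 13.0, 23.0, 33.0, 43.0]
```

This doesn't lift the quota, but it reduces how quickly you hit it.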
You may also try another solution (for impatient users like me): copy your resources (in my case a zipped folder data containing folders train and validation with images) as a zip file to your Google Drive, and then unzip it directly into the Colab VM by using:
!unzip -qq '/content/grive/My Drive/CNN/Datafiles/data.zip'
You can then access the data from the folder /content/data/... (and say goodbye to the I/O error ;) )
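If you prefer to stay in Python, the same unzip step can be done with the standard-library zipfile module. The paths below are throwaway stand-ins so the sketch runs anywhere; on Colab, the source would be your zip's Drive path and the target would be /content:

```python
import os
import tempfile
import zipfile

# Stand-in paths; on Colab these would be the Drive zip and '/content'.
workdir = tempfile.mkdtemp()
zip_path = os.path.join(workdir, "data.zip")
target = os.path.join(workdir, "vm_disk")

# Create a tiny data.zip mirroring the data/train + data/validation layout,
# so the example is self-contained.
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("data/train/img_0.txt", "fake image bytes")
    zf.writestr("data/validation/img_1.txt", "fake image bytes")

# Equivalent of `!unzip -qq data.zip` in pure Python.
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall(target)

print(sorted(os.listdir(os.path.join(target, "data"))))  # -> ['train', 'validation']
```

After extraction, all reads hit the VM's local disk instead of the Drive mount.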