
Is there a way to access local files in Google Colab without using the upload() option or uploading the data to Drive and then accessing it?

I have data on my local drive spread over a lot of files. I want to access that data from Google Colab. Since it is spread across many files and changes constantly, I don't want to use the upload() option, as that would get tedious and slow. Uploading to Drive is also something I am trying to avoid, because of the changing data values. So I was wondering whether there is another way to access the local data, something similar to the code below.

import os

def list_files(dir):
    # Walk the directory tree and collect the full path of every file.
    r = []
    for root, dirs, files in os.walk(dir):
        for name in files:  # iterate over files, not dirs, to match the function name
            r.append(os.path.join(root, name))
    return r

train_path = list_files('/home/path/to/folder/containing/data/')

This does not work, since Google Colab cannot access my local machine, so the function always returns an empty array (0,).
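For reference, this is the upload() flow I am trying to avoid; a minimal sketch using the google.colab.files API, which prompts for files on every run:

from google.colab import files

# Opens a file picker in the browser; returns a dict of {filename: file contents as bytes}.
uploaded = files.upload()
for name, data in uploaded.items():
    print(name, len(data), 'bytes')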

The short answer is: no, you can't. The long answer is: you can skip the upload phase each time you restart the runtime. You just need to use the google.colab package to get behaviour similar to your local environment. Upload all the files you need to your Google Drive once, then mount it:

from google.colab import drive
drive.mount('/content/gdrive')

After the authentication step, you will be able to access all the files stored in your Google Drive. They will appear exactly as you uploaded them, so you just have to modify the last line like this:

train_path = list_files('gdrive/path/to/folder/containing/data/')

or in this way:

train_path = list_files('/content/gdrive/home/path/to/folder/containing/data/')
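Putting it all together, here is a minimal end-to-end sketch. Note that the Drive root typically appears under "My Drive" inside the mount point, and the data/ folder name below is a placeholder for wherever your files actually live:

import os
from google.colab import drive

# Mount Google Drive at /content/gdrive; the first run opens an
# authentication prompt in the browser.
drive.mount('/content/gdrive')

def list_files(dir):
    # Recursively collect the full path of every file under dir.
    r = []
    for root, dirs, files in os.walk(dir):
        for name in files:
            r.append(os.path.join(root, name))
    return r

# 'data' is a placeholder folder name; the Drive root is mounted
# under 'My Drive' inside the mount point.
train_path = list_files('/content/gdrive/My Drive/data/')
print(len(train_path), 'files found')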
