
Loading the Fashion-MNIST dataset with Keras

I copied and pasted the code from TensorFlow's official "Basic classification: Classify images of clothing" tutorial (https://www.tensorflow.org/tutorials/keras/classification):

    import tensorflow as tf
    import numpy as np
    import matplotlib.pyplot as plt
    fashion_mnist = tf.keras.datasets.fashion_mnist
    (train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()

and ran it. It immediately started printing a stream of output that wouldn't stop (almost like when you accidentally put a print inside a while loop):

    Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-images-idx3-ubyte.gz
    
        8192/26421880 [..............................] - ETA: 6s
       98304/26421880 [..............................] - ETA: 14s
      106496/26421880 [..............................] - ETA: 27s
      417792/26421880 [..............................] - ETA: 10s
      425984/26421880 [..............................] - ETA: 13s

so I terminated it. The above is only a very small portion of what was printed. When I ran it again, I got an error straight away:

    line 7, in <module>
        (train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()
      File "C:\Users\david\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\datasets\fashion_mnist.py", line 82, in load_data
        imgpath.read(), np.uint8, offset=16).reshape(len(y_train), 28, 28)
      File "C:\Users\david\AppData\Local\Programs\Python\Python38\lib\gzip.py", line 292, in read
        return self._buffer.read(size)
      File "C:\Users\david\AppData\Local\Programs\Python\Python38\lib\gzip.py", line 498, in read
        raise EOFError("Compressed file ended before the end-of-stream marker was reached")

Following a similar question, I deleted the dataset file that was causing the error and ran the script again. This time I waited out all the progress output, only for it to terminate prematurely after about 30 minutes with this error:

    Traceback (most recent call last):
      File "C:\Users\david\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\utils\data_utils.py", line 275, in get_file
        urlretrieve(origin, fpath, dl_progress)
      File "C:\Users\david\AppData\Local\Programs\Python\Python38\lib\urllib\request.py", line 286, in urlretrieve
        raise ContentTooShortError(
    urllib.error.ContentTooShortError: <urlopen error retrieval incomplete: got only 9191424 out of 26421880 bytes>

During handling of the above exception, another exception occurred:
 
    Traceback (most recent call last):
      File "C:\Users\david\Documents\DAVID\Documents\Education\Computer Science\Extra stuff\Machine learning\Neural networks\ep.2.py", line 7, in <module>
        (train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()
      File "C:\Users\david\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\datasets\fashion_mnist.py", line 75, in load_data
        paths.append(get_file(fname, origin=base + fname, cache_subdir=dirname))
      File "C:\Users\david\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\keras\utils\data_utils.py", line 279, in get_file
        raise Exception(error_msg.format(origin, e.errno, e.reason))
    Exception: URL fetch failure on https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-images-idx3-ubyte.gz: None -- retrieval incomplete: got only 9191424 out of 26421880 bytes

I've deleted that dataset many times, and each time I run the code the same thing happens.

I can't find any forum or Stack Overflow posts about what to do when this happens; any help would be appreciated.

TL;DR: how do I load the Fashion-MNIST dataset when copying and pasting TensorFlow's tutorial code leads to errors?

Look in the %USERPROFILE%\.keras\datasets folder and remove the MNIST-related files and folders, then run the script again so Keras downloads fresh copies.
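That cleanup can be scripted so you don't have to find the folder by hand. A minimal sketch, assuming the default Keras cache location (`~/.keras/datasets/fashion-mnist` is the subfolder `fashion_mnist.load_data()` uses in current TensorFlow releases; the function names here are my own):

```python
import os
import shutil

def fashion_mnist_cache_dir(keras_home=None):
    """Return the directory where Keras caches the Fashion-MNIST archives."""
    base = keras_home or os.path.join(os.path.expanduser("~"), ".keras")
    return os.path.join(base, "datasets", "fashion-mnist")

def clear_fashion_mnist_cache(keras_home=None):
    """Delete any partially downloaded Fashion-MNIST files so the next
    load_data() call starts a fresh download instead of reading a
    truncated archive."""
    cache = fashion_mnist_cache_dir(keras_home)
    if os.path.isdir(cache):
        shutil.rmtree(cache)
    return cache
```

After calling `clear_fashion_mnist_cache()`, re-running `fashion_mnist.load_data()` should trigger a clean download rather than hitting the `EOFError` from the corrupted file.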

Someone with more knowledge might have an actual answer or reason for this, but after a long discussion with Ismail we concluded there was simply something wrong with my PC or Python setup. The solution for me was to use Google Colab, or any Python shell other than IDLE.
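If the connection keeps truncating the download (the `ContentTooShortError` above means only 9191424 of 26421880 bytes arrived), another workaround is to fetch the four archives yourself into the Keras cache and verify each one fully decompresses before `load_data()` reads it. A hedged sketch: the base URL and `train-images-idx3-ubyte.gz` name come from the tracebacks above, the other three names follow the standard MNIST naming and should be treated as assumptions, and `verify_gz`/`fetch_all` are my own helpers:

```python
import gzip
import os
import urllib.request

BASE = "https://storage.googleapis.com/tensorflow/tf-keras-datasets/"
FILES = [
    "train-labels-idx1-ubyte.gz",   # assumed name, standard MNIST layout
    "train-images-idx3-ubyte.gz",   # name seen in the traceback above
    "t10k-labels-idx1-ubyte.gz",    # assumed name
    "t10k-images-idx3-ubyte.gz",    # assumed name
]

def verify_gz(path):
    """Return True only if the gzip stream decompresses all the way to its
    end-of-stream marker, i.e. the download was not cut short."""
    try:
        with gzip.open(path, "rb") as f:
            while f.read(1 << 20):  # read in 1 MiB chunks, discard
                pass
        return True
    except (EOFError, OSError):
        return False

def fetch_all(dest_dir):
    """Download each archive into dest_dir, skipping ones already intact."""
    os.makedirs(dest_dir, exist_ok=True)
    for name in FILES:
        path = os.path.join(dest_dir, name)
        if os.path.exists(path) and verify_gz(path):
            continue  # already complete
        urllib.request.urlretrieve(BASE + name, path)
        if not verify_gz(path):
            raise IOError(name + " is still truncated; try again")
```

Pointing `fetch_all` at the `~/.keras/datasets/fashion-mnist` folder and retrying until every file verifies should leave a cache that `load_data()` accepts without re-downloading.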

The technical post webpages of this site follow the CC BY-SA 4.0 protocol.

 
粤ICP备18138465号  © 2020-2024 STACKOOM.COM