
GPU out of memory when initializing network

I am trying to initialize a CNN and then move it onto my GPU for training. When I move it to the GPU I get the error "CUDA error: out of memory". I have run similar networks without this problem. The network is the only thing on the GPU so far, as I have not loaded any images yet. Any ideas as to what is going wrong?

I am using PyTorch 0.4.1 on a GTX 1070 Ti (8 GB).

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 410.104      Driver Version: 410.104      CUDA Version: 10.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 107...  Off  | 00000000:01:00.0  On |                  N/A |
|  0%   43C    P2    39W / 180W |   8024MiB /  8111MiB |      0%      Default |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|=============================================================================|
|    0      1129      G   /usr/lib/xorg/Xorg                            36MiB |
|    0      1164      G   /usr/bin/gnome-shell                          57MiB |
|    0      1415      G   /usr/lib/xorg/Xorg                           200MiB |
|    0      1548      G   /usr/bin/gnome-shell                          90MiB |
|    0      6323      C   /usr/bin/python3                             525MiB |
|    0      9521      C   /usr/bin/python3                            1827MiB |
|    0     18821      C   /usr/bin/python3                            4883MiB |
|    0     27137      G   ...uest-channel-token=16389326112703159917    45MiB |
|    0      29161      C   /usr/bin/python3                             355MiB |
+-----------------------------------------------------------------------------+

I have tried reducing the size of the linear layers with no luck.


import torch

# (device was defined earlier, presumably along these lines)
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

net = piccnn()    # piccnn is the CNN being initialized
net.to(device)    # this call raises: CUDA error: out of memory
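
As a minimal sanity check, here is a small sketch (using only torch.cuda calls that exist in PyTorch 0.4.1) comparing what this process has actually allocated against the card's total capacity; memory held by other processes does not show up here, only in nvidia-smi:

import torch

# Sketch (PyTorch 0.4.1-era calls): memory held by *this* process vs. the
# card's total capacity. Memory used by other processes only appears in nvidia-smi.
print(torch.cuda.get_device_name(0))
total_mib = torch.cuda.get_device_properties(0).total_memory // 2**20
alloc_mib = torch.cuda.memory_allocated(0) // 2**20
cached_mib = torch.cuda.memory_cached(0) // 2**20
print("total %d MiB | allocated %d MiB | cached %d MiB" % (total_mib, alloc_mib, cached_mib))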

This issue happened to me once when the GPU driver was out of date. My GPU was a 1070 with 4 GB. I'd recommend reinstalling the drivers and restarting.
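
Before reinstalling, a quick sketch (standard PyTorch attributes, nothing specific to this setup) can confirm which CUDA/cuDNN build the install was compiled against and whether the driver currently exposes a usable device; the driver version itself is the one nvidia-smi prints in its header:

import torch

# Version check: which CUDA/cuDNN this PyTorch build targets, and whether
# the driver currently exposes a usable GPU to it.
print("PyTorch:", torch.__version__)
print("CUDA (build):", torch.version.cuda)
print("cuDNN:", torch.backends.cudnn.version())
print("CUDA available:", torch.cuda.is_available())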
