Raspberry Pi 4 - 8 GB RAM, 64 GB SD Card Running Out of Memory Trying to Load TensorFlow Model

I have a problem I'm not sure how to tackle.

I am currently trying to run a program in JupyterLab on my Raspberry Pi 4, but when I try to load a TensorFlow model, I get the following warning in the terminal:

Allocation of 360087552 exceeds 10% of free system memory

Now, this is confounding to me. The model it's trying to load is only about 900 MB. The Raspberry Pi model I have has 8 GB of RAM, the same as my laptop. It's using a 64 GB SD card with 42.8 GB of free space (more than my laptop). Yet, despite having the same amount of RAM and more free space than my laptop (which runs everything without issue), the Pi is unable to load the model, and the kernel crashes.
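The warning itself only means that a single allocation is large compared to the memory TensorFlow considers free at that moment. As a quick sanity check of the numbers, here is a minimal sketch (not part of the original post; the helper function is made up for illustration) that reads MemAvailable from /proc/meminfo and compares it against the reported allocation:

```python
# Minimal sketch (Linux only): compare the allocation reported in the warning
# against the memory currently reported as available by the kernel.
ALLOCATION_BYTES = 360_087_552  # value taken from the warning message

def mem_available_bytes() -> int:
    """Read MemAvailable from /proc/meminfo and return it in bytes."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemAvailable:"):
                return int(line.split()[1]) * 1024  # the value is in kB
    raise RuntimeError("MemAvailable not found in /proc/meminfo")

avail = mem_available_bytes()
print(f"Available memory : {avail / 2**20:.0f} MiB")
print(f"Allocation       : {ALLOCATION_BYTES / 2**20:.0f} MiB")
print(f"Exceeds 10% free : {ALLOCATION_BYTES > 0.10 * avail}")
```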

I've already done everything I could think of to free up memory, including expanding the filesystem and increasing the memory split to 256 in raspi-config, and increasing CONF_SWAPSIZE to 1024.
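In case it helps, here is a minimal sketch (assumptions on my side: Raspberry Pi OS with the vcgencmd tool on the PATH; the meminfo_kib helper is made up for illustration) to confirm from Python that the swap and GPU memory-split changes actually took effect:

```python
# Minimal sketch: report total RAM, total swap, and the GPU memory split.
import subprocess

def meminfo_kib(field: str) -> int:
    """Return a field such as 'SwapTotal' or 'MemTotal' from /proc/meminfo, in KiB."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith(field + ":"):
                return int(line.split()[1])
    raise KeyError(field)

print(f"MemTotal : {meminfo_kib('MemTotal') / 1024:.0f} MiB")
print(f"SwapTotal: {meminfo_kib('SwapTotal') / 1024:.0f} MiB")  # should reflect CONF_SWAPSIZE=1024

# GPU memory split as reported by the firmware (Raspberry Pi specific)
print(subprocess.run(["vcgencmd", "get_mem", "gpu"],
                     capture_output=True, text=True).stdout.strip())
```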

So, is there anything further I can do to resolve this issue, or is this simply a limitation of the Raspberry Pi 4, and should I look into alternative, more powerful single-board computers?

Thanks for the help,

Sam

My suspicion is that you're using the 32-bit OS with PAE, which only allows about 3 GB per process. The allocation involved would exceed that. Did you try the 64-bit OS?
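One way to check this from inside the same Jupyter kernel (a minimal sketch, not part of the original answer) is to look at the reported machine type and the interpreter's pointer size:

```python
# Minimal sketch: report whether the kernel and the Python interpreter are
# 32-bit or 64-bit. A 32-bit userspace limits each process to roughly 3 GB
# of usable address space, even on an 8 GB board.
import platform
import sys

print("Machine      :", platform.machine())   # e.g. 'armv7l' (32-bit) or 'aarch64' (64-bit)
print("Kernel       :", platform.release())
print("Pointer size :", "64-bit" if sys.maxsize > 2**32 else "32-bit")
```

Note that a 64-bit kernel can still run a 32-bit userspace, which is why checking the interpreter's pointer size in addition to the machine string is useful.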
