
Giving more memory to mex programs in Matlab

My C mex-compiled files are running out of memory (more specifically, my calls to mxMalloc sometimes return NULL). From a quick calculation I estimate that my files ask mxMalloc for about 500 MB of memory in total (maybe as much as 1 GB), and I run them on a 64-bit Linux machine with 16 GB of memory that runs almost nothing but Matlab. Yet, mxMalloc returns NULL quite often.

I guess that there are several reasons why this could happen (memory fragmentation, etc.), but in any case, 500 MB does not seem like that much to ask of a machine with 16 GB of memory.

I presume that my calls to mxMalloc request memory from Matlab's heap, but I am not 100% sure about it (mxMalloc requests are handled by Matlab, after all, and not directly by the OS). So is there a way of increasing my mex file's "heap" size?

Thanks

You can use malloc (not mxMalloc), but then the memory must be managed manually (e.g. you must call free; it's not optional the way mxFree is). You could also try compiling with C++ and using Boost's shared/scoped pointers. Your mexFunction signature will need extern "C" prepended to it.
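
For concreteness, here is a minimal sketch of what that could look like (not the poster's actual code; the buffer size and element type are invented for illustration): a mex source file compiled as C++, with mexFunction declared extern "C", taking its large scratch buffer from the C runtime heap with malloc and releasing it itself.

    // Minimal sketch, assuming a large temporary buffer of doubles is needed.
    // The size "n" is hypothetical.
    #include <cstdlib>   // std::malloc, std::free
    #include "mex.h"

    // mexFunction needs C linkage when the file is compiled by a C++ compiler.
    extern "C" void mexFunction(int nlhs, mxArray *plhs[],
                                int nrhs, const mxArray *prhs[])
    {
        const size_t n = 64 * 1024 * 1024;   // hypothetical element count
        double *scratch = static_cast<double *>(std::malloc(n * sizeof(double)));
        if (scratch == NULL) {
            mexErrMsgIdAndTxt("demo:outOfMemory", "malloc failed");
        }

        // ... fill and use scratch here ...

        // Unlike mxMalloc'd memory, this block is not tracked by MATLAB's
        // memory manager, so it must be freed on every path that leaves
        // the function.
        std::free(scratch);
    }

One caveat with this approach: because MATLAB knows nothing about malloc'd memory, anything that aborts the mex call while the buffer is live (mexErrMsgIdAndTxt, for instance) will leak it, which is exactly why the answer suggests C++ smart pointers for automatic cleanup.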

One reason you may be running out of memory is that your arrays are too long. An mxArray's data must be contiguous, so repeatedly mxMalloc-ing and then mxFree-ing large blocks can fragment your memory space. While a std::vector must be contiguous too, a std::list doesn't have to be. A sketch of this idea follows below.
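
As a rough illustration of that point (the chunk size and data flow are invented, not taken from the question): intermediate results can be accumulated in a std::list of fixed-size std::vector chunks, which never requires one huge contiguous block; only the final output mxArray, allocated once at the end, has to be contiguous.

    // Minimal sketch, assuming results are doubles produced in chunks of
    // unknown total length. The chunk size is hypothetical.
    #include <list>
    #include <vector>
    #include <cstring>   // std::memcpy
    #include "mex.h"

    extern "C" void mexFunction(int nlhs, mxArray *plhs[],
                                int nrhs, const mxArray *prhs[])
    {
        const size_t chunkLen = 1 << 16;            // hypothetical chunk size
        std::list< std::vector<double> > chunks;    // non-contiguous storage

        // In real code each pass would push_back one filled chunk; two dummy
        // chunks stand in for that here.
        chunks.push_back(std::vector<double>(chunkLen, 0.0));
        chunks.push_back(std::vector<double>(chunkLen, 1.0));

        // Compute the total output length.
        size_t total = 0;
        for (std::list< std::vector<double> >::const_iterator it = chunks.begin();
             it != chunks.end(); ++it)
            total += it->size();

        // The output mxArray must still be contiguous, but it is allocated
        // once, at the end, instead of being grown or reallocated repeatedly.
        plhs[0] = mxCreateDoubleMatrix(1, static_cast<mwSize>(total), mxREAL);
        double *out = mxGetPr(plhs[0]);
        for (std::list< std::vector<double> >::const_iterator it = chunks.begin();
             it != chunks.end(); ++it) {
            std::memcpy(out, &(*it)[0], it->size() * sizeof(double));
            out += it->size();
        }
    }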

Something like the rope class from the original (SGI) STL could be a good fit as well.
