
Time complexity of scipy.linalg.solve (LAPACK gesv) on large matrix?

If I use scipy.linalg.solve (which I believe calls LAPACK's gesv function) on a problem with ~12000 unknowns (a ~12000×12000 dense, non-symmetric matrix) on my workstation, I get a good answer in 10-15 minutes.

Just to probe the limits of what's possible (note I don't say "useful"), I doubled the resolution of my underlying problem, which leads to needing to solve for ~50000 unknowns. While this would technically run on my workstation once I'd added some tens of gigabytes more swap, it seemed more prudent to use hardware with adequate RAM, so I kicked it off on an AWS EC2 High-Memory Quadruple Extra Large... where it has been grinding away for the last 14 hours (hey, spot instances are cheap) and it's impossible to tell how far through it is.

Unfortunately, I've no idea what the time complexity of the solvers involved is (my google-fu failed me on this one). If it's O(N^2) then I'd have expected it to be done after around 4 hours; if it's O(N^3) then maybe it'll be done in 16 hours. That's interpreting N as the number of unknowns, which has roughly quadrupled; the number of elements in the matrix has increased by a factor of 16!
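One way to estimate the exponent empirically is to time the same solver at a few smaller sizes and fit a line in log-log space. A rough sketch (using numpy.linalg.solve, which also dispatches to LAPACK's gesv; the sizes chosen here are illustrative, and at small N the estimate will be pulled below 3 by fixed overheads):

```python
import time
import numpy as np

def time_solve(n, reps=1):
    """Time numpy.linalg.solve (LAPACK gesv) on a random n x n dense system."""
    rng = np.random.default_rng(0)
    A = rng.standard_normal((n, n))
    b = rng.standard_normal(n)
    t0 = time.perf_counter()
    for _ in range(reps):
        x = np.linalg.solve(A, b)
    elapsed = (time.perf_counter() - t0) / reps
    # Sanity check that the solve actually produced a solution.
    assert np.allclose(A @ x, b)
    return elapsed

sizes = [500, 1000, 2000]
times = [time_solve(n) for n in sizes]

# Slope of log(time) vs log(n) estimates the exponent k in O(n^k).
# Overhead dominates at small n, so expect something a bit below 3.
k, _ = np.polyfit(np.log(sizes), np.log(times), 1)
print(f"estimated exponent: {k:.2f}")

# Crude extrapolation from the largest measured size to the target size.
target = 50000
predicted = times[-1] * (target / sizes[-1]) ** k
print(f"predicted time for n={target}: {predicted:.0f} s")
```

The extrapolation is only a lower bound of sorts: at 50000 unknowns the working set no longer fits in cache (or possibly RAM), so the real run can be slower than the fitted curve predicts.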

Any advice which will help me determine whether this has any chance of completing in my (project's) lifetime would be gratefully received!

Other info:

Sparse matrices aren't of interest here (my matrix is dense, and in any case scipy's sparse matrices don't work with more than 2**31 non-zero elements, even on 64-bit builds).

I'm using Debian/Squeeze's scipy on the workstation, Ubuntu 12.04 on EC2. Both 64-bit obviously.

The time complexity of DGESV for an N×N matrix is O(N^3); the LU factorization dominates, at roughly 2N^3/3 floating-point operations. See Table 3.13 here: http://www.netlib.org/lapack/lug/node71.html
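Plugging in your numbers: with cubic scaling, going from ~12000 to ~50000 unknowns multiplies the runtime by (50000/12000)^3 ≈ 72, so a solve in the 10-15 minute range should land somewhere around 12-18 hours. A back-of-envelope check (the 12.5-minute baseline is just the midpoint of your quoted range):

```python
# Back-of-envelope extrapolation under O(N^3) scaling.
n_small, t_small_min = 12000, 12.5    # 12.5 min = midpoint of "10-15 minutes"
n_large = 50000

ratio = (n_large / n_small) ** 3      # cubic growth factor
t_large_hours = t_small_min * ratio / 60

print(f"growth factor: {ratio:.1f}x")           # ~72x
print(f"predicted: {t_large_hours:.1f} hours")  # ~15 hours
```

So 14 hours in, you are plausibly close to done, assuming memory bandwidth and swap aren't slowing things beyond the pure flop count.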
