
Large sparse matrix, SVD

I want to calculate an SVD, but I haven't found a good Java library for this. At the moment I keep the data in a HashMap, because the matrix doesn't fit into memory: its size is about 400,000 x 10,000 and most of the entries are 0. I tried MTJ, JBLAS, Jama and others, but most of them either don't support sparse matrices or are too slow. I need the calculation to finish in at most 2-3 minutes. Can somebody recommend something? I have also read about irlba in R; is it possible to send my data from Java to R, run the calculation there, and send the result back to my Java program?
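For reference, the kind of Java-to-R round trip I have in mind would look roughly like the sketch below. It assumes an Rserve daemon is running locally and that the Matrix and irlba packages are installed; the triplet data and all R-side variable names are only placeholders, not working code I already have.

    import org.rosuda.REngine.Rserve.RConnection;

    public class IrlbaBridge {
        public static void main(String[] args) throws Exception {
            RConnection r = new RConnection();   // connects to a local Rserve daemon
            try {
                // Non-zero entries of the 400,000 x 10,000 matrix as (row, col, value)
                // triplets; 1-based indices, because that is what R expects.
                // These three entries are placeholders for the real data.
                double[] rows = {1, 2, 400000};
                double[] cols = {1, 5, 10000};
                double[] vals = {0.5, 1.0, 2.0};

                r.assign("i", rows);
                r.assign("j", cols);
                r.assign("x", vals);
                r.voidEval("library(Matrix); library(irlba)");
                r.voidEval("m <- sparseMatrix(i = i, j = j, x = x, dims = c(400000, 10000))");
                r.voidEval("s <- irlba(m, nv = 50)");        // top 50 singular triplets

                double[] d = r.eval("s$d").asDoubles();      // singular values back in Java
                System.out.println("largest singular value: " + d[0]);
                // s$u and s$v can be fetched the same way with asDoubleMatrix()
            } finally {
                r.close();
            }
        }
    }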

I faced a similar problem while trying to apply Non-negative Matrix Factorization (NNMF) and Probabilistic Latent Semantic Analysis to large sparse term-document matrices. I tried Jblas [1] and Jama [2], but in the end I used Matlab. Because I was writing the whole app in Java, I ended up calling Matlab from Java through the Matlab Compiler Runtime (MCR).

What to do:
Matlab has a feature called Matlab Builder JA. Use this tool to produce a .jar file that contains your Matlab code and can then be called from your Java program. This .jar needs the MCR to work; a sketch of what the calling code looks like is shown below.
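Roughly, using the generated component from Java looks like this. The class name SvdComponent, the method name sparseSvd and its arguments are only placeholders for whatever function you wrap in your own Builder JA project:

    import com.mathworks.toolbox.javabuilder.MWArray;
    import com.mathworks.toolbox.javabuilder.MWException;
    // import yourpackage.SvdComponent;   // the class generated by Matlab Builder JA

    public class MatlabSvdCaller {
        public static void main(String[] args) throws MWException {
            SvdComponent comp = null;
            Object[] result = null;
            try {
                comp = new SvdComponent();               // constructing the component loads the MCR
                double[][] data = new double[1000][100]; // placeholder input; Builder JA converts
                                                         // plain Java arrays to Matlab arrays
                // Generated wrappers take the number of outputs first, then the inputs:
                // here 3 outputs (U, S, V) from a wrapped [U,S,V] = sparseSvd(A, k) function.
                result = comp.sparseSvd(3, data, 50);
                System.out.println(result[1]);           // S, printed via its Matlab-style toString
            } finally {
                if (result != null) MWArray.disposeArray(result);  // release native Matlab memory
                if (comp != null) comp.dispose();
            }
        }
    }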

[1] Jblas was much faster than Jama, but I ran into problems and couldn't get it to run with a 64-bit JDK.
[2] Jama uses double[][], which caused memory problems; for the 400,000 x 10,000 matrix above, a dense double[][] alone needs roughly 400,000 × 10,000 × 8 bytes ≈ 32 GB.
The above may have changed by now; I am referring back to July 2012 and don't have a clear picture of the current state.
