
Efficient logistic regression with L1 regularization in MATLAB

I am looking for an efficient logistic regression implementation in MATLAB. I have been using lassoglm, but with 10000 examples, 1000 features, regularization parameters ranging from 0.005 to 1, and two-fold cross-validation, it is very slow; even the first fit at lambda = 0.05 takes a long time.
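
For reference, the lassoglm setup described above presumably looks something like the sketch below; the variable names and the exact lambda grid are assumptions, not part of the question.

    % Assumed data: X is 10000-by-1000, y is a 0/1 response vector.
    lambdas = logspace(log10(0.005), 0, 20);    % assumed grid from 0.005 to 1
    [B, FitInfo] = lassoglm(X, y, 'binomial', ...
                            'Lambda', lambdas, ...   % L1 penalty values
                            'CV', 2);                % two-fold cross-validation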

Is there any better method?

You might want to check out LIBLINEAR. It is a free, state-of-the-art library for large-scale linear learning, and it has a MATLAB interface.

LIBLINEAR offers several linear solvers, selected with the -s option (a usage sketch follows the list):

 for multi-class classification
     0 -- L2-regularized logistic regression (primal)
     1 -- L2-regularized L2-loss support vector classification (dual)
     2 -- L2-regularized L2-loss support vector classification (primal)
     3 -- L2-regularized L1-loss support vector classification (dual)
     4 -- support vector classification by Crammer and Singer
     5 -- L1-regularized L2-loss support vector classification
     6 -- L1-regularized logistic regression
     7 -- L2-regularized logistic regression (dual)
   for regression
    11 -- L2-regularized L2-loss support vector regression (primal)
    12 -- L2-regularized L2-loss support vector regression (dual)
    13 -- L2-regularized L1-loss support vector regression (dual)
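
Solver 6 (L1-regularized logistic regression) matches the question. A minimal sketch of the MATLAB interface (train / predict) is shown below; the variable names, the -c value, and the held-out Xtest/ytest are assumptions. Note that LIBLINEAR's cost C acts as an inverse regularization strength, so it does not correspond numerically to lassoglm's lambda.

    % Assumed data: X is 10000-by-1000 double, y holds +1/-1 labels.
    Xs = sparse(X);                      % train() requires a sparse instance matrix

    % -s 6: L1-regularized logistic regression, -c: cost (inverse regularization strength)
    model = train(y, Xs, '-s 6 -c 1');

    % -v 2: two-fold cross-validation; returns accuracy instead of a model
    acc = train(y, Xs, '-s 6 -c 1 -v 2');

    % Predictions on held-out data
    [yhat, accuracy, ~] = predict(ytest, sparse(Xtest), model);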
