
Spark: Perform linear/logistic regression from the spark-glmnet package

I'm new to Spark, and for the last few weeks I've been learning about the methods implemented in it. This time I want to use the functions implemented in the spark-glmnet package. I am most interested in running logistic regression.

I downloaded the source files and built a fat JAR with:

sbt assembly

When the build finished, I copied the JAR file to a server and started the Spark shell:

export HADOOP_CONF_DIR=/opt/etc-hadoop/;
/opt/spark-1.5.0-bin-hadoop2.4/bin/spark-shell \
--master yarn-client \
--num-executors 5 \
--executor-cores 6 \
--executor-memory 8g \
--jars /opt/spark-glmnet-assembly-1.5.jar,some_other_jars \
--driver-class-path /usr/share/hadoop-2.2.0/share/hadoop/common/lib/mysql-connector-java-5.1.30.jar

But I don't know how to call the functions from this package inside the Spark shell. How can I, for example, perform logistic regression with coordinate descent?

The answer turned out to be very simple:

 sc.addJar("path_to_my_jar")
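Once the assembly JAR is registered, its classes can be imported and used directly in the shell. Below is a minimal sketch; apart from `sc.addJar` and the standard MLlib utilities, everything is an assumption — in particular, the class name `LogisticRegressionWithCD` and its package path are hypothetical, so check the spark-glmnet sources for the actual names and parameters:

```scala
// Ship the fat JAR to the executors (the driver already has it via --jars).
sc.addJar("/opt/spark-glmnet-assembly-1.5.jar")

// Hypothetical import -- take the real package path and class name
// from the spark-glmnet source tree.
import org.apache.spark.mllib.classification.LogisticRegressionWithCD
import org.apache.spark.mllib.util.MLUtils

// Load a LIBSVM-format training set (the path is only an example).
val training = MLUtils.loadLibSVMFile(sc, "hdfs:///data/sample_libsvm_data.txt")

// Fit a logistic regression model via coordinate descent.
// Constructor and run() signature are assumed, not verified.
val model = new LogisticRegressionWithCD().run(training)
```

Note that `--jars` makes the JAR available, but classes are only resolvable in the shell after the JAR is on the classpath; `sc.addJar` additionally distributes it to the executors at runtime.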
