I think this is a Java interop question, but perhaps it's a nuance of Spark (or a failure to import something from Scala?). I am trying to follow along with the example here. It includes the following code:
import org.apache.spark.mllib.linalg.Vector;
import org.apache.spark.mllib.linalg.Vectors;
Vector dv = Vectors.dense(1.0, 0.0, 3.0);
My first thought was this:
(import '[org.apache.spark.mllib.linalg Vector Vectors])
(Vectors/dense 1.0 0.0 3.0)
But I get:
CompilerException java.lang.IllegalArgumentException: No matching method: dense, compiling:(/tmp/form-init6598386874684927469.clj:1:1)
I tried:
(.dense (Vectors. 1.0 0.0 3.0))
and got:
CompilerException java.lang.IllegalArgumentException: No matching ctor found for class org.apache.spark.mllib.linalg.Vectors, compiling:(/tmp/form-init6598386874684927469.clj:1:9)
I've tried a variety of additional combinations (vectors, Java arrays, the ..
macro, etc.), but can't get any of them to work. In particular, even though the documentation shows a constructor that takes no arguments (see here), nothing like (Vectors.)
works either; it also gives a No matching ctor error. Thanks.
Did you try (Vectors/dense (double-array [1.0 0.0 3.0]))
? I didn't test this myself, but it should match the Vector dense(double[] values)
method in Vectors . Clojure won't coerce three boxed doubles into a primitive double[] on its own, so you have to pass the array explicitly to hit that overload.
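For completeness, a minimal REPL sketch of that suggestion (assuming spark-mllib is on the classpath; untested here, since it needs the Spark dependency):

```clojure
(import '[org.apache.spark.mllib.linalg Vector Vectors])

;; Target the dense(double[] values) overload explicitly by passing a
;; primitive double array instead of three boxed arguments:
(def dv (Vectors/dense (double-array [1.0 0.0 3.0])))

;; The result should behave like the Java example's dv; for instance,
;; Vector has a toArray() method returning the underlying double[]:
(.toArray dv)
```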