
Converting a Scala map into Python for Spark

As part of a Random Forest example, here is some context:

// trainClassifier(input, numClasses, categoricalFeaturesInfo, numTrees,
//                 featureSubsetStrategy, impurity, maxDepth, maxBins)
val forest = RandomForest.trainClassifier(
  trainData, 7, Map(10 -> 4, 11 -> 40), 20,
  "auto", "entropy", 30, 300)
  1. I am not sure what Map(10 -> 4, 11 -> 40) means.
  2. What is the Python or PySpark equivalent of this?

Map(10 -> 4, 11 -> 40) is a Scala Map, which is essentially the equivalent of a Python dictionary. As the categoricalFeaturesInfo argument, it tells the trainer that feature 10 is categorical with 4 distinct values and feature 11 is categorical with 40 distinct values.

So

Map(10 -> 4, 11 -> 40)

Becomes

categoricalFeaturesInfo={10: 4, 11: 40}
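For context, here is a minimal sketch of the full PySpark (MLlib) equivalent of the Scala call, assuming trainData is an RDD of LabeledPoint as in the original snippet:

from pyspark.mllib.tree import RandomForest

# Same arguments as the Scala version, with the Map replaced by a dict:
# feature 10 has 4 categories, feature 11 has 40 categories.
forest = RandomForest.trainClassifier(
    trainData, numClasses=7,
    categoricalFeaturesInfo={10: 4, 11: 40},
    numTrees=20, featureSubsetStrategy="auto",
    impurity="entropy", maxDepth=30, maxBins=300)

The keyword names mirror the positional arguments of the Scala API, so the only real translation is Map(10 -> 4, 11 -> 40) becoming {10: 4, 11: 40}.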
