
How to construct a Scala map and pass it as an argument to a Scala constructor from the Python interpreter?

I am working in a Jupyter Notebook with PySpark v2.3.4, which runs on Java 8, Python 3.6 (with py4j==0.10.7), and Scala 2.11, and I have a Scala case class that takes a Map argument, like so:

case class my_class(arg1: Long, arg2: Map[String, String] = Map.empty)

I would like to construct an instance of my_class, but I cannot quite figure out how to build the Map argument. Below are a couple of my attempts and the docs I've followed, where sc is my SparkContext:

  • sc._jvm.scala.collection.JavaConversions.iterableAsScalaIterable([('hello','world')]).toMap()
  • MapConverter().convert({'a':'b'}, sc._gateway._gateway_client)
  • sc._jvm.scala.collection.JavaConverters.*
    • Cannot find any of the methods that exist in its docs

Those are just a couple of the attempts I've made so far. I haven't found good examples of how to do this, so any help would be much appreciated!
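For context on why the second attempt falls short on its own: py4j's MapConverter puts a java.util.HashMap on the JVM side, not a scala.collection.Map, so a Scala constructor expecting a Scala Map still rejects it. A minimal sketch of that step, assuming a live SparkContext named `sc` (the helper name `to_java_map` is my own):

```python
def to_java_map(sc, py_dict):
    """Convert a Python dict into a JVM java.util.HashMap via py4j.

    Note: the result is a *Java* map, not a scala.collection.Map, so it
    still needs bridging before a Scala API will accept it.
    """
    # MapConverter ships with py4j (bundled with PySpark); imported lazily
    # so this sketch can be defined outside a Spark environment.
    from py4j.java_collections import MapConverter
    return MapConverter().convert(py_dict, sc._gateway._gateway_client)

# jmap = to_java_map(sc, {'a': 'b'})
# jmap.getClass().getName() would report 'java.util.HashMap'
```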

I dug through PySpark's source code and found the following, which seems to work for now; leaving it here for anybody's reference in the future:

sc._jvm.PythonUtils.toScalaMap({'hello':'world'})
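For completeness, a minimal end-to-end sketch of using that helper to construct the case class. It assumes a live SparkContext `sc` and that my_class is compiled onto the driver's classpath; the wrapper name `build_my_class` is hypothetical:

```python
def build_my_class(sc, arg1, py_dict):
    """Hypothetical wrapper: build the Scala case class from Python."""
    # PythonUtils lives in org.apache.spark.api.python; PySpark's gateway
    # imports that package, so the short name resolves through sc._jvm.
    # toScalaMap converts the dict (a java.util.Map over py4j) into an
    # immutable scala.collection.Map.
    scala_map = sc._jvm.PythonUtils.toScalaMap(py_dict)
    # A case class compiles to an ordinary JVM class, so py4j can invoke
    # its constructor directly. Scala default arguments are not visible
    # through the Java-facing constructor, so pass both args explicitly:
    return sc._jvm.my_class(arg1, scala_map)

# obj = build_my_class(sc, 42, {'hello': 'world'})
```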
