Is it possible to have compiler-specific code sections in Scala?
I have a situation where I need certain functionality that is available in Spark library version 1.1.0, but I have two different platforms I need to run this application on. One uses Spark 1.1.0 and the other uses Spark 0.9.1. The functionality available in Spark 1.1.0 is not available in Spark 0.9.1.
That said, is it possible to have some compiler flags in the Scala code, so that when compiling against the Spark 1.1.0 library one piece of code gets compiled, and when compiling against the Spark 0.9.1 library another piece of code gets compiled?
like so:
#ifSpark1.1.0
val docIdtoSeq: RDD[(String, Long)] = listOfDocIds.zipWithIndex()
#endifSpark1.1.0
#ifSpark0.9.1
val docIdtoSeq: RDD[(String, Long)] = listOfDocIds.mapPartitionsWithIndex{case(partId,it) => it.zipWithIndex.map{case(el,ind) => (el,ind+partId*constantLong)}}
#endifSpark0.9.1
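For reference, the 0.9.1 branch above emulates `zipWithIndex` by offsetting each partition's local indices with a per-partition constant, which assumes equal-sized partitions. The same idea can be sketched with plain Scala collections standing in for RDD partitions (no Spark dependency; `zipWithIndexByPartition` and `partSize` are hypothetical names for illustration):

```scala
// Emulates RDD.zipWithIndex using mapPartitionsWithIndex-style logic:
// each "partition" zips its elements locally, then shifts the local
// index by partId * partSize (valid when every partition has partSize
// elements, mirroring `constantLong` in the snippet above).
object ZipWithIndexFallback {
  def zipWithIndexByPartition[T](partitions: Seq[Seq[T]], partSize: Long): Seq[(T, Long)] =
    partitions.zipWithIndex.flatMap { case (part, partId) =>
      part.zipWithIndex.map { case (el, ind) => (el, ind + partId * partSize) }
    }
}
```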
Many thanks
There are several options.
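Scala has no C-style preprocessor, so one common option is to keep the version-dependent code in separate, version-specific source directories and let the build select the right one. A minimal sbt sketch of that idea, assuming a `sparkVersion` setting you define yourself and hypothetical directory names `scala-spark-1.1` and `scala-spark-0.9` (modern sbt slash syntax):

```scala
// build.sbt (sketch, not a complete build definition)
val sparkVersion = settingKey[String]("Spark version to build against")

// Pick the Spark version from a system property, defaulting to 1.1.0.
sparkVersion := sys.props.getOrElse("spark.version", "1.1.0")

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion.value

// Add an extra source directory depending on the Spark version;
// each directory provides its own implementation of the helper
// that builds docIdtoSeq.
Compile / unmanagedSourceDirectories += {
  val dir =
    if (sparkVersion.value.startsWith("1.")) "scala-spark-1.1"
    else "scala-spark-0.9"
  (Compile / sourceDirectory).value / dir
}
```

With this layout, both directories expose the same method signature (for example, a helper returning `RDD[(String, Long)]`), and only the directory matching the selected Spark version is compiled, so the rest of the code base stays identical across the two platforms.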