

Is it possible to have compiler-specific code sections in Scala?

I have a situation where I need certain functionality that is available in Spark 1.1.0, but I have two different platforms on which I need to run this application. One uses Spark 1.1.0 and the other uses Spark 0.9.1. The functionality available in Spark 1.1.0 is not available in Spark 0.9.1.

That said, is it possible to have some compiler flags in the Scala code, so that one piece of code gets compiled when building against Spark 1.1.0, and another piece of code gets compiled when building against the Spark 0.9.1 library?

Like so:

#ifSpark1.1.0
val docIdtoSeq: RDD[(String, Long)] = listOfDocIds.zipWithIndex()
#endifSpark1.1.0

#ifSpark0.9.1
val docIdtoSeq: RDD[(String, Long)] = listOfDocIds.mapPartitionsWithIndex { case (partId, it) =>
  it.zipWithIndex.map { case (el, ind) => (el, ind + partId * constantLong) }
}
#endifSpark0.9.1

Many thanks.

There are several options.

  1. Since the two Spark versions are obviously not binary compatible, you will need to provide two artifacts of your project anyway. Create a simple common API layer, then add two thin sub-projects in a multi-project sbt build that provide that layer for either Spark version (see the first sketch after this list).
  2. Use sbt-buildinfo to generate compile-time symbols for your Spark version, then use a macro method that expands to one or the other of the two method invocations above, depending on that version.
  3. Use runtime reflection (see the second sketch after this list).
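
For option 1, here is a minimal sketch of what the common layer and the two thin adapters could look like. The names DocIndexer, Spark11DocIndexer and Spark09DocIndexer are hypothetical, and each implementation would live in its own sub-project compiled against the matching spark-core dependency (the common sub-project also needs a Spark dependency for the RDD type, e.g. marked "provided"):

    import org.apache.spark.rdd.RDD

    // common sub-project: the API the rest of the application codes against
    trait DocIndexer {
      def docIdToSeq(listOfDocIds: RDD[String]): RDD[(String, Long)]
    }

    // sub-project built against Spark 1.1.0: zipWithIndex is available here
    class Spark11DocIndexer extends DocIndexer {
      def docIdToSeq(listOfDocIds: RDD[String]): RDD[(String, Long)] =
        listOfDocIds.zipWithIndex()
    }

    // sub-project built against Spark 0.9.1: emulate zipWithIndex
    class Spark09DocIndexer(constantLong: Long) extends DocIndexer {
      def docIdToSeq(listOfDocIds: RDD[String]): RDD[(String, Long)] =
        listOfDocIds.mapPartitionsWithIndex { case (partId, it) =>
          it.zipWithIndex.map { case (el, ind) => (el, ind + partId * constantLong) }
        }
    }

In the sbt build, each adapter sub-project depends on the common one and pulls in its own spark-core version, and you publish one artifact per platform.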
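For option 3, here is a rough sketch of the runtime-reflection route, assuming you compile against the Spark 0.9.1 API and only invoke zipWithIndex reflectively when the newer method exists at runtime. The method and parameter names are placeholders taken from the snippets above, and this approach is only viable as long as the rest of your code stays binary compatible across both Spark versions:

    import org.apache.spark.rdd.RDD

    def docIdToSeq(listOfDocIds: RDD[String], constantLong: Long): RDD[(String, Long)] = {
      // look for the Spark 1.1.0 zipWithIndex method on the runtime class of the RDD
      val zipWithIndex = listOfDocIds.getClass.getMethods.find(_.getName == "zipWithIndex")
      zipWithIndex match {
        case Some(m) =>
          // Spark 1.1.0 at runtime: call zipWithIndex() reflectively
          m.invoke(listOfDocIds).asInstanceOf[RDD[(String, Long)]]
        case None =>
          // Spark 0.9.1 at runtime: fall back to the mapPartitionsWithIndex emulation
          listOfDocIds.mapPartitionsWithIndex { case (partId, it) =>
            it.zipWithIndex.map { case (el, ind) => (el, ind + partId * constantLong) }
          }
      }
    }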
