
Convert Java to Scala code - change of method signatures

Trying to convert some Java code to Scala, I ran into a method signature that compiled fine in the Java world but not in Scala:

The following Java code (from https://github.com/DataSystemsLab/GeoSpark/blob/master/babylon/src/main/java/org/datasyslab/babylon/showcase/Example.java#L122-L126 )

visualizationOperator = new ScatterPlot(1000,600,USMainLandBoundary,false,-1,-1,true,true);
visualizationOperator.CustomizeColor(255, 255, 255, 255, Color.GREEN, true);
visualizationOperator.Visualize(sparkContext, spatialRDD);
imageGenerator = new SparkImageGenerator();
imageGenerator.SaveAsFile(visualizationOperator.distributedVectorImage, "file://"+outputPath,ImageType.SVG);

is translated to https://github.com/geoHeil/geoSparkScalaSample/blob/master/src/main/scala/myOrg/visualization/Vis.scala#L45-L57

val vDistributedVector = new ScatterPlot(1000, 600, USMainLandBoundary, false, -1, -1, true, true)
vDistributedVector.CustomizeColor(255, 255, 255, 255, Color.GREEN, true)
vDistributedVector.Visualize(s, spatialRDD)
sparkImageGenerator.SaveAsFile(vDistributedVector.distributedVectorImage, outputPath + "distributedVector", ImageType.SVG)

which throws the following error:

overloaded method value SaveAsFile with alternatives:
[error]   (x$1: java.util.List[String],x$2: String,x$3: org.datasyslab.babylon.utils.ImageType)Boolean <and>
[error]   (x$1: java.awt.image.BufferedImage,x$2: String,x$3: org.datasyslab.babylon.utils.ImageType)Boolean <and>
[error]   (x$1: org.apache.spark.api.java.JavaPairRDD,x$2: String,x$3: org.datasyslab.babylon.utils.ImageType)Boolean
[error]  cannot be applied to (org.apache.spark.api.java.JavaPairRDD[Integer,String], String, org.datasyslab.babylon.utils.ImageType)
[error]     sparkImageGenerator.SaveAsFile(vDistributedVector.distributedVectorImage, outputPath + "distributedVector", ImageType.SVG)

Unfortunately, I am not really sure how to fix this / how to properly call the method in Scala.

This is a problem in ImageGenerator, inherited by SparkImageGenerator. As you can see here, it has a method

public boolean SaveAsFile(JavaPairRDD distributedImage, String outputPath, ImageType imageType)

which uses a raw type (JavaPairRDD without <...>). Raw types exist primarily for compatibility with pre-Java 5 code and shouldn't normally be used otherwise. For this code there is certainly no good reason, as it actually expects specific type parameters; using a raw type merely loses type safety. Maybe some subclasses (current or potential) override it and expect different type parameters, but that would be a misuse of inheritance, and there must be a better solution.
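To see concretely what a raw type gives up, here is a minimal, self-contained Java sketch. It uses java.util.List as a stand-in for JavaPairRDD; the class and method names are illustrative, not from GeoSpark:

```java
import java.util.ArrayList;
import java.util.List;

public class RawTypeDemo {
    // Raw-type parameter, analogous to SaveAsFile's JavaPairRDD argument:
    // the compiler no longer checks what the list contains.
    static boolean takesRaw(List list) {
        return list.isEmpty();
    }

    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        strings.add("a");
        // A parameterized List<String> is accepted where the raw List is
        // expected; javac only emits an unchecked warning.
        System.out.println(takesRaw(strings));
    }
}
```

Java silently widens the parameterized type to the raw one, which is exactly why such signatures compile from Java but trip up Scala's overload resolution.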

Scala doesn't support raw types in any way, so you can't call this method from it (AFAIK). As a workaround, you could write a wrapper in Java which uses correct types and call this wrapper from Scala. (I misremembered: it's extending Java classes which extend raw types that was impossible, and even then there are workarounds.)
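Such a wrapper might look roughly as follows. Since the GeoSpark classes aren't reproduced here, RawApi stands in for the raw-typed SaveAsFile, and all names are hypothetical:

```java
import java.util.List;

// Stand-in for the third-party class with the raw-typed method
// (in the real case, ImageGenerator.SaveAsFile taking a raw JavaPairRDD).
class RawApi {
    @SuppressWarnings("rawtypes")
    public boolean save(List distributedImage, String outputPath) {
        return distributedImage != null && !outputPath.isEmpty();
    }
}

// Typed facade callable from Scala: the raw-type call is confined to Java,
// so Scala only ever sees a properly parameterized signature.
public class TypedWrapper {
    private final RawApi delegate = new RawApi();

    public boolean save(List<String> image, String outputPath) {
        return delegate.save(image, outputPath);
    }
}
```

From Scala you would then call `new TypedWrapper().save(...)` with an ordinary `java.util.List[String]`, and the raw type never appears in Scala code.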

You might be able to call it with an explicit type ascription (preferable to casting):

sparkImageGenerator.SaveAsFile(
  (vDistributedVector.distributedVectorImage: JavaPairRDD[_, _]), 
  outputPath + "distributedVector", ImageType.SVG)

But given that the error message shows just JavaPairRDD, I don't particularly expect it to work. If this fails, I'd still go with a Java wrapper.

The accepted answer is correct in saying that raw types should be avoided. However, Scala can interoperate with Java code that has raw types: Scala interprets the raw type java.util.List as the existential type java.util.List[_].

Take this Java code, for example:

// Test.java
import java.util.Map;

public class Test {
  public boolean foo(Map map, String s) {
    return true;
  }
}

Then try to call it from Scala:

Welcome to Scala 2.12.1 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131).
Type in expressions for evaluation. Or try :help.

scala> import java.util.{Map,HashMap}
import java.util.{Map,HashMap}

scala> new Test().foo(new HashMap[String,Integer], "a")
res0: Boolean = true

scala> val h: Map[_,_] = new HashMap[String,Integer]
h: java.util.Map[_, _] = {}

scala> new Test().foo(h, "a")
res1: Boolean = true

So it looks like there must be some other problem.
