
How can I assign different return types to a function in Scala?

I am trying to write a function which should return different pairs depending on the input. I have overridden "+ - / *" in Scala for my specific use. Each one (+, -, *, /) has three implementations based on the input types. I have RDD and Float as inputs, so a + can be between RDD and RDD, or Float and RDD, or Float and Float, and so on.

Now I have a parser which reads an expression from the input, such as RDD+1, parses it, and creates postfix notation (RDD1+) to make evaluation easier; then I want to do the calculation using my implemented +. With the help of this algorithm I am trying to change it so that it performs a calculation based on my input expression. For instance it contains:

 var lastOp: (Float, Float) => Float = add

How can I change this (Float, Float) => Float to something that will accept (RDD, Float), (RDD, RDD), or (Float, Float) and return an RDD, so that I can still write ... = add // my implementation of add?
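One way to give lastOp a single signature is to wrap both operand kinds in a common sealed trait and dispatch inside the operation. A minimal self-contained sketch, using a plain Vector[Float] as a stand-in for Spark's RDD[(Int, Array[Float])] (the trait and case-class names here are illustrative, not from the original code):

```scala
// Stand-in for Spark's RDD so the sketch runs without a cluster;
// substitute the real RDD[(Int, Array[Float])] in actual code.
type FakeRdd = Vector[Float]

// One common operand type lets lastOp keep a single signature.
sealed trait Operand
case class Num(f: Float) extends Operand
case class Data(rdd: FakeRdd) extends Operand

// One add that handles all four input combinations.
def add(a: Operand, b: Operand): Operand = (a, b) match {
  case (Num(x), Num(y))   => Num(x + y)
  case (Num(x), Data(r))  => Data(r.map(_ + x))
  case (Data(r), Num(y))  => Data(r.map(_ + y))
  case (Data(r), Data(s)) => Data(r.zip(s).map { case (x, y) => x + y })
}

// The parser's "last operation" slot now has a single, stable type.
var lastOp: (Operand, Operand) => Operand = add

println(lastOp(Num(1f), Data(Vector(1f, 2f))))  // Data(Vector(2.0, 3.0))
```

The price is that callers must wrap values in Num/Data; the Either-based and implicit-class answers below trade that wrapping for implicit conversions.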

Edit:

I added this part with the help of the two answers below. I wrote this:

     def lastop:(Either[RDD[(Int,Array[Float])], Float], Either[RDD[(Int,Array[Float])], Float]) => RDD[(Int,Array[Float])] = sv.+

in which sv is an instance of another class of mine, in which I have overridden + in two different ways. Now I am getting an error, which I guess is because the compiler gets confused about which implementation to use. This is the error:

       error:  type mismatch;
       [error]  found   : (that: org.apache.spark.rdd.RDD[(Int, Array[Float])])org.apache.spark.rdd.RDD[(Int, Array[Float])] <and> (that: Float)org.apache.spark.rdd.RDD[(Int, Array[Float])]
       [error]  required: (Either[org.apache.spark.rdd.RDD[(Int, Array[Float])],Float], Either[org.apache.spark.rdd.RDD[(Int, Array[Float])],Float]) => org.apache.spark.rdd.RDD[(Int, Array[Float])]

Note: the two signatures it reports as "found" are the two different implementations of "+".
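The mismatch happens because sv.+ is an overloaded method, and neither overload on its own has the Either-based function type, so the compiler cannot eta-expand it. Writing an explicit pattern-matching lambda that unwraps the Either and calls the appropriate overload in each case resolves the ambiguity. A self-contained sketch with a toy class SV standing in for the asker's class (SV and its Vector payload are assumptions for illustration):

```scala
// Toy stand-in for the asker's class: + is overloaded for Float and
// for another SV (substitute RDD[(Int, Array[Float])] in real code).
class SV(val xs: Vector[Float]) {
  def +(that: Float): SV = new SV(xs.map(_ + that))
  def +(that: SV): SV   = new SV(xs.zip(that.xs).map { case (a, b) => a + b })
}

val sv = new SV(Vector(1f, 2f))

// An explicit lambda picks a concrete overload per case, so the
// compiler no longer has to eta-expand the overloaded sv.+ itself.
val lastop: (Either[SV, Float], Either[SV, Float]) => SV = {
  case (Left(a),  Left(b))  => a + b
  case (Left(a),  Right(f)) => a + f
  case (Right(f), Left(b))  => b + f            // Float + SV, via commutativity
  case (Right(f), Right(g)) => new SV(Vector(f + g))  // arbitrary choice for this sketch
}

println(lastop(Left(sv), Right(1f)).xs)  // Vector(2.0, 3.0)
```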

Well, I'm not sure this is the best way to do it, but it is ONE way to do it and should result in the usage you described (or at least come close to it):

import scala.language.implicitConversions

// implicit conversions
implicit def float2Either(in: Float): Either[Float, RDD[(Int,Array[Float])]] = Left(in)
implicit def rdd2Either(in: RDD[(Int,Array[Float])]): Either[Float, RDD[(Int,Array[Float])]] = Right(in)

def add(left: Either[Float, RDD[(Int,Array[Float])]], right: Either[Float, RDD[(Int,Array[Float])]]): Float = {
  (left, right) match {
    case (Left(someFloat), Left(anotherFloat)) => ???
    case (Left(someFloat), Right(someRdd)) => ???
    case (Right(someRdd), Left(someFloat)) => ???
    case (Right(someRdd), Right(anotherRdd)) => ???
  }
}
val lastOp: (Either[Float, RDD[(Int,Array[Float])]], Either[Float, RDD[(Int,Array[Float])]]) => Float = add
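To see the implicit conversions in action, here is a runnable version of the same pattern with the ??? placeholders filled in, using Vector[Float] as a stand-in for RDD[(Int, Array[Float])] and returning an Either so all four cases have somewhere to go (both choices are assumptions for this sketch, not part of the answer above):

```scala
import scala.language.implicitConversions

type Rdd = Vector[Float]  // stand-in for RDD[(Int, Array[Float])]

// Conversions let call sites pass raw Floats and Rdds directly.
implicit def float2Either(in: Float): Either[Float, Rdd] = Left(in)
implicit def rdd2Either(in: Rdd): Either[Float, Rdd] = Right(in)

def add(left: Either[Float, Rdd], right: Either[Float, Rdd]): Either[Float, Rdd] =
  (left, right) match {
    case (Left(x), Left(y))   => Left(x + y)
    case (Left(x), Right(r))  => Right(r.map(_ + x))
    case (Right(r), Left(y))  => Right(r.map(_ + y))
    case (Right(r), Right(s)) => Right(r.zip(s).map { case (a, b) => a + b })
  }

val lastOp: (Either[Float, Rdd], Either[Float, Rdd]) => Either[Float, Rdd] = add

// The conversions fire automatically at the call site:
println(lastOp(1f, Vector(1f, 2f)))  // Right(Vector(2.0, 3.0))
println(lastOp(1f, 2f))              // Left(3.0)
```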

Another way, and probably the better one, would be the "pimp my library" pattern.

However, you would not be able to decide yourself what float + float yields, which in most sane cases should not be a problem.

You could write implicit wrapper classes for Float and RDD, much like RichFloat, RichInt, and the like, implementing operators on each that accept the other as input.

implicit class RichRdd(val underlying: RDD[(Int, Array[Float])]) extends AnyVal {
  def +(in: Float): RDD[(Int, Array[Float])] = ???
  def +(in: RDD[(Int, Array[Float])]): RDD[(Int, Array[Float])] = ???
}
implicit class RicherFloat(val underlying: Float) extends AnyVal {
  def +(in: RDD[(Int, Array[Float])]): RDD[(Int, Array[Float])] = ???
}
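A runnable sketch of the same pattern, again with Vector[Float] standing in for the RDD type (the object name Syntax and the element-wise semantics are assumptions; AnyVal is dropped here only so the snippet also runs as a script, since value classes cannot be local):

```scala
type Rdd = Vector[Float]  // stand-in for Spark's RDD[(Int, Array[Float])]

object Syntax {
  // Extension-method wrappers: each side learns a + for the other.
  implicit class RichRdd(val underlying: Rdd) {
    def +(in: Float): Rdd = underlying.map(_ + in)
    def +(in: Rdd): Rdd   = underlying.zip(in).map { case (a, b) => a + b }
  }
  implicit class RicherFloat(val underlying: Float) {
    def +(in: Rdd): Rdd = in.map(_ + underlying)
  }
}
import Syntax._

println(Vector(1f, 2f) + 1f)  // Vector(2.0, 3.0)
println(1f + Vector(3f))      // Vector(4.0)
```

The advantage over the Either approach is that each combination keeps its own precise return type, so no wrapping or unwrapping is needed at call sites.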

I think pattern matching is the right way to go; you might need to do more research on operator overloading.

About RDD: it is a collection of elements in Spark, and I don't know what you are trying to achieve by adding a number to a collection (to only one element in the RDD? etc.).

Without knowing exactly what you want, here is an example showing how you can handle different combinations of types using pattern matching:


object test {

    def myadd(x: Any, y: Any) = (x, y) match {
        case (x: String, y:String) => x.toInt + y.toInt
        case (x: String, y:Int) => x.toInt + y.toInt
        case (x: Int, y:String) => x + y.toInt
        case (x: Int, y:Int) => x + y
        case _ =>
    }                                         //> myadd: (x: Any, y: Any)AnyVal

    var result = myadd(1,2)                   //> result  : AnyVal = 3
    println(result)                           //> 3
    println(result.getClass())                //> class java.lang.Integer

    result = myadd(1,"2")
    println(result)                           //> 3
    println(result.getClass())                //> class java.lang.Integer

    result = myadd(1.0,2)
    println(result)                           //> ()
    println(result.getClass())                //> class scala.runtime.BoxedUnit

}
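Note that the Any-based match above erases the result type to AnyVal, and any unmatched combination (such as myadd(1.0, 2)) silently returns (). A small variant keeps the fallthrough explicit by returning an Option; this is a sketch of one possible tightening, not part of the original answer:

```scala
// Variant of the answer's myadd that makes the unmatched case
// explicit instead of silently returning ().
def myadd(x: Any, y: Any): Option[Int] = (x, y) match {
  case (x: String, y: String) => Some(x.toInt + y.toInt)
  case (x: String, y: Int)    => Some(x.toInt + y)
  case (x: Int, y: String)    => Some(x + y.toInt)
  case (x: Int, y: Int)       => Some(x + y)
  case _                      => None  // e.g. a Double operand
}

println(myadd(1, "2"))  // Some(3)
println(myadd(1.0, 2))  // None
```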
