
"error: type mismatch" in Spark with same found and required datatypes

I am using spark-shell to run my code. In my code, I have defined a function and I call that function with its parameters.

The problem is that I get the error below when I call the function.

error: type mismatch;

found   : org.apache.spark.graphx.Graph[VertexProperty(in class $iwC)(in class $iwC)(in class $iwC)(in class $iwC),String]

required: org.apache.spark.graphx.Graph[VertexProperty(in class $iwC)(in class $iwC)(in class $iwC)(in class $iwC),String]

What is the reason behind this error? Does it have anything to do with the Graph datatype in Spark?

Code: This is the part of my code that involves the definition and the call of the function "countpermissions".

import org.apache.spark.graphx.Graph

class VertexProperty(val id: Long) extends Serializable
case class User(val userId: Long, val userCode: String, val Name: String, val Surname: String) extends VertexProperty(userId)
case class Entitlement(val entitlementId: Long, val name: String) extends VertexProperty(entitlementId)

// Stub implementation; the body is irrelevant to the error.
def countpermissions(es: String, sg: Graph[VertexProperty, String]): Long = 0L

// Collect the distinct edge attribute strings of the graph.
val triplets = graph.triplets
val temp = triplets.map(t => t.attr)
val distinct_edge_string = temp.distinct

// Broadcast the graph and pair each edge string with the matching subgraph.
var bcast_graph = sc.broadcast(graph)
val edge_string_subgraph = distinct_edge_string.map(es => es -> bcast_graph.value.subgraph(epred = t => t.attr == es))

// This is the line that fails with the type mismatch.
val temp1 = edge_string_subgraph.map(t => t._1 -> countpermissions(t._1, t._2))

The code runs without errors until the last line, where it hits the above-mentioned error.

Here is the trick. Let's open the REPL and define a class:

scala> case class Foo(i: Int)
defined class Foo

and a simple function which operates on this class:

scala> def fooToInt(foo: Foo) = foo.i
fooToInt: (foo: Foo)Int

redefine the class:

scala> case class Foo(i: Int)
defined class Foo

and create an instance:

scala> val foo = Foo(1)
foo: Foo = Foo(1)

All that's left is to call fooToInt:

scala> fooToInt(foo)
<console>:34: error: type mismatch;
 found   : Foo(in class $iwC)(in class $iwC)(in class $iwC)(in class $iwC)
 required: Foo(in class $iwC)(in class $iwC)(in class $iwC)(in class $iwC)
          fooToInt(foo)

Does it look familiar? Here is yet another trick to get a better idea of what is going on:

scala> case class Foo(i: Int)
defined class Foo

scala> val foo = Foo(1)
foo: Foo = Foo(1)

scala> case class Foo(i: Int)
defined class Foo

scala> def fooToInt(foo: Foo) = foo.i
<console>:31: error: reference to Foo is ambiguous;
it is imported twice in the same scope by
import INSTANCE.Foo
and import INSTANCE.Foo
         def fooToInt(foo: Foo) = foo.i

So, long story short, this is expected, if slightly confusing, behavior arising from ambiguous definitions existing in the same scope.
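In the first session above, one way out (a minimal sketch; the exact res numbering is illustrative) is to redefine the function after the latest version of the class, so that it binds to the newest Foo, after which the call type-checks:

scala> def fooToInt(foo: Foo) = foo.i
fooToInt: (foo: Foo)Int

scala> fooToInt(foo)
res0: Int = 1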

Unless you want to periodically :reset the REPL state, you should keep track of the entities you create, and if type definitions change, make sure that no ambiguous definitions persist (overwrite things if needed) before you proceed.
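In spark-shell, one practical way to apply that advice (a sketch, reusing the definitions from the question) is to enter the class hierarchy and every function that refers to it in a single :paste block, so everything is compiled together and necessarily refers to the same class; if the classes ever change, paste the whole block again so every dependent definition is refreshed at once:

scala> :paste
// Entering paste mode (ctrl-D to finish)

import org.apache.spark.graphx.Graph

class VertexProperty(val id: Long) extends Serializable
case class User(val userId: Long, val userCode: String, val Name: String, val Surname: String) extends VertexProperty(userId)
case class Entitlement(val entitlementId: Long, val name: String) extends VertexProperty(entitlementId)

def countpermissions(es: String, sg: Graph[VertexProperty, String]): Long = 0L

// Exiting paste mode, now interpreting.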

