Spark context broadcast variable throwing java.io.NotSerializableException even though it is Serializable

I am running Spark 2.1 on Ubuntu 14.04 and am trying to broadcast a lookup variable in Spark. The variable is of type scala.collection.immutable.Map[String, MyObject].

MyObject has the following fields:

  1. 'name' of type String
  2. 'address' of type String
  3. 'rangeSet' of type com.google.common.collect.TreeRangeSet
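
For reference, a minimal sketch of how the map is built and broadcast (the object name, master setting, and sample values are illustrative; it uses the MyObject and CustomInetAddress classes shown further down, with range endpoints taken from the stack trace below):

import com.google.common.collect.{Range, TreeRangeSet}
import org.apache.spark.sql.SparkSession

object BroadcastLookupSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("BroadcastLookup").master("local[*]").getOrCreate()

    // Build a tiny lookup map of the shape described above.
    val rangeSet = TreeRangeSet.create[CustomInetAddress]()
    rangeSet.add(Range.closed(
      new CustomInetAddress("101.32.168.0"),
      new CustomInetAddress("101.32.181.255")))
    val lookup: Map[String, MyObject] = Map("Jack" -> MyObject("Jack", "Test", rangeSet))

    // This is the call that fails: broadcasting the map pushes the whole
    // MyObject graph (including the TreeRangeSet) through Java serialization.
    val lookupBroadcast = spark.sparkContext.broadcast(lookup)

    spark.stop()
  }
}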

Broadcasting this map fails with the following exception:

Exception in thread "main" java.io.NotSerializableException: com.google.common.collect.TreeRangeSet

Serialization stack:
    - object not serializable (class: com.google.common.collect.TreeRangeSet, value: {[/101.32.168.0‥/101.32.181.255][/4626:7800:4048:0:0:0:0:0‥/4626:7800:4048:ffff:ffff:ffff:ffff:ffff]})
    - field (class: com.test.MyObject, name: rangeSet, type: class com.google.common.collect.TreeRangeSet)
    - object (class com.test.MyObject, MyObject(Jack,Test,{[/101.32.168.0‥/101.32.181.255][/192.16.10.224‥/192.16.10.255][/4626:7800:4048:0:0:0:0:0‥/4626:7800:4048:ffff:ffff:ffff:ffff:ffff]}))
    - writeObject data (class: scala.collection.immutable.HashMap$SerializationProxy)
    - object (class scala.collection.immutable.HashMap$SerializationProxy, scala.collection.immutable.HashMap$SerializationProxy@708f7386)
    - writeReplace data (class: scala.collection.immutable.HashMap$SerializationProxy)

MyObject.scala

import com.google.common.collect.TreeRangeSet

// 'rangeSet' is the field flagged in the serialization stack above.
@SerialVersionUID(123L)
case class MyObject(name: String, address: String, rangeSet: TreeRangeSet[CustomInetAddress])

CustomInetAddress.java

import java.io.Serializable;
import java.net.InetAddress;
import java.net.UnknownHostException;

public class CustomInetAddress implements Comparable<CustomInetAddress>, Serializable {

    private InetAddress inetAddress;

    public CustomInetAddress(String ip) throws UnknownHostException {
        this.inetAddress = InetAddress.getByName(ip);
    }

    public CustomInetAddress(InetAddress address) throws UnknownHostException {
        this.inetAddress = address;
    }

    @Override
    public int compareTo(final CustomInetAddress address){
        byte[] ba1 = this.inetAddress.getAddress();
        byte[] ba2 = address.inetAddress.getAddress();

        if(ba1.length < ba2.length) return -1;
        if(ba1.length > ba2.length) return 1;

        for(int i = 0; i < ba1.length; i++) {
            int b1 = unsignedByteToInt(ba1[i]);
            int b2 = unsignedByteToInt(ba2[i]);
            if(b1 == b2)
                continue;
            if(b1 < b2)
                return -1;
            else
                return 1;
        }
        return 0;
    }

    @Override
    public String toString(){
        return this.inetAddress.toString();
    }

    private int unsignedByteToInt(byte b) {
        return (int) b & 0xFF;
    }
}

TreeRangeSet[CustomInetAddress] is the actual type of the field. CustomInetAddress has one field of type InetAddress. All of them are serializable, so I am not sure why this is throwing an exception.

The message is clear: com.google.common.collect.TreeRangeSet does not implement Serializable.
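
If you need to stick with Java serialization, one workaround is to wrap the TreeRangeSet in a small serializable holder that writes out the individual Range objects (which are Serializable in Guava) and rebuilds the set on deserialization. A rough sketch, with the wrapper name being my own:

import java.io.{IOException, ObjectInputStream, ObjectOutputStream}

import com.google.common.collect.{Range, TreeRangeSet}

import scala.collection.JavaConverters._

// Sketch of a serializable holder: TreeRangeSet itself is not Serializable,
// but Guava's Range is, so we persist the ranges and rebuild the set on read.
class SerializableRangeSet[C <: Comparable[C]](initial: TreeRangeSet[C]) extends Serializable {

  @transient private var rangeSet: TreeRangeSet[C] = initial

  def underlying: TreeRangeSet[C] = rangeSet

  @throws[IOException]
  private def writeObject(out: ObjectOutputStream): Unit = {
    out.defaultWriteObject()
    // Copy the ranges into a serializable ArrayList instead of writing the set itself.
    out.writeObject(new java.util.ArrayList[Range[C]](rangeSet.asRanges()))
  }

  @throws[IOException]
  @throws[ClassNotFoundException]
  private def readObject(in: ObjectInputStream): Unit = {
    in.defaultReadObject()
    val ranges = in.readObject().asInstanceOf[java.util.ArrayList[Range[C]]]
    rangeSet = TreeRangeSet.create[C]()
    ranges.asScala.foreach(rangeSet.add)
  }
}

With this in place, MyObject would hold a SerializableRangeSet[CustomInetAddress] instead of the raw TreeRangeSet. Another option that sometimes helps is switching spark.serializer to org.apache.spark.serializer.KryoSerializer, since Kryo does not require java.io.Serializable, though Guava collections may still need registered serializers.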
