
Hive UDF - Java String ClassCastException

I have written a UDF which decodes a cookie and returns a list of Strings. Unfortunately I get a Hive Runtime Error while processing.

Here is my code:

@Override
public ObjectInspector initialize(ObjectInspector[] input) throws UDFArgumentException {

    ObjectInspector cookieContent = input[0];
    if (!(isStringOI(cookieContent))){
        throw new UDFArgumentException("only string");
    }
    this.cookieValue = (StringObjectInspector) cookieContent;
    return ObjectInspectorFactory.getStandardListObjectInspector
            (PrimitiveObjectInspectorFactory.javaStringObjectInspector);
}


public Object evaluate(DeferredObject[] input) throws HiveException {

    String encoded = cookieValue.getPrimitiveJavaObject(input[0].get());
    try {
        result = decode(encoded);
    } catch (CodeException e) {
        throw new UDFArgumentException();
    }

    return result;
}
public List<String> decode(String encoded) throws CodeException {

    decodedBase64 = Base64.decodeBase64(encoded);
    String decompressedArray = new String(getKadrs(decodedBase64));
    String kadr = decompressedArray.substring(decompressedArray.indexOf("|") + 1);
    List<String> kadrsList = new ArrayList<>(Arrays.asList(kadr.split(",")));
    return kadrsList;
}

private byte[] getKadrs(byte[] compressed) throws CodeException {
    Inflater decompressor = new Inflater();
    decompressor.setInput(compressed);
    ByteArrayOutputStream outPutStream = new ByteArrayOutputStream(compressed.length);
    byte[] temp = new byte[1024];
    while (!decompressor.finished()) {
        try {
            int count = decompressor.inflate(temp);
            outPutStream.write(temp, 0, count);
        }
        catch (DataFormatException e) {
            throw new CodeException ("Wrong data format", e);
        }
    }
    try {
        outPutStream.close();
    } catch (IOException e) {
        throw new CodeException ("Cant close outPutStream ", e);
    }
    return outPutStream.toByteArray();
}

The result is, let's say:

"kadr1,kadr20,kadr35,kadr12". The unit tests work fine, but when I try to use this function in Hive I get:

   Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.hadoop.io.Text
  at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableStringObjectInspector.getPrimitiveWritableObject(WritableStringObjectInspector.java:41)

It is hard for me to debug because someone else has to deploy my jar in order to see the results, so any advice will be appreciated.

Your evaluate method currently returns String, which is not a Hadoop data type. Instead, you should wrap the string in a Text object by saying return new Text(result).
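For illustration, a minimal sketch of that fix, assuming initialize() declares writableStringObjectInspector as the list element type; the cookieValue field, decode() method and CodeException are the ones from the question:

// requires: import java.util.ArrayList; import java.util.List;
// requires: import org.apache.hadoop.io.Text;
@Override
public Object evaluate(DeferredObject[] input) throws HiveException {

    String encoded = cookieValue.getPrimitiveJavaObject(input[0].get());
    List<String> decoded;
    try {
        decoded = decode(encoded);
    } catch (CodeException e) {
        throw new HiveException("Cannot decode cookie", e);
    }
    // Wrap each String in a Text so it matches the writable element inspector.
    List<Text> wrapped = new ArrayList<>(decoded.size());
    for (String s : decoded) {
        wrapped.add(new Text(s));
    }
    return wrapped;
}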

Ravindra is right.

In initialize I had:
return ObjectInspectorFactory.getStandardListObjectInspector(PrimitiveObjectInspectorFactory.writableStringObjectInspector);

and WritableStringObjectInspector returns Text.

I changed it to javaStringObjectInspector, which returns String, and everything works fine. Thanks.
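In other words, the element inspector declared in initialize() and the element type actually returned from evaluate() have to agree. A rough illustration of the two consistent pairings, using the same factory classes as the question:

// Pairing 1: Java String elements -> evaluate() must return a List<String>
return ObjectInspectorFactory.getStandardListObjectInspector(
        PrimitiveObjectInspectorFactory.javaStringObjectInspector);

// Pairing 2: writable Text elements -> evaluate() must return a List<Text>
return ObjectInspectorFactory.getStandardListObjectInspector(
        PrimitiveObjectInspectorFactory.writableStringObjectInspector);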
