
Exception while using Hbase custom filter

I have written an HBase custom filter extending FilterBase and packaged it into a JAR. The filter looks like this:

public class MyFilter1 extends FilterBase implements Serializable {
    boolean filterRow = true;
    String srh;

    public MyFilter1(String str) {
        this.srh = str;
    }

    @Override
    public ReturnCode filterKeyValue(Cell c) throws IOException {
        String str = Bytes.toString(CellUtil.cloneValue(c));

        if (str.contains(srh)) {   // was str.contains(str), which is always true
            filterRow = false;
            return ReturnCode.INCLUDE;
        }

        filterRow = true;
        return ReturnCode.SKIP;
    }

    @Override
    public boolean filterRow() {
        return filterRow;
    }

    @Override
    public byte[] toByteArray() throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ObjectOutputStream os = new ObjectOutputStream(out);
        os.writeObject(this);
        os.flush();
        return out.toByteArray();
    }

    public static MyFilter1 parseFrom(final byte[] data) {
        ByteArrayInputStream in = new ByteArrayInputStream(data);
        MyFilter1 ans = null;
        try {
            ObjectInputStream is = new ObjectInputStream(in);
            ans = (MyFilter1) is.readObject();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return ans;
    }
}

After building the JAR file (i.e. MyFilter.jar), I put it in the /usr/local/HBase/lib/filters directory. Then I set

export HBASE_CLASSPATH="/usr/local/Hbase/lib/filters/MyFilter.jar"

in hbase-env.sh and restarted the HBase server. Then I used the custom filter from a Java program:

public static void main(String[] argv) throws IOException {
    Configuration conf = HBaseConfiguration.create();
    Connection con = ConnectionFactory.createConnection(conf);

    Table table = con.getTable(TableName.valueOf("stud"));

    Filter fl = new MyFilter1("uc");   // was new MyFilter("uc"), which does not match the class name

    Scan sc = new Scan();
    sc.setFilter(fl);

    ResultScanner rs = table.getScanner(sc);

    for (Result r : rs)
        System.out.println(Bytes.toString(r.getValue(Bytes.toBytes("perData"), Bytes.toBytes("name"))));
}

But I am getting the following exception:

Exception in thread "main" org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.toFilter(ProtobufUtil.java:1478)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.toScan(ProtobufUtil.java:993)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2396)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33648)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2180)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.toFilter(ProtobufUtil.java:1474)
... 8 more
Caused by: org.apache.hadoop.hbase.exceptions.DeserializationException: parseFrom called on base Filter, but should be called on derived type
at org.apache.hadoop.hbase.filter.Filter.parseFrom(Filter.java:270)
... 13 more

at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:329)
at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:408)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:204)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:65)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:210)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:364)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:338)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:65)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.toFilter(ProtobufUtil.java:1478)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.toScan(ProtobufUtil.java:993)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2396)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33648)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2180)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.toFilter(ProtobufUtil.java:1474)
... 8 more
Caused by: org.apache.hadoop.hbase.exceptions.DeserializationException: parseFrom called on base Filter, but should be called on derived type
at org.apache.hadoop.hbase.filter.Filter.parseFrom(Filter.java:270)
... 13 more

at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1267)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:34094)
at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:400)
... 10 more

Can anyone please help me with this?

Just to add a (perhaps important) detail to the answer from Abhishek Kumar above: it seems the serializer toByteArray() and the deserializer parseFrom(byte[] rawBytes) must be implemented via Google Protocol Buffers 2, the protobuf version HBase itself uses for filter serialization. Java object serialization, as in the question, will not work here. Below is an example implementation.

AFilter.java

public class AFilter extends FilterBase {

    // whatever fields you need for the AFilter
    long fieldA;
    long fieldB;

    public AFilter(long fieldA, long fieldB) {
        this.fieldA = fieldA;
        this.fieldB = fieldB;
    }

    /**
     * Transform this {@code AFilter} instance to a byte array for serialization.
     * @return raw bytes of this instance
     */
    @Override
    public byte[] toByteArray() {
        final FilterProtos.AFilter.Builder builder = FilterProtos.AFilter.newBuilder();
        builder.setFieldA(fieldA);
        builder.setFieldB(fieldB);

        return builder.build().toByteArray();
    }

    /**
     * De-serialize {@code AFilter} from {@code rawBytes}.
     *
     * @param rawBytes raw bytes of the filter
     * @return AFilter object
     * @throws DeserializationException if the bytes are not a valid protobuf message
     */
    public static AFilter parseFrom(final byte[] rawBytes)
            throws DeserializationException {
        try {
            final FilterProtos.AFilter proto = FilterProtos.AFilter.parseFrom(rawBytes);
            return new AFilter(proto.getFieldA(), proto.getFieldB());
        } catch (InvalidProtocolBufferException ex) {
            throw new DeserializationException(ex);
        }
    }
}

Filters.proto

option java_package = "my.java.package";
option java_outer_classname = "FilterProtos";
option java_generic_services = true;
option java_generate_equals_and_hash = true;
option optimize_for = SPEED;

message AFilter {
    required uint64 fieldA = 1;
    required uint64 fieldB = 2;
}
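For completeness: the FilterProtos class referenced above is generated from this .proto file by the protobuf compiler. A typical invocation might look like the following (the output directory is an assumption; use a protoc 2.x release so the generated code matches the protobuf 2 runtime HBase bundles):

```shell
# Generate my/java/package/FilterProtos.java from Filters.proto.
# Assumes a protoc 2.x binary is on the PATH and src/main/java exists.
protoc --java_out=src/main/java Filters.proto
```

The generated FilterProtos.java must then be compiled into the same JAR as the filter class itself.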

As you can see in the exception stack trace, the failure is caused by "parseFrom called on base Filter, but should be called on derived type".

That means you have to implement a static parseFrom method in your custom filter class as well.

You might also need to implement toByteArray, since the two are used in conjunction.
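To illustrate the contract the region server relies on: whatever the instance method toByteArray() emits on the client, the static parseFrom() on the same concrete class must reconstruct on the server. Here is a dependency-free sketch of that round trip (a hypothetical class using plain UTF-8 bytes instead of protobuf, purely to show the symmetry; a real filter should serialize via protobuf as shown in the answer above):

```java
import java.nio.charset.StandardCharsets;

public class PairSketch {
    final String srh;  // the substring to search for, as in MyFilter1

    PairSketch(String srh) {
        this.srh = srh;
    }

    // Serializer: called on the client before the Scan is shipped.
    public byte[] toByteArray() {
        return srh.getBytes(StandardCharsets.UTF_8);
    }

    // Deserializer: looked up reflectively on the region server.
    // It must be a *static* method on the concrete filter class --
    // otherwise the lookup falls through to Filter.parseFrom, which
    // throws the DeserializationException seen in the question.
    public static PairSketch parseFrom(byte[] raw) {
        return new PairSketch(new String(raw, StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        PairSketch original = new PairSketch("uc");
        PairSketch copy = PairSketch.parseFrom(original.toByteArray());
        System.out.println(copy.srh);  // prints "uc"
    }
}
```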
