
Getting an error while inserting a JSON object through Java in SnappyData

I have a table in which two fields use a JSON object and an array as their data types.

snSession.sql("CREATE TABLE subscriber_new14 (ID int,skills Map<STRING,INTEGER> ) USING column OPTIONS (PARTITION_BY 'ID',OVERFLOW 'true',EVICTION_BY 'LRUHEAPPERCENT' )");

My Java code is:

PreparedStatement s2 = snappy.prepareStatement("insert into APP.SUBSCRIBER_NEW11(ID ,SKILLS ) values(?,?)");
JSONObject obj = new JSONObject();
String str = obj.toString();
obj.put(1, 1);
obj.put(2, 2);
s2.setObject(26,obj);
l1= s2.executeBatch();

I get this error when executing it:

    SEVERE: null
java.sql.SQLException: (SQLState=XCL12 Severity=20000) An attempt was made to put a data value of type 'org.json.simple.JSONObject' into a data value of type 'Blob' for column '26'.
    at com.pivotal.gemfirexd.internal.shared.common.error.DefaultExceptionFactory30.getSQLException(DefaultExceptionFactory30.java:44)
    at com.pivotal.gemfirexd.internal.shared.common.error.DefaultExceptionFactory30.getSQLException(DefaultExceptionFactory30.java:63)
    at com.pivotal.gemfirexd.internal.shared.common.error.ExceptionUtil.newSQLException(ExceptionUtil.java:158)
    at io.snappydata.thrift.common.Converters.newTypeSetConversionException(Converters.java:3014)
    at io.snappydata.thrift.common.Converters.newTypeSetConversionException(Converters.java:3021)
    at io.snappydata.thrift.common.Converters$14.setObject(Converters.java:2126)
    at io.snappydata.thrift.common.Converters$21.setObject(Converters.java:2874)
    at io.snappydata.thrift.internal.ClientPreparedStatement.setObject(ClientPreparedStatement.java:611)
    at snappy.SnappyOps.upsert(SnappyOps.java:117)
    at snappy.Mailthread.DataPush(Mailthread.java:55)
    at snappy.Mailthread.run(Mailthread.java:36)
    at java.lang.Thread.run(Thread.java:748)

So I changed the JSON object into a Blob by adding this:

 Blob blob = snappy.createBlob();
 blob.setBytes(1, str.getBytes());

But when I retrieve the data back from SnappyData by querying:

`select skills from  subscriber_new11 limit 10;`

I get this error:

ERROR 38000: (SQLState=38000 Severity=20000) (Server=host1/103.18.248.32[1529] Thread=ThriftProcessor-0) The exception 'Job aborted due to stage failure: Task 0 in stage 18.0 failed 4 times, most recent failure: Lost task 0.3 in stage 18.0 (TID 29, host1, executor 103.18.248.32(332515):52609): java.lang.AssertionError: assertion failed
    at scala.Predef$.assert(Predef.scala:156)
    at org.apache.spark.sql.catalyst.util.SerializedMap.pointTo(SerializedMap.scala:78)
    at org.apache.spark.sql.execution.row.ResultSetDecoder.readMap(ResultSetDecoder.scala:134)
    at org.apache.spark.sql.execution.row.ResultSetDecoder.readMap(ResultSetDecoder.scala:32)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(generated.java:180)
    at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
    at org.apache.spark.sql.execution.WholeStageCodegenRDD$$anon$2.hasNext(WholeStageCodegenExec.scala:571)
    at org.apache.spark.sql.execution.WholeStageCodegenRDD$$anon$1.hasNext(WholeStageCodegenExec.scala:508)
    at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:389)
    at org.apache.spark.sql.CachedDataFrame$.apply(CachedDataFrame.scala:451)
    at org.apache.spark.sql.CachedDataFrame$.apply(CachedDataFrame.scala:409)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:95)
    at org.apache.spark.scheduler.Task.run(Task.scala:126)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:326)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at org.apache.spark.executor.SnappyExecutor$$anon$2$$anon$3.run(SnappyExecutor.scala:57)
    at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:' was thrown while evaluating an expression.

You can refer to the JDBCWithComplexTypes.scala class from the examples, which illustrates how to handle complex data types over a JDBC client connection. You should use ComplexTypeSerializer to serialize the array/map object before setting the value in the PreparedStatement.
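A minimal sketch of that approach in Java, under two assumptions: the `SKILLS` column (declared `Map<STRING,INTEGER>`) expects a `java.util.Map<String, Integer>` rather than a `JSONObject`, and the `ComplexTypeSerializer.create(table, column, connection)` / `serialize(value)` usage follows the pattern shown in the JDBCWithComplexTypes.scala example. The runnable part below only builds the map value; the serializer and JDBC calls are shown in comments because they need a live SnappyData cluster and the client jar:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SkillsMapExample {

    // Build the Map<String, Integer> value that a Map<STRING,INTEGER>
    // column expects, using the same key/value pairs the question put
    // into the JSONObject (1 -> 1, 2 -> 2).
    public static Map<String, Integer> buildSkillsMap() {
        Map<String, Integer> skills = new LinkedHashMap<>();
        skills.put("1", 1);
        skills.put("2", 2);
        return skills;
    }

    public static void main(String[] args) {
        Map<String, Integer> skills = buildSkillsMap();
        System.out.println(skills); // prints {1=1, 2=2}

        // Sketch only (not verified here), assuming a live connection
        // `snappy` and the ComplexTypeSerializer API from the examples:
        //
        //   ComplexTypeSerializer serializer =
        //       ComplexTypeSerializer.create("APP.SUBSCRIBER_NEW14", "SKILLS", snappy);
        //   PreparedStatement ps = snappy.prepareStatement(
        //       "insert into APP.SUBSCRIBER_NEW14 (ID, SKILLS) values (?, ?)");
        //   ps.setInt(1, 1);
        //   ps.setBytes(2, serializer.serialize(skills));
        //   ps.execute();
    }
}
```

Serializing through `ComplexTypeSerializer` produces bytes the server can decode as a map, which avoids both the `XCL12` type-conversion error and the `SerializedMap.pointTo` assertion failure that raw `toString()` bytes in a Blob cause on read.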
