WSO2 BAM: REST stream input to BAM/Cassandra; can't get at the EVENT_KS data using a Hive query?
The background to this question is essentially an article by Sachini Jayasekara @ WSO2 called "Using different reporting frameworks with the WSO2 Business Activity Monitor". I am doing more or less exactly the same thing, except that I use the REST API to define the data stream and call the REST WS API to push data into BAM, and then fetch the data with a Hive query. However, it seems I am missing something, because the attribute data is not shown. Hence this question.
Currently I use the REST API, invoked from a Perl-based daemon. It calls the REST API with the following stream definition and payload:
{
  "name": "currentcostRealtime2.stream",
  "version": "1.0.6",
  "nickName": "Currentcost Realtime",
  "description": "This is the Currentcost realtime stream",
  "payloadData": [
    { "name": "sensor",    "type": "INT" },
    { "name": "temp",      "type": "FLOAT" },
    { "name": "timestamp", "type": "STRING" },
    { "name": "watt",      "type": "INT" }
  ]
}
..and the payload definition..
[
{
"payloadData" : [SENSOR, TEMP, "TIMESTAMP", WATT] ,
}
]
I should note that the payload is string-substituted before it is submitted; for example, the actual payload submitted looks like this:
[
{
"payloadData" : [1, 18.7, "2014-06-15 16:15:56", 1] ,
}
]
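The substitution step the Perl daemon performs can be sketched as follows (a minimal illustration in Python rather than Perl; the function name is hypothetical, and the field order simply mirrors the stream definition above):

```python
import json

def build_event_payload(sensor, temp, timestamp, watt):
    """Fill the payloadData template with concrete readings, in the same
    order as the stream definition: sensor, temp, timestamp, watt."""
    return json.dumps([{"payloadData": [sensor, temp, timestamp, watt]}])

# Produces the kind of body shown above, ready to POST to the BAM REST API.
body = build_event_payload(1, 18.7, "2014-06-15 16:15:56", 1)
print(body)
```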
That executes without obvious problems, but I now have an issue with the Hive query in BAM: it gives me the entries, but not the values. For example, I am currently trying to execute the following Hive query:
CREATE TABLE IF NOT EXISTS CurrentCostDataTemp ( sensor INT, temp FLOAT, ts TIMESTAMP, watt INT )
STORED BY 'org.apache.hadoop.hive.cassandra.CassandraStorageHandler'
WITH SERDEPROPERTIES ( "cassandra.host" = "127.0.0.1",
"cassandra.port" = "9160",
"cassandra.ks.name" = "EVENT_KS",
"cassandra.ks.username" = "admin",
"cassandra.ks.password" = "admin",
"cassandra.cf.name" = "currentcostRealtime2_stream",
"cassandra.columns.mapping" = "payload_sensor, payload_temp, payload_timestamp, payload_watt" );
select * from CurrentCostDataTemp;
..but this only gives the following (see the specific output below), i.e. no attribute-level data is shown. However, there are clearly four rows output for the given EVENT_KS entries.. so the question is how do I reference the data to extract the values, or is there something else going on that I am not aware of?
key sensor temp ts watt
1402816273765::192.168.1.106::9443::52
1402815283659::192.168.1.106::9443::51
1402815238323::192.168.1.106::9443::49
1402815280532::192.168.1.106::9443::50
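Incidentally, the row keys in that output appear to follow a timestamp::host::port::counter pattern; a small hypothetical parser (the pattern is inferred only from the four keys shown above, not from any documented BAM key format):

```python
def parse_event_key(key):
    """Split an EVENT_KS row key of the apparent form
    '<epoch-millis>::<host>::<port>::<counter>' into its parts."""
    ts, host, port, counter = key.split("::")
    return {"timestamp_ms": int(ts), "host": host,
            "port": int(port), "counter": int(counter)}

print(parse_event_key("1402816273765::192.168.1.106::9443::52"))
```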
That the data is in Cassandra has been verified by checking with cqlsh, see here:
cqlsh:EVENT_KS> select * from "currentcostRealtime_stream";
key | Description | Name | Nick_Name | StreamId | Timestamp | Version | meta_ipAdd | payload_sensor | payload_temp | payload_timestamp | payload_watt
----------------------------------------+-----------------------------------------+----------------------------+----------------------+----------------------------------+---------------+---------+------------+----------------+--------------+---------------------+--------------
1402815283659::192.168.1.106::9443::51 | This is the Currentcost realtime stream | currentcostRealtime.stream | Currentcost Realtime | currentcostRealtime.stream:1.0.5 | 1402815283659 | 1.0.5 | null | 1 | 18.7 | 2014-06-15 14:54:43 | 1
1402815238323::192.168.1.106::9443::49 | This is the Currentcost realtime stream | currentcostRealtime.stream | Currentcost Realtime | currentcostRealtime.stream:1.0.5 | 1402815238323 | 1.0.5 | null | 1 | 18.7 | 2014-06-15 14:53:58 | 1
1402815280532::192.168.1.106::9443::50 | This is the Currentcost realtime stream | currentcostRealtime.stream | Currentcost Realtime | currentcostRealtime.stream:1.0.5 | 1402815280532 | 1.0.5 | null | 1 | 18.7 | 2014-06-15 14:54:40 | 1
1402816273765::192.168.1.106::9443::52 | This is the Currentcost realtime stream | currentcostRealtime.stream | Currentcost Realtime | currentcostRealtime.stream:1.0.5 | 1402816273765 | 1.0.5 | null | 1 | 18.7 | 2014-06-15 15:11:13 | 1
(4 rows)
cqlsh:EVENT_KS>
It is most likely just something small I have overlooked, but if someone else has seen this issue and can respond, that would be great.
When adding a remote table definition pointing to an external MySQL DB, the tables and everything get created, so the problem seems to be specifically about getting at the attribute data in the EVENT_KS table itself, creating and accessing it via the Hive script.
Thanks in advance!
/Jorgen
[UPDATE - Thursday 19th - SOLVED] Got this working with the hints given. The code below now works fine, which is great.. many thanks to everyone for taking the time to respond.
drop table CurrentCostDataTemp10;
drop table CurrentCostDataTemp_Summary10;
CREATE EXTERNAL TABLE IF NOT EXISTS CurrentCostDataTemp10 ( messageRowID STRING, payload_sensor INT, messageTimestamp BIGINT, payload_temp FLOAT, payload_timestamp BIGINT, payload_timestampmysql STRING, payload_watt INT )
STORED BY 'org.apache.hadoop.hive.cassandra.CassandraStorageHandler'
WITH SERDEPROPERTIES ( "cassandra.host" = "127.0.0.1",
"cassandra.port" = "9160",
"cassandra.ks.name" = "EVENT_KS",
"cassandra.ks.username" = "<USER>",
"cassandra.ks.password" = "<PASSWORD>",
"cassandra.cf.name" = "currentcostsimple5_stream",
"cassandra.columns.mapping" = ":key, payload_sensor, Timestamp, payload_temp, payload_timestamp, payload_timestampmysql, payload_watt" );
CREATE EXTERNAL TABLE IF NOT EXISTS CurrentCostDataTemp_Summary10 ( messageRowID STRING, payload_sensor INT, messageTimestamp BIGINT, payload_temp FLOAT, payload_timestamp BIGINT, payload_timestampmysql STRING, payload_watt INT )
STORED BY 'org.wso2.carbon.hadoop.hive.jdbc.storage.JDBCStorageHandler'
TBLPROPERTIES (
'mapred.jdbc.driver.class' = 'com.mysql.jdbc.Driver',
'mapred.jdbc.url' = 'jdbc:mysql://127.0.0.1:8889/currentcost' ,
'mapred.jdbc.username' = '<USER>',
'mapred.jdbc.password' = '<PASSWORD>',
'hive.jdbc.update.on.duplicate'= 'true',
'hive.jdbc.primary.key.fields' = 'messageRowID',
'hive.jdbc.table.create.query' = 'CREATE TABLE CurrentCostDataTemp1 ( messageRowID VARCHAR(100) NOT NULL PRIMARY KEY, payload_sensor TINYINT(4), messageTimestamp BIGINT, payload_temp FLOAT, payload_timestamp BIGINT, payload_timestampmysql DATETIME, payload_watt INT ) ');
insert overwrite table CurrentCostDataTemp_Summary10 select messageRowID, payload_sensor, messageTimestamp, payload_temp, payload_timestamp, payload_timestampmysql, payload_watt FROM CurrentCostDataTemp10;
Try changing the first line of your script as follows:

CREATE EXTERNAL TABLE IF NOT EXISTS CurrentCostDataTemp ( key STRING, sensor INT, temp FLOAT, ts TIMESTAMP, watt INT )

(If you get an error, remove the key STRING part.)
Note: you may need to run DROP TABLE CurrentCostDataTemp first if you have already created the table.
I have modified your query as follows. Please try it out.
CREATE external TABLE IF NOT EXISTS CurrentCostDataTemp ( key string, sensor INT, temp FLOAT, ts TIMESTAMP, watt INT )
STORED BY 'org.apache.hadoop.hive.cassandra.CassandraStorageHandler'
WITH SERDEPROPERTIES ( "cassandra.host" = "127.0.0.1",
"cassandra.port" = "9160",
"cassandra.ks.name" = "EVENT_KS",
"cassandra.ks.username" = "admin",
"cassandra.ks.password" = "admin",
"cassandra.cf.name" = "currentcostRealtime2_stream",
"cassandra.columns.mapping" = ":key,payload_sensor, payload_temp, payload_timestamp, payload_watt" );
select * from CurrentCostDataTemp;