Drill to Hive connectivity error (org.apache.thrift.transport.TTransportException java.net.SocketException: Broken pipe (Write failed))

I am getting the following error while trying to connect Drill to Hive:

ERROR hive.log - Got exception: org.apache.thrift.transport.TTransportException java.net.SocketException: Broken pipe (Write failed) while trying to connect Drill to Hive. 

For Hive, I am using Microsoft Azure HDInsight (Ranger enabled) with a remote metastore (MS SQL Server). For Drill, I am using another VM that sits in the same VNet as the cluster. I was able to create the Drill storage plugin with the following configuration:

{
  "type": "hive",
  "enabled": true,
  "configProps": {
    "hive.metastore.uris": "thrift://hn0-xyz.cloudapp.net:9083,thrift://hn1-xyz.cloudapp.net:9083",
    "hive.metastore.warehouse.dir": "/hive/warehouse",
    "fs.default.name": "wasb://qwerty@demo.blob.core.windows.net",
    "hive.metastore.sasl.enabled": "false"
  }
}
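Before digging into the plugin itself, it may help to verify that the Drill VM can even open a TCP connection to the metastore's Thrift port. A minimal sketch (the hostnames are the ones from the storage-plugin config above; substitute your own):

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers DNS failures, refused connections, and timeouts alike.
        return False

# Hostnames taken from the storage-plugin config; replace with yours.
for host in ("hn0-xyz.cloudapp.net", "hn1-xyz.cloudapp.net"):
    print(host, "reachable" if port_open(host, 9083) else "NOT reachable")
```

If this reports NOT reachable from the Drill VM, the problem is network-level (firewall/NSG) rather than anything in the Drill or Hive configuration.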

Stack trace of the error:

17:57:19.515 [2779bbff-d7a9-058c-d133-b41795a0ee58:foreman] ERROR hive.log - Got exception: org.apache.thrift.transport.TTransportException java.net.SocketException: Broken pipe (Write failed)
org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed)
        at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:65) ~[drill-hive-exec-shaded-1.9.0.jar:1.9.0]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_get_all_databases(ThriftHiveMetastore.java:733) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_all_databases(ThriftHiveMetastore.java:726) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1031) ~[hive-metastore-1.2.1.jar:1.2.1]
        at org.apache.drill.exec.store.hive.DrillHiveMetaStoreClient.getDatabasesHelper(DrillHiveMetaStoreClient.java:205) [drill-storage-hive-core-1.9.0.jar:1.9.0]

core-site.xml:

<configuration>
  <property>
    <name>fs.azure.account.keyprovider.kkhdistore.blob.core.windows.net</name>
    <value>org.apache.hadoop.fs.azure.ShellDecryptionKeyProvider</value>
  </property>
  <property>
    <name>fs.azure.shellkeyprovider.script</name>
    <value>/usr/lib/python2.7/dist-packages/hdinsight_common/decrypt.sh</value>
  </property>
  <property>
    <name>fs.azure.account.key.kkhdistore.blob.core.windows.net</name>
    <value>{COPY FROM CLUSTER core-site.xml}</value>
  </property>
  <property>
    <name>fs.AbstractFileSystem.wasb.impl</name>
    <value>org.apache.hadoop.fs.azure.Wasb</value>
  </property>
</configuration>

According to the "Non-public ports" section of the official HDInsight documentation Ports and URIs used by HDInsight, and as explained in the note quoted below, I suspect the Hive you are using was installed manually on the Azure HDInsight cluster rather than being a Hive cluster type.

Some services are only available on specific cluster types. For example, HBase is only available on the HBase cluster type.

So for other machines such as Drill, the Hive metastore port 9083 is a non-public port, even if they are in the same VNet. The solution is to follow the documentation Extend HDInsight capabilities by using Azure Virtual Network and create a rule in the cluster's NSG that allows traffic on that port. Hope this helps.
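A sketch of such an NSG rule with the Azure CLI, assuming placeholder names (my-rg, hdinsight-nsg) and a placeholder private IP for the Drill VM (10.0.0.5); only port 9083 comes from the question:

```shell
# Allow the Drill VM to reach the Hive metastore port 9083 inside the VNet.
# my-rg, hdinsight-nsg, and 10.0.0.5 are placeholders -- substitute the
# resource group, NSG name, and Drill VM private IP from your deployment.
az network nsg rule create \
  --resource-group my-rg \
  --nsg-name hdinsight-nsg \
  --name allow-hive-metastore \
  --priority 300 \
  --direction Inbound \
  --access Allow \
  --protocol Tcp \
  --source-address-prefixes 10.0.0.5 \
  --destination-port-ranges 9083
```

Pick a priority that does not collide with the NSG rules HDInsight provisions for its own management traffic.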
