
Unable to read keystore file from pyspark

I am using the snippet below to connect and load data from Hive into Elasticsearch (v6.2) without any issue:

ADD JAR file:///<>/elasticsearch-hadoop-hive-6.2.2.jar;
ADD FILE file:///<>/mycerts.jks;

CREATE EXTERNAL TABLE if not exists my_db.my_es_table
(
col1 int,
col2 string,
col3 string,
col4 timestamp,
key_id string
)
COMMENT 'data into ES'
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES('es.resource' = 'index1/type1',
'es.index.auto.create'='true',
'es.nodes'='<vip_name>',
'es.port'='9200',
'es.net.http.auth.user'='<user>',
'es.net.http.auth.pass'='pwd',
'es.net.ssl.protocol'='SSL',
'es.net.ssl'='TRUE',
'es.net.ssl.truststore.location'='mycerts.jks',
'es.net.ssl.truststore.pass'='<pwd>',
'es.mapping.id'='key_id'
);

INSERT OVERWRITE TABLE my_db.my_es_table
SELECT
col1,
col2,
col3,
col4,
key_id
FROM my_db.stagging_data;

However, when I try to migrate the same piece to PySpark, it throws the exception:

   org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Expected to find keystore file at [file:///<path>/mycerts.jks] but was unable to. Make sure that it is available on the classpath, or if not, that you have specified a valid URI

Below is the snippet I tried for Spark:

df_delta = sqlContext.table('my_db.stagging_data')
status = df_delta.rdd.map(lambda row: (None, row.asDict())).saveAsNewAPIHadoopFile(
    path='-',
    outputFormatClass="org.elasticsearch.hadoop.mr.EsOutputFormat",
    keyClass="org.apache.hadoop.io.NullWritable",
    valueClass="org.elasticsearch.hadoop.mr.LinkedMapWritable",
    conf={
        'es.resource': 'index1/type1',
        'es.index.auto.create': 'true',
        'es.nodes': '<vip_name>',
        'es.port': '9200',
        'es.net.http.auth.user': '<user>',
        'es.net.http.auth.pass': '<pwd>',
        'es.net.ssl': 'true',
        'es.net.ssl.truststore.location': 'file:///<path>/mycerts.jks',
        'es.net.ssl.truststore.pass': '<pwd>',
        'es.mapping.id': 'key_id'
    })

I am invoking the shell with the following command:

pyspark --jars <path>/elasticsearch-spark-20_2.11-6.2.2.jar --py-files <path>/mycerts.jks

Below I am adding the whole log:

Caused by: org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Expected to find keystore file at [file:///<path>/mycerts.jks] but was unable to. Make sure that it is available on the classpath, or if not, that you have specified a valid URI.
        at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.loadKeyStore(SSLSocketFactory.java:193)
        at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.loadTrustManagers(SSLSocketFactory.java:224)
        at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.createSSLContext(SSLSocketFactory.java:171)
        ... 31 more

Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1609)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1597)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1596)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1596)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:831)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1830)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1779)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1768)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:642)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2034)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2055)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2087)
        at org.apache.spark.internal.io.SparkHadoopWriter$.write(SparkHadoopWriter.scala:78)
        ... 26 more
Caused by: org.apache.spark.SparkException: Task failed while writing rows
        at org.apache.spark.internal.io.SparkHadoopWriter$.org$apache$spark$internal$io$SparkHadoopWriter$$executeTask(SparkHadoopWriter.scala:155)
        at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$3.apply(SparkHadoopWriter.scala:83)
        at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$3.apply(SparkHadoopWriter.scala:78)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
        at org.apache.spark.scheduler.Task.run(Task.scala:109)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        ... 1 more
Caused by: org.elasticsearch.hadoop.EsHadoopIllegalStateException: Cannot initialize SSL - Expected to find keystore file at [file:///<path>/mycerts.jks] but was unable to. Make sure that it is available on the classpath, or if not, that you have specified a valid URI.
        at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.createSSLContext(SSLSocketFactory.java:173)
        at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.getSSLContext(SSLSocketFactory.java:158)
        at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.createSocket(SSLSocketFactory.java:127)
        at org.apache.commons.httpclient.HttpConnection.open(HttpConnection.java:707)
        at org.apache.commons.httpclient.HttpMethodDirector.executeWithRetry(HttpMethodDirector.java:387)
        at org.apache.commons.httpclient.HttpMethodDirector.executeMethod(HttpMethodDirector.java:171)
        at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:397)
        at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:323)
        at org.elasticsearch.hadoop.rest.commonshttp.CommonsHttpTransport.execute(CommonsHttpTransport.java:478)
        at org.elasticsearch.hadoop.rest.NetworkClient.execute(NetworkClient.java:112)
        at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:380)
        at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:344)
        at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:348)
        at org.elasticsearch.hadoop.rest.RestClient.get(RestClient.java:158)
        at org.elasticsearch.hadoop.rest.RestClient.getHttpNodes(RestClient.java:115)
        at org.elasticsearch.hadoop.rest.InitializationUtils.discoverNodesIfNeeded(InitializationUtils.java:92)
        at org.elasticsearch.hadoop.rest.RestService.createWriter(RestService.java:579)
        at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.init(EsOutputFormat.java:173)
        at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.write(EsOutputFormat.java:149)
        at org.apache.spark.internal.io.HadoopMapReduceWriteConfigUtil.write(SparkHadoopWriter.scala:356)
        at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$4.apply(SparkHadoopWriter.scala:130)
        at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$4.apply(SparkHadoopWriter.scala:127)
        at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1413)
        at org.apache.spark.internal.io.SparkHadoopWriter$.org$apache$spark$internal$io$SparkHadoopWriter$$executeTask(SparkHadoopWriter.scala:139)
        ... 8 more
Caused by: org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Expected to find keystore file at [file:///<path>/mycerts.jks] but was unable to. Make sure that it is available on the classpath, or if not, that you have specified a valid URI.
        at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.loadKeyStore(SSLSocketFactory.java:193)
        at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.loadTrustManagers(SSLSocketFactory.java:224)
        at org.elasticsearch.hadoop.rest.commonshttp.SSLSocketFactory.createSSLContext(SSLSocketFactory.java:171)
        ... 31 more

After connecting to py-spark I am able to read and print the jks file, so I cannot work out what is failing here. Can someone please suggest?

I think you are using the wrong option.

For Python, you can use the --py-files argument of spark-submit to add .py, .zip or .egg files to be distributed with your application.

Instead, you want --files:

--files FILES: Comma-separated list of files to be placed in the working directory of each executor. File paths of these files in executors can be accessed via SparkFiles.get(fileName)

To place the file at a different path on the executors, you can use a # separator:

spark-submit ... --files mycerts.jks#/<path>/mycerts.jks

In your code, you can get a reference to the path via SparkFiles.get("mycerts.jks"), which returns the absolute path to the file.
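
A minimal sketch of that lookup in PySpark (assuming the shell was launched with --files mycerts.jks; SparkFiles is part of the standard pyspark package):

from pyspark import SparkFiles

# Assumes the launch command shipped the truststore, e.g.:
#   pyspark --jars <path>/elasticsearch-spark-20_2.11-6.2.2.jar --files <path>/mycerts.jks
# SparkFiles.get resolves the bare file name to the absolute path
# of the distributed copy.
truststore_path = SparkFiles.get("mycerts.jks")
print(truststore_path)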

Going by your stack message, the <path> you used is evidently wrong. The path values you give after --jars and --py-files need to be exact, and apparently they currently are not:

Expected to find keystore file at [file:////mycerts.jks] but was unable to. Make sure that it is available on the classpath, or if not, that you have specified a valid URI.

As cricket_007 pointed out above, you are using the incorrect option --py-files. Use the --files option instead to upload your cert file.

Also, these files are not uploaded onto the executors' local file systems but onto HDFS. So the path you are passing for the cert file in your Spark code is incorrect as well, since it points to the local file system:

'es.net.ssl.truststore.location':'file:///<path>/mycerts.jks'

You can use the # separator with the --files option in the spark-submit command to upload the file to the specified path on HDFS:

spark-submit ... --files mycerts.jks#/<path>/mycerts.jks

Then use that same path in your Spark code to access the file:

'es.net.ssl.truststore.location':'/<path>/mycerts.jks'
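
Putting the answers together, here is a sketch of the corrected write. It assumes (mirroring the Hive table that worked) that the truststore is shipped with --files and referenced by its bare file name, which es-hadoop resolves against the classpath / executor working directory; <vip_name>, <user> and <pwd> remain placeholders from the question:

# Launch so the truststore lands in every executor's working directory:
#   pyspark --jars <path>/elasticsearch-spark-20_2.11-6.2.2.jar --files <path>/mycerts.jks

df_delta = sqlContext.table('my_db.stagging_data')
status = df_delta.rdd.map(lambda row: (None, row.asDict())).saveAsNewAPIHadoopFile(
    path='-',
    outputFormatClass="org.elasticsearch.hadoop.mr.EsOutputFormat",
    keyClass="org.apache.hadoop.io.NullWritable",
    valueClass="org.elasticsearch.hadoop.mr.LinkedMapWritable",
    conf={
        'es.resource': 'index1/type1',
        'es.index.auto.create': 'true',
        'es.nodes': '<vip_name>',
        'es.port': '9200',
        'es.net.http.auth.user': '<user>',
        'es.net.http.auth.pass': '<pwd>',
        'es.net.ssl': 'true',
        # Bare file name, as in the working Hive setup: resolved where
        # --files placed the truststore on each executor.
        'es.net.ssl.truststore.location': 'mycerts.jks',
        'es.net.ssl.truststore.pass': '<pwd>',
        'es.mapping.id': 'key_id'
    })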
