
Dockerfile cannot run cp command to move file inside container

Hi, I am trying to download a file inside a container and then move it to a specific location within the container.

RUN wget https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar 
RUN cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/
RUN cp /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
RUN echo "spark.hadoop.google.cloud.auth.service.account.enable true" >> /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf

But this fails with the following error:

Step 44/46 : RUN cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/
 ---> Running in 8c81d9871377
cp: cannot create regular file '/opt/spark-2.2.1-bin-hadoop2.7/jars/': No such file or directory
The command '/bin/sh -c cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/' returned a non-zero code: 1

EDIT-1: error screenshot

I have tried the solution mentioned below and now get the following error:

Removing intermediate container e885431017e8
Step 43/44 : COPY /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
lstat opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template: no such file or directory

Does the path /opt/spark-2.2.1-bin-hadoop2.7/jars/ already exist in your container?

If not, add this before the cp command:

mkdir -p /opt/spark-2.2.1-bin-hadoop2.7/jars/
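The failure mode above is easy to reproduce in any shell: cp refuses to create a missing target directory, while mkdir -p (which also creates missing parents) makes the copy succeed. A minimal sketch using throwaway paths under a temporary directory instead of the Spark install:

```shell
#!/bin/sh
# Set up a scratch area with a dummy jar file.
tmp=$(mktemp -d)
touch "$tmp/gcs-connector.jar"

# Copying into a directory that does not exist fails,
# just like the cp in the Docker build step.
cp "$tmp/gcs-connector.jar" "$tmp/jars/" 2>/dev/null \
    && echo "unexpected success" \
    || echo "cp failed as expected"

# Create the directory first, then retry the copy.
mkdir -p "$tmp/jars/"
cp "$tmp/gcs-connector.jar" "$tmp/jars/"
```

In a Dockerfile the same two steps would go into a RUN instruction before the failing cp.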

Then try copying it like this:

cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/gcs-connector-latest-hadoop2.jar

Edit:

You ran mkdir and then tried to copy from that folder; since the folder is empty, that cannot work!

Why not download the jar straight into that folder, and copy the template with cp inside a RUN step? COPY cannot be used for the second step, because COPY sources must come from the build context on the host, not from a path inside the image:

RUN wget https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar -P /opt/spark-2.2.1-bin-hadoop2.7/jars/
RUN cp /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf

Assumption: the folder /opt/spark-2.2.1-bin-hadoop2.7/jars/ exists.
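Putting the pieces together, the relevant portion of the Dockerfile might look like the sketch below. It assumes, as in the question, that Spark is installed at /opt/spark-2.2.1-bin-hadoop2.7; mkdir -p is a no-op if the jars directory already exists.

```dockerfile
# Download the connector straight into the jars directory,
# creating the directory first in case the base image lacks it.
RUN mkdir -p /opt/spark-2.2.1-bin-hadoop2.7/jars/ && \
    wget https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar \
         -P /opt/spark-2.2.1-bin-hadoop2.7/jars/

# The template already lives inside the image, so copy it with cp in a
# RUN step (COPY sources must come from the build context, not the image),
# then append the setting so the copied template is not overwritten.
RUN cp /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template \
       /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf && \
    echo "spark.hadoop.google.cloud.auth.service.account.enable true" \
         >> /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
```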

