Got an "unexpected EOF while looking for matching `"'" error when running on Mac
run_cmd="spark-submit \
$SPARK_OPTIONS \
--conf spark.hadoop.fs.default.name=file:/// \
--conf spark.hadoop.fs.defaultFS=file:/// \
--py-files \
${TARGET}/test.zip \
$TEST_PY \
$RAW_DATA_FILE \
$OUTPUT \
--route $AGG_OUTPUT1 \
--origin $AGG_OUTPUT2 \
--first $AGG_OUTPUT3" #line 71
echo $run_cmd
echo $run_cmd | bash
#line 75
The code is like the above; it runs successfully on Ubuntu. However, when I run it on my MacBook, spark-submit finishes normally and the output is also generated correctly, but then the script reports an error, which seems unreasonable. Also, if spark-submit exits abnormally, the error is not triggered.
./test.sh: line 71: unexpected EOF while looking for matching `"'
./test.sh: line 75: syntax error: unexpected end of file
You haven't posted all the relevant code, only some lines near 60~75. The error you are getting happens when you have an unclosed " somewhere before the posted code. For example:
a="
b="something"
If you run this script with bash, it will report:
script.sh: line 3: unexpected EOF while looking for matching `"'
script.sh: line 4: syntax error: unexpected end of file
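You can reproduce the misleading line numbers without executing anything, using bash's -n (syntax-check) mode; the file name script.sh below is just for illustration:

```shell
# Write the two-line example to a file, then parse it without executing it.
printf 'a="\nb="something"\n' > script.sh

# -n makes bash parse the script and report syntax errors, running nothing.
bash -n script.sh
echo "syntax check exit status: $?"   # non-zero because of the unclosed quote
```

Note that the error is attributed to the end of the file, not to the line where the stray quote actually lives, which is why the reported line numbers (71 and 75 in your script) point at the wrong place.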
As in your case, the error is reported not on the line with the unclosed ", but somewhere else. What happens is that Bash interprets the quoted part of the value of a as \nb= (the newline plus the text b=), and then the " after something opens a new quoted string that is never closed.
The same thing is happening in your code. Look for a " that isn't properly closed earlier in your script.
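Two quick ways to hunt it down (a suggestion beyond the original answer): syntax-check the file with bash -n, and flag lines containing an odd number of " characters. The quote counter is only a heuristic; escaped quotes (\") and legitimate multi-line strings, like the run_cmd="..." assignment in your script, get flagged too, so treat its output as candidates. ShellCheck, if you can install it, gives far more precise diagnostics.

```shell
# Parse-only check: reports the same EOF error without running the script.
bash -n test.sh

# Heuristic: print lines with an odd number of double quotes.
# awk splits each line on '"'; a line with N quotes yields N+1 fields,
# so an even field count means an odd (suspicious) quote count.
awk -F'"' 'NF % 2 == 0 {printf "line %d: %s\n", NR, $0}' test.sh
```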