Deploying a Spark cluster through docker-compose in Docker swarm mode
I am trying to deploy my Apache Spark cluster using a docker-compose file, defined here, across machines working in Docker swarm mode.
I invoke `docker stack deploy -c compose-file.yml spark_cluster` on my manager machine to deploy the services as defined, but when I run `docker stack ps spark_cluster` I see the following:
```
ID            NAME                       IMAGE                           NODE           DESIRED STATE  CURRENT STATE              ERROR                              PORTS
iy255fvx5ub8  spark_cluster_master.1     sauloricci/docker-spark:latest  manager-swarm  Running        Running 20 seconds ago
mrr6p9dmodh5   \_ spark_cluster_master.1 sauloricci/docker-spark:latest  worker2-swarm  Shutdown       Rejected 35 seconds ago    "invalid mount config for type…"
u1daipeekanv   \_ spark_cluster_master.1 sauloricci/docker-spark:latest  worker2-swarm  Shutdown       Rejected 40 seconds ago    "invalid mount config for type…"
9yup3zxpk4ur   \_ spark_cluster_master.1 sauloricci/docker-spark:latest  worker2-swarm  Shutdown       Rejected 45 seconds ago    "invalid mount config for type…"
is4dib7wmb61   \_ spark_cluster_master.1 sauloricci/docker-spark:latest  worker1-swarm  Shutdown       Rejected 50 seconds ago    "invalid mount config for type…"
y80py4s4hny8  spark_cluster_worker.1     sauloricci/docker-spark:latest  manager-swarm  Running        Running about a minute ago
```
The cluster seems to accept only the services running on the manager node, while rejecting those scheduled on the worker nodes.
How can I find the logs associated with this scenario? Where can I look at the logs related to this deployment? I would like to know what the message in the ERROR column actually means.
Use `docker stack ps <STACK> --no-trunc` to get the full text of the error message.
https://docs.docker.com/engine/reference/commandline/stack_ps/
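For example, against the stack shown in the question, the untruncated error and the per-task failure reason can be pulled out like this (the task ID is taken from the `docker stack ps` output above; the filter value is one possible way to narrow down to rejected tasks):

```shell
# Show full, untruncated task state and error messages for the stack
docker stack ps spark_cluster --no-trunc

# Narrow the listing to tasks that were shut down (including rejected ones)
docker stack ps spark_cluster --no-trunc --filter "desired-state=shutdown"

# Inspect the failure reason of one specific task by its ID
docker inspect mrr6p9dmodh5 --format '{{ .Status.Err }}'
```

With `--no-trunc`, the `"invalid mount config for type…"` message in the ERROR column expands to the full reason the swarm scheduler rejected the task on that node.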
Make sure your .yml file does not contain volumes, since each server has different local folders. This worked for me.
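A minimal sketch of the kind of compose fragment that triggers this error, assuming a bind-mount path like `/home/user/spark-data` (hypothetical; the asker's actual file is not shown). A bind mount's host path must exist on every node the task can be scheduled on; a named volume avoids that requirement because the engine creates it per node:

```yaml
services:
  master:
    image: sauloricci/docker-spark:latest
    volumes:
      # Rejected on any node where this host path does not exist:
      # - /home/user/spark-data:/data
      # Portable alternative: a named volume, created by the engine on each node
      - spark-data:/data

volumes:
  spark-data:
```

Alternatively, keep the bind mount but create the directory on every swarm node before deploying the stack.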