
Running Spark-Shell on Windows

I have downloaded Spark, sbt, Scala, and Git onto my Windows computer. When I try to run spark-shell in my command prompt, I get "Failed to find Spark assembly JAR. You need to build Spark with sbt\sbt assembly before running this program."

I tried to follow this guide: https://x86x64.wordpress.com/2015/04/29/installing-spark-on-windows/, but I don't have a build subfolder, so I am not sure if that is the problem.

Any help would be appreciated.

That's an old guide, for Spark 1.3.

Please use this guide instead to set up Spark on Windows:

http://www.ics.uci.edu/~shantas/Install_Spark_on_Windows10.pdf

That guide uses Maven whereas you are using sbt, but you will still be able to get spark-shell running by following it.
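
For what it's worth, the "Failed to find Spark assembly JAR" message usually means the download was a Spark source distribution, which has to be built first; a pre-built binary package already contains the assembly. Once spark-shell starts, a quick sanity check is to run a trivial job at the Scala prompt. A minimal sketch (nothing here is guide-specific; sc is the SparkContext that spark-shell pre-defines):

    // sc is the SparkContext that spark-shell creates for you on startup.
    // Parallelize a small local collection and run an action to confirm
    // that the installation can actually schedule and execute tasks.
    val data = sc.parallelize(1 to 100)
    println(data.sum())  // prints 5050.0 on a working installation

If that prints the expected sum with no stack traces, the installation is working and the assembly-JAR error is behind you.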
