
Using scala-eclipse for Spark

Could someone please help me with how to use the scala-eclipse IDE for Spark? I came across this link - http://syndeticlogic.net/?p=311 - but I am unable to follow it. I entered the command mvn -Phadoop2 eclipse:clean eclipse:eclipse inside the spark directory, and after a long list of downloads it gave me some errors. Please help. Thanks.

Below is the error I received:

Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM .......................... SUCCESS [5:22.386s]
[INFO] Spark Project Core ................................ SUCCESS [17:20.807s]
[INFO] Spark Project Bagel ............................... FAILURE [2.159s]
[INFO] Spark Project GraphX .............................. SKIPPED
[INFO] Spark Project ML Library .......................... SKIPPED
[INFO] Spark Project Streaming ........................... SKIPPED
[INFO] Spark Project Tools ............................... SKIPPED
[INFO] Spark Project Catalyst ............................ SKIPPED
[INFO] Spark Project SQL ................................. SKIPPED
[INFO] Spark Project Hive ................................ SKIPPED
[INFO] Spark Project REPL ................................ SKIPPED
[INFO] Spark Project Assembly ............................ SKIPPED
[INFO] Spark Project External Twitter .................... SKIPPED
[INFO] Spark Project External Kafka ...................... SKIPPED
[INFO] Spark Project External Flume ...................... SKIPPED
[INFO] Spark Project External ZeroMQ ..................... SKIPPED
[INFO] Spark Project External MQTT ....................... SKIPPED
[INFO] Spark Project Examples ............................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:15.115s
[INFO] Finished at: Wed May 07 15:27:51 GMT+05:30 2014
[INFO] Final Memory: 22M/81M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "hadoop2" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process (default) on project spark-bagel_2.10: Failed to resolve dependencies for one or more projects in the reactor. Reason: Missing:
[ERROR] ----------
[ERROR] 1) org.apache.spark:spark-core_2.10:jar:1.0.0-SNAPSHOT
[ERROR] 
[ERROR] Try downloading the file manually from the project website.
[ERROR] 
[ERROR] Then, install it using the command:
[ERROR] mvn install:install-file -DgroupId=org.apache.spark -DartifactId=spark-core_2.10 -Dversion=1.0.0-SNAPSHOT -Dpackaging=jar -Dfile=/path/to/file
[ERROR] 
[ERROR] Alternatively, if you host your own repository you can deploy the file there:
[ERROR] mvn deploy:deploy-file -DgroupId=org.apache.spark -DartifactId=spark-core_2.10 -Dversion=1.0.0-SNAPSHOT -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
[ERROR] 
[ERROR] Path to dependency:
[ERROR] 1) org.apache.spark:spark-bagel_2.10:jar:1.0.0-SNAPSHOT
[ERROR] 2) org.apache.spark:spark-core_2.10:jar:1.0.0-SNAPSHOT
[ERROR] 
[ERROR] ----------
[ERROR] 1 required artifact is missing.
[ERROR] 
[ERROR] for artifact:
[ERROR] org.apache.spark:spark-bagel_2.10:jar:1.0.0-SNAPSHOT
[ERROR] 
[ERROR] from the specified remote repositories:
[ERROR] maven-repo (http://repo.maven.apache.org/maven2, releases=true, snapshots=false),
[ERROR] apache-repo (https://repository.apache.org/content/repositories/releases, releases=true, snapshots=false),
[ERROR] jboss-repo (https://repository.jboss.org/nexus/content/repositories/releases, releases=true, snapshots=false),
[ERROR] mqtt-repo (https://repo.eclipse.org/content/repositories/paho-releases, releases=true, snapshots=false),
[ERROR] cloudera-repo (https://repository.cloudera.com/artifactory/cloudera-repos, releases=true, snapshots=false),
[ERROR] apache.snapshots (http://repository.apache.org/snapshots, releases=false, snapshots=true),
[ERROR] central (http://repo.maven.apache.org/maven2, releases=true, snapshots=false)
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-bagel_2.10

This is because there is no profile called hadoop2 in the pom.xml. The closest matches are hadoop-2.2, hadoop-2.3, etc.

You can run the following:

mvn -Phadoop-2.2 eclipse:clean eclipse:eclipse

Alternatively, you can run mvn help:all-profiles to list all available profiles and use one of them.
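
For example, assuming your cluster runs a Hadoop 2.3.x release (the exact version number below is only an illustration; substitute the one you actually use), you could combine the matching profile with an explicit hadoop.version property:

mvn -Phadoop-2.3 -Dhadoop.version=2.3.0 eclipse:clean eclipse:eclipse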

If you want to contribute to the Apache Spark project, then:

  • Go to the Spark home directory and run sbt/sbt eclipse
  • In Scala IDE, select File | Import | Existing Projects into Workspace
  • Select root directory: MY_SPARK_HOME
  • Select "Search for nested projects"
  • Select the projects that you want
  • Do not select "Copy projects into workspace"

If you want to use the Spark libraries in an application of your own, you can create a jar using the sbt/sbt assembly command and then add that jar as a library to your application project.
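
Once that assembly jar is on your application's build path, a minimal driver program could look like the sketch below. This is only an illustrative example: the application name, the master URL ("local[2]") and the input file path are assumptions you would replace with your own values.

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // Run Spark in-process with two threads; point this at your cluster master in a real deployment
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // Classic word count: split each line into words, count occurrences, print a few results
    val counts = sc.textFile("input.txt")
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.take(10).foreach(println)
    sc.stop()
  }
}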

Also refer to the Eclipse documentation here: https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-Eclipse
