
Uninstall Java 9 in High Sierra

I am trying to set up an environment for Apache Spark and found out it is incompatible with Java 9. (I regret not finding this out earlier.) I am not able to get Spark working or to get Java 9 uninstalled.

I tried both approaches, here and here.

None of these yielded any results.

If I run 'java --version' in my terminal, the following is the output:

java 9.0.4
Java(TM) SE Runtime Environment (build 9.0.4+11)
Java HotSpot(TM) 64-Bit Server VM (build 9.0.4+11, mixed mode)

My issue now is to uninstall Java 9, reinstall Java 8, and then reconfigure Spark.

Any leads/help on this?

Try using these commands. Go to /Library/Java/JavaVirtualMachines

and remove the jdk-9.0.4.jdk folder.

This should work for you.
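
A minimal sketch of those commands, assuming the default macOS install location and the jdk-9.0.4.jdk folder name from the question (the last line applies only after Java 8 has been reinstalled):

ls /Library/Java/JavaVirtualMachines                           # check which JDKs are installed
sudo rm -rf /Library/Java/JavaVirtualMachines/jdk-9.0.4.jdk    # remove the Java 9 JDK (needs admin rights)
java -version                                                  # verify; should no longer report 9.0.4
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)              # after reinstalling Java 8, point JAVA_HOME at it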

Spark is not compatible with Java 9 yet; you would need Hadoop 3.0 to use Java 9, but I have not seen a Spark build with Hadoop 3.0 yet. Your best option is to use a Docker container that already has Spark configured. I use this one: https://github.com/jupyter/docker-stacks/tree/master/pyspark-notebook ; there are many more on Docker Hub.
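
For example, a minimal way to run that image (assuming Docker is already installed; the image name and port follow the docker-stacks README):

docker pull jupyter/pyspark-notebook                       # fetch the prebuilt Spark + Jupyter image
docker run -it --rm -p 8888:8888 jupyter/pyspark-notebook  # start it and expose the notebook server

The container bundles its own JDK, so the Java 9 installed on the host does not affect Spark inside it; Jupyter prints a localhost URL with a token that you open in the browser.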
