
How to upgrade a jar file dependency within the Python 3.9 PySpark package?

How can I upgrade a jar file within a Python package? I need to upgrade to the latest version of log4j within PySpark.

/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/pyspark/jars/log4j-1.2.17.jar

I tried upgrading PySpark with the command below, since I have multiple Python versions installed. I am a novice at managing Python installations. Any pointers? Thanks.

python3.9 -m pip install pyspark --upgrade

(On macOS)

Currently, the latest version of pyspark (3.2.1, released 26 January 2022) ships with log4j-1.2.17.jar, i.e. the jar is bundled directly in the tar.gz that pip downloads, extracts, and installs. As such, it cannot be upgraded individually; there is no automated way to do it.
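
As a quick check, you can locate the bundled jars directory from the interpreter in question and see which log4j jar is present. This is only a sketch and assumes a standard pip-installed pyspark:

# Sketch: locate the jars bundled with the pyspark installation that this
# interpreter resolves, and list any log4j jars it ships with. Run it with
# the interpreter you care about, e.g. python3.9.
import os
import pyspark

print(pyspark.__version__)                                # which pyspark this interpreter uses
jars_dir = os.path.join(os.path.dirname(pyspark.__file__), "jars")
print(jars_dir)                                           # e.g. .../site-packages/pyspark/jars
print([f for f in os.listdir(jars_dir) if "log4j" in f])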

You might be able to simply replace the .jar file manually, but I would suspect that you could run into issues if the API of the newer version differs from the 1.2.17 one. I would also suspect that if a newer version were compatible, the developers of the package would probably have used it already.
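
If you do decide to try the manual swap, it amounts to something like the sketch below. The replacement jar path is a placeholder, and as noted above a jar with a different API (e.g. log4j 2.x) is unlikely to be a drop-in substitute for the bundled log4j 1.2.17:

# Rough sketch of the manual replacement described above; not an endorsed
# procedure. The replacement jar path is a placeholder.
import os
import shutil
import pyspark

jars_dir = os.path.join(os.path.dirname(pyspark.__file__), "jars")
old_jar = os.path.join(jars_dir, "log4j-1.2.17.jar")
new_jar = "/path/to/replacement-log4j.jar"    # placeholder: whatever jar you want to drop in

shutil.move(old_jar, old_jar + ".bak")        # keep a backup of the bundled jar
shutil.copy(new_jar, jars_dir)                # copy the replacement into pyspark/jars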
