
How to create a single executable file in Windows 10 with scrapy and pyinstaller?

I have created a scrapy spider and successfully converted it to a Windows executable using pyinstaller in one-folder mode (the dist folder).

In order to do that, I had to make some slight changes in the scrapy site-packages and add those packages to the dist folder; it works perfectly.

How can I make this into a single exe together with the mentioned scrapy packages from the dist folder?

I have already tried the --onefile option in pyinstaller, but it shows a scrapy error.

A very similar issue is discussed here: python scrapy conversion to exe file using pyinstaller

Initially I used the auto-py-to-exe package (which is actually a GUI for pyinstaller).
I added the following line to auto-py-to-exe -> Advanced Settings -> hidden import:

    scrapy.spiderloader,scrapy.statscollectors,scrapy.logformatter,scrapy.extensions,scrapy.extensions.corestats,scrapy.extensions.telnet,scrapy.extensions.memusage,scrapy.extensions.memdebug,scrapy.extensions.closespider,scrapy.extensions.feedexport,scrapy.extensions.logstats,scrapy.extensions.spiderstate,scrapy.extensions.throttle,scrapy.core.scheduler,scrapy.squeues,queuelib,scrapy.core.downloader,scrapy.downloadermiddlewares,scrapy.downloadermiddlewares.robotstxt,scrapy.downloadermiddlewares.httpauth,scrapy.downloadermiddlewares.downloadtimeout,scrapy.downloadermiddlewares.defaultheaders,scrapy.downloadermiddlewares.useragent,scrapy.downloadermiddlewares.retry,scrapy.downloadermiddlewares.ajaxcrawl,scrapy.downloadermiddlewares.redirect,scrapy.downloadermiddlewares.httpcompression,scrapy.downloadermiddlewares.cookies,scrapy.downloadermiddlewares.httpproxy,scrapy.downloadermiddlewares.stats,scrapy.downloadermiddlewares.httpcache,scrapy.spidermiddlewares,scrapy.spidermiddlewares.httperror,scrapy.spidermiddlewares.offsite,scrapy.spidermiddlewares.referer,scrapy.spidermiddlewares.urllength,scrapy.spidermiddlewares.depth,scrapy.pipelines,scrapy.dupefilters,scrapy.core.downloader.handlers.datauri,scrapy.core.downloader.handlers.file,scrapy.core.downloader.handlers.http,scrapy.core.downloader.handlers.s3,scrapy.core.downloader.handlers.ftp,scrapy.core.downloader.webclient,scrapy.core.downloader.contextfactory

After that, the following command appeared in the last text box (don't forget to change the path to your script):

    pyinstaller -y -F --hidden-import scrapy.spiderloader --hidden-import scrapy.statscollectors --hidden-import scrapy.logformatter --hidden-import scrapy.extensions --hidden-import scrapy.extensions.corestats --hidden-import scrapy.extensions.telnet --hidden-import scrapy.extensions.memusage --hidden-import scrapy.extensions.memdebug --hidden-import scrapy.extensions.closespider --hidden-import scrapy.extensions.feedexport --hidden-import scrapy.extensions.logstats --hidden-import scrapy.extensions.spiderstate --hidden-import scrapy.extensions.throttle --hidden-import scrapy.core.scheduler --hidden-import scrapy.squeues --hidden-import queuelib --hidden-import scrapy.core.downloader --hidden-import scrapy.downloadermiddlewares --hidden-import scrapy.downloadermiddlewares.robotstxt --hidden-import scrapy.downloadermiddlewares.httpauth --hidden-import scrapy.downloadermiddlewares.downloadtimeout --hidden-import scrapy.downloadermiddlewares.defaultheaders --hidden-import scrapy.downloadermiddlewares.useragent --hidden-import scrapy.downloadermiddlewares.retry --hidden-import scrapy.downloadermiddlewares.ajaxcrawl --hidden-import scrapy.downloadermiddlewares.redirect --hidden-import scrapy.downloadermiddlewares.httpcompression --hidden-import scrapy.downloadermiddlewares.cookies --hidden-import scrapy.downloadermiddlewares.httpproxy --hidden-import scrapy.downloadermiddlewares.stats --hidden-import scrapy.downloadermiddlewares.httpcache --hidden-import scrapy.spidermiddlewares --hidden-import scrapy.spidermiddlewares.httperror --hidden-import scrapy.spidermiddlewares.offsite --hidden-import scrapy.spidermiddlewares.referer --hidden-import scrapy.spidermiddlewares.urllength --hidden-import scrapy.spidermiddlewares.depth --hidden-import scrapy.pipelines --hidden-import scrapy.dupefilters --hidden-import scrapy.core.downloader.handlers.datauri --hidden-import scrapy.core.downloader.handlers.file --hidden-import scrapy.core.downloader.handlers.http --hidden-import scrapy.core.downloader.handlers.s3 --hidden-import scrapy.core.downloader.handlers.ftp --hidden-import scrapy.core.downloader.webclient --hidden-import scrapy.core.downloader.contextfactory "C:/path/script.py"

If after this your command returns ImportError: No module named 'modulename', add the missing module to the hidden imports and repeat the process with the new, extended list of hidden imports.
(I repeated this procedure 48 times in order to get a working exe file (and the list of submodules)!!)
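Rather than repeating the build dozens of times, the hidden-import list can be generated up front by walking the package with the standard library's pkgutil. A minimal sketch; it uses the stdlib email package as a self-contained stand-in, and you would pass "scrapy" instead in a real run:

```python
import pkgutil


def list_submodules(package_name):
    """Walk a package and return the dotted names of all its submodules,
    suitable for building PyInstaller's hidden-import list."""
    package = __import__(package_name)
    prefix = package.__name__ + "."
    return sorted(
        name for _, name, _ in pkgutil.walk_packages(package.__path__, prefix)
    )


# Demo with the stdlib 'email' package; substitute 'scrapy' in practice.
modules = list_submodules("email")
print(" ".join(f"--hidden-import {m}" for m in modules))
```

The printed string can be pasted directly after `pyinstaller -y -F`, replacing the manually discovered list above.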

Fixed it by using hiddenimports in the spec file. Pyinstaller doesn't support all second-level module imports in scrapy.

Run the pyinstaller command, then update the spec file with the hidden-import changes below:

    hiddenimports=['scrapy.spiderloader','scrapy.statscollectors','scrapy.logformatter','scrapy.extensions',
                   'scrapy.extensions.logstats','scrapy.extensions.corestats','scrapy.extensions.memusage',
                   'scrapy.extensions.feedexport','scrapy.extensions.memdebug','scrapy.extensions.closespider',
                   'scrapy.extensions.throttle','scrapy.extensions.telnet','scrapy.extensions.spiderstate',
                   'scrapy.core.scheduler','scrapy.core.downloader','scrapy.downloadermiddlewares',
                   'scrapy.downloadermiddlewares.robotstxt','scrapy.downloadermiddlewares.httpauth',
                   'scrapy.downloadermiddlewares.downloadtimeout','scrapy.downloadermiddlewares.defaultheaders',
                   'scrapy.downloadermiddlewares.useragent','scrapy.downloadermiddlewares.retry',
                   'scrapy.core.downloader.handlers.http','scrapy.core.downloader.handlers.s3',
                   'scrapy.core.downloader.handlers.ftp','scrapy.core.downloader.handlers.datauri',
                   'scrapy.core.downloader.handlers.file','scrapy.downloadermiddlewares.ajaxcrawl',
                   'scrapy.core.downloader.contextfactory','scrapy.downloadermiddlewares.redirect',
                   'scrapy.downloadermiddlewares.httpcompression','scrapy.downloadermiddlewares.cookies',
                   'scrapy.downloadermiddlewares.httpproxy','scrapy.downloadermiddlewares.stats',
                   'scrapy.downloadermiddlewares.httpcache','scrapy.spidermiddlewares',
                   'scrapy.spidermiddlewares.httperror','scrapy.spidermiddlewares.offsite',
                   'scrapy.spidermiddlewares.referer','scrapy.spidermiddlewares.urllength',
                   'scrapy.spidermiddlewares.depth','scrapy.pipelines','scrapy.dupefilters','queuelib',
                   'scrapy.squeues',]
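As an alternative to maintaining that list by hand, PyInstaller ships a hook utility, collect_submodules, that enumerates a package's submodules for you. A spec-file fragment sketching this approach (assumes PyInstaller and scrapy are installed in the build environment):

```python
# Fragment of the generated .spec file (a sketch, not the full spec):
from PyInstaller.utils.hooks import collect_submodules

# Pull in every scrapy submodule instead of listing them one by one;
# queuelib is added separately since it is a standalone dependency.
hiddenimports = collect_submodules('scrapy') + ['queuelib']
```

This keeps the spec file short and automatically covers new submodules when scrapy is upgraded.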

This fixed 45 module import issues. Using --onefile lets the scrapy project run as a single executable. I hope someone finds it useful.

Make your scrapy spider a Python script by following the updated docs!

Then run the usual pyinstaller command to build the executable (make sure you run it from inside your scrapy project):

    pyinstaller --onefile filename.py

