Scrapy crawl returns ModuleNotFoundError: No module named '_lzma'
I am currently experiencing a problem when trying to run a scrapy crawl in my dedicated virtual environment. Indeed, it returns the error quoted in the title of this topic.
So I start by checking the Python version.
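Something like this, assuming python3 on a Debian/Ubuntu-style system (the version shown is illustrative; the actual version isn't stated above):

    python3 --version
    # Python 3.10.6    <- illustrative output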
Then I create the virtual work environment.
I check that the directory is present.
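Roughly the following, using the standard venv module (the environment name "venv" is an assumption; the name actually used isn't stated):

    python3 -m venv venv          # create the virtual environment
    source venv/bin/activate      # activate it
    ls -d venv/                   # confirm the directory is there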
I take a look at pip list just to be sure.
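In a freshly created venv the listing should contain only the bootstrap packages, along these lines (illustrative output):

    pip list
    # Package    Version
    # ---------- -------
    # pip        22.0.2
    # setuptools 59.6.0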
I continue with pip install scrapy playwright.
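That is, inside the activated venv (scrapy version is a standard Scrapy subcommand and makes a quick sanity check):

    pip install scrapy playwright
    scrapy version    # should print the installed Scrapy version if the install succeeded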
I finally check that scrapy works properly with scrapy bench, which returns:
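The traceback ends with the error quoted in the title:

    scrapy bench
    # ...
    # ModuleNotFoundError: No module named '_lzma'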
So I first tried to install liblzma-dev and backports.lzma with:
    sudo apt-get install liblzma-dev
    pip install backports.lzma
which still ends up returning the same ModuleNotFoundError.
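For what it's worth, the failure can be reproduced without Scrapy at all: _lzma is the C extension backing the standard-library lzma module, so the import fails directly when the interpreter was built without liblzma support:

    python -c "import lzma"
    # ModuleNotFoundError: No module named '_lzma'
    # A common cause: Python was compiled (e.g. via pyenv) before liblzma-dev
    # was present; installing the headers afterwards requires rebuilding Python.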
Does anyone have a solution to this problem please?
OK, so I don't know why, but after completely uninstalling scrapy from my computer and reinstalling it only inside the venv, it worked.
Note that I already had scrapy installed outside the venv before, and it worked perfectly there.
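A sketch of what that amounted to, assuming the system-wide copy was also installed via pip (the exact commands aren't given above):

    deactivate                    # leave the venv if it is active
    pip uninstall scrapy          # remove the system-wide install
    source venv/bin/activate      # back into the dedicated venv
    pip install scrapy            # reinstall only inside the venv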
Edit:
OK, so after trying to recreate the bug, you will be happy to know that I succeeded :)
The only drawback is that, unlike before, I can't fix it despite having reproduced the exact same steps :$
Does anyone have an idea? Because I'm going in circles...