
AttributeError: type object 'spacy.syntax.nn_parser.array' has no attribute '__reduce_cython__' (adding paths to virtual environments)

Overall problem

I am working on an NLP project and want to use spaCy, but when trying to load the language for an nlp object, I keep running into an error:

AttributeError: type object 'spacy.syntax.nn_parser.array' has no attribute '__reduce_cython__'

Code:代码:

    import spacy

    nlp = spacy.load('en')  # this is the line that raises the AttributeError
    test = nlp('many people like laughing while they are running')
    for word in test:
        print(word.text, word.lemma)

I am not sure, but the problem could have something to do with the virtual environment I am working with. One solution I found suggested to "add the spaCy path to PYTHONPATH in virtualenv".

So my actual two questions are: 1) Where do you think my problem is? 2) If you think the problem has something to do with the virtual environment, how do I add the spaCy path to PYTHONPATH in virtualenv?

Thank you in advance for the help.

Background Info:

I am a beginner, so I don't know much about Stack Overflow, venvs, and what information you need to understand my problem. This is what I can give you:

I am following this tutorial: https://github.com/bhargavvader/personal/tree/master/notebooks/text_analysis_tutorial

My environment:

Operating System: Linux Mint 19.1 Cinnamon
Python Version Used: Python 3.7.1
spaCy Version Used: 2.1.3

I am using Python through Anaconda.

What I have done so far

1) I uninstalled and reinstalled spaCy.

2) I checked out the spaCy files.

As I understand it, this is the part of the error log where the mistake occurs:

    ----> 4 from .pipes import Tagger, DependencyParser, EntityRecognizer

So I looked through my spaCy folder to check out the pipes script, but I couldn't find a point where the functions Tagger, DependencyParser and EntityRecognizer called `__reduce_cython__`.
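Rather than reading the package source by hand, Python can report exactly which copy of a package it would import, which helps spot a stale or duplicate install. The helper below is a sketch (the name `locate_package` is mine, not spaCy's); it is demonstrated with the stdlib `json` module so it runs anywhere, but passing `"spacy"` shows which spaCy installation the interpreter actually resolves.

```python
import importlib.util

def locate_package(name):
    """Return the file path Python would import for *name*, or None if absent."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# Demonstrated with the stdlib 'json' module; replace "json" with
# "spacy" to see which spaCy copy your interpreter resolves.
print(locate_package("json"))
print(locate_package("definitely_not_installed"))  # -> None
```

If the printed path points somewhere other than the environment you think you are in, the import mismatch (not spaCy itself) is the likely culprit.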

3) I searched the internet for the error:

To my understanding, the similar questions that were asked did not help with my problem.

The only question that was similar to my problem is the following: https://github.com/explosion/spaCy/issues/2439

Their solution was "adding spaCy path to PYTHONPATH in virtualenv".

So I searched how to add paths to the Python path and found: How do I add a path to PYTHONPATH in virtualenv

Yet I don't quite understand the answers, and I am still not sure if that is even the problem. So if you know the answer to my problem, or could give me some guidance on how to continue figuring it out, I'd be relieved.
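For the "adding a path" part, a minimal sketch of what it means in practice: prepending a directory to `sys.path` at runtime has the same effect as putting it on PYTHONPATH, but only for the current process. The directory used below is a throwaway temp folder purely for demonstration; in practice you would pass the folder that contains the spaCy package.

```python
import sys
import tempfile

def add_to_path(directory):
    """Prepend *directory* to sys.path if it is not already there.

    This mimics what adding an entry to PYTHONPATH achieves,
    but only for the current interpreter process.
    """
    if directory not in sys.path:
        sys.path.insert(0, directory)

# Demonstrate with a throwaway directory; in practice, pass the
# directory that contains the spaCy package.
demo_dir = tempfile.mkdtemp()
add_to_path(demo_dir)
print(demo_dir in sys.path)  # True
```

For a persistent change inside a virtualenv, the usual approaches are exporting PYTHONPATH in the env's activate script, or dropping a `.pth` file listing the extra directory into the env's site-packages.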

Further information:

If it is of importance: when following the tutorial I mentioned earlier, I did run into the problem of not being able to download the requirements. This is what my terminal would give me:

    Could not open requirements file: [Errno 2] No such file or directory: 'REQUIREMENTS_1.txt'

I ignored it because everything worked smoothly at first.

Error log

    AttributeError                            Traceback (most recent call last)
    <ipython-input> in <module>
    ----> 1 nlp = spacy.load('en')
          2
          3 test = nlp('many people like laughing while they are running')
          4 for word in test:
          5     print(word.text,word.lemma)

    ~/anaconda3/lib/python3.7/site-packages/spacy/__init__.py in load(name, **overrides)
         13 from .glossary import explain
         14 from .about import __version__
    ---> 15 from .errors import Errors, Warnings, deprecation_warning
         16 from . import util
         17

    ~/anaconda3/lib/python3.7/site-packages/spacy/util.py in load_model(name, **overrides)
        110     """
        111     if isinstance(path, basestring_):
    --> 112         return Path(path)
        113     else:
        114         return path

    ~/anaconda3/lib/python3.7/site-packages/spacy/util.py in load_model_from_link(name, **overrides)
        127     if Path(name).exists():  # path to model data directory
        128         return load_model_from_path(Path(name), **overrides)
    --> 129     elif hasattr(name, "exists"):  # Path or Path-like to model data
        130         return load_model_from_path(name, **overrides)
        131     raise IOError(Errors.E050.format(name=name))

    ~/anaconda3/lib/python3.7/site-packages/spacy/data/en/__init__.py in load(**overrides)
         10
         11 def load(**overrides):
    ---> 12     return load_model_from_init_py(__file__, **overrides)

    ~/anaconda3/lib/python3.7/site-packages/spacy/util.py in load_model_from_init_py(init_file, **overrides)
        171 def load_model_from_init_py(init_file, **overrides):
        172     """Helper function to use in the load() method of a model package's
    --> 173     __init__.py.
        174
        175     init_file (unicode): Path to model's __init__.py, i.e. __file__.

    ~/anaconda3/lib/python3.7/site-packages/spacy/util.py in load_model_from_path(model_path, meta, **overrides)
        141     return cls.load(**overrides)
        142
    --> 143
        144 def load_model_from_package(name, **overrides):
        145     """Load a model from an installed package."""

    ~/anaconda3/lib/python3.7/site-packages/spacy/util.py in get_lang_class(lang)
         48     """
         49     global LANGUAGES
    ---> 50     return lang in LANGUAGES
         51
         52

    ~/anaconda3/lib/python3.7/importlib/__init__.py in import_module(name, package)
        125             break
        126         level += 1
    --> 127     return _bootstrap._gcd_import(name[level:], package, level)
        128
        129

    ~/anaconda3/lib/python3.7/importlib/_bootstrap.py in _gcd_import(name, package, level)

    ~/anaconda3/lib/python3.7/importlib/_bootstrap.py in _find_and_load(name, import_)

    ~/anaconda3/lib/python3.7/importlib/_bootstrap.py in _find_and_load_unlocked(name, import_)

    ~/anaconda3/lib/python3.7/importlib/_bootstrap.py in _load_unlocked(spec)

    ~/anaconda3/lib/python3.7/importlib/_bootstrap_external.py in exec_module(self, module)

    ~/anaconda3/lib/python3.7/importlib/_bootstrap.py in _call_with_frames_removed(f, *args, **kwds)

    ~/anaconda3/lib/python3.7/site-packages/spacy/lang/en/__init__.py in <module>
         13 from ..tokenizer_exceptions import BASE_EXCEPTIONS
         14 from ..norm_exceptions import BASE_NORMS
    ---> 15 from ...language import Language
         16 from ...attrs import LANG, NORM
         17 from ...util import update_exc, add_lookups

    ~/anaconda3/lib/python3.7/site-packages/spacy/language.py in <module>
         15 from .vocab import Vocab
         16 from .lemmatizer import Lemmatizer
    ---> 17 from .pipeline import DependencyParser, Tensorizer, Tagger, EntityRecognizer
         18 from .pipeline import SimilarityHook, TextCategorizer, Sentencizer
         19 from .pipeline import merge_noun_chunks, merge_entities, merge_subtokens

    ~/anaconda3/lib/python3.7/site-packages/spacy/pipeline/__init__.py in <module>
          2 from __future__ import unicode_literals
          3
    ----> 4 from .pipes import Tagger, DependencyParser, EntityRecognizer
          5 from .pipes import TextCategorizer, Tensorizer, Pipe, Sentencizer
          6 from .entityruler import EntityRuler

    pipes.pyx in init spacy.pipeline.pipes()

    ~/anaconda3/lib/python3.7/site-packages/spacy/syntax/nn_parser.cpython-37m-x86_64-linux-gnu.so in init spacy.syntax.nn_parser()

    AttributeError: type object 'spacy.syntax.nn_parser.array' has no attribute '__reduce_cython__'

Answer: If you are running the code on Google Colab, change the runtime to GPU and then try installing spaCy again.
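If a reinstall is needed (on Colab or locally), one common pitfall is that a bare `pip` on the command line may target a different interpreter than the notebook uses. A hedged sketch: build the command around `sys.executable -m pip`, so the reinstall lands in the environment that is actually failing. The helper name `pip_reinstall_cmd` is mine, not part of any library.

```python
import sys

def pip_reinstall_cmd(package):
    """Build a pip command that targets the *current* interpreter.

    Using ``sys.executable -m pip`` ensures the package is reinstalled
    into the same environment the notebook runs in, which is the usual
    cause of mismatched-binary errors like this __reduce_cython__ one.
    """
    return [sys.executable, "-m", "pip", "install",
            "--force-reinstall", "--no-cache-dir", package]

cmd = pip_reinstall_cmd("spacy")
print(" ".join(cmd))
# run it with: subprocess.check_call(cmd)
```

The command is printed rather than executed here; running it (and then restarting the kernel) is what actually replaces the broken compiled extension.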

