
How to train a Word2Vec model on a Wikipedia page using gensim?

After reading this article, I started training my own model. The problem is that the author is not clear about what the sentences argument to Word2Vec should look like.

I downloaded the text from a Wikipedia page, as the article does, and built a list of sentences from it:

sentences = [word for word in wikipage.content.split('.')]

So, for example, sentences[0] looks like:

'Machine learning is the subfield of computer science that gives computers the ability to learn without being explicitly programmed'

Then I tried to train a model on this list:

model = Word2Vec(sentences, min_count=2, size=50, window=10,  workers=4)

But the model's vocabulary consists of single characters! For example, the output of model.wv.vocab.keys() is:

dict_keys([',', 'q', 'D', 'B', 'p', 't', 'o', '(', ')', '0', 'V', ':', 'j', 's', 'R', '{', 'g', '-', 'y', 'c', '9', 'I', '}', '1', 'M', ';', '`', '\n', 'i', 'r', 'a', 'm', '–', 'v', 'N', 'h', '/', 'P', 'F', '8', '"', '’', 'W', 'T', 'u', 'U', '?', ' ', 'n', '2', '=', 'w', 'C', 'O', '6', '&', 'd', '4', 'S', 'J', 'E', 'b', 'L', '$', 'l', 'e', 'H', '≈', 'f', 'A', "'", 'x', '\\', 'K', 'G', '3', '%', 'k', 'z'])

What am I doing wrong? Thanks in advance!

Word2Vec expects an iterable of tokenized sentences (a list of lists of words); when you pass it plain strings, it iterates over the individual characters, which is why your vocabulary ends up as single letters. Using nltk's tokenizers, you can build that list of lists of words as input for the Word2Vec model object:

>>> import wikipedia
>>> from nltk import sent_tokenize, word_tokenize
>>> page = wikipedia.page('machine learning')
>>> sentences = [word_tokenize(sent) for sent in sent_tokenize(page.content)]
>>> sentences[0]
['Machine', 'learning', 'is', 'the', 'subfield', 'of', 'computer', 'science', 'that', 'gives', 'computers', 'the', 'ability', 'to', 'learn', 'without', 'being', 'explicitly', 'programmed', '.']

And feed it in:

>>> from gensim.models import Word2Vec
>>> model = Word2Vec(sentences, min_count=2, size=50, window=10, workers=4)
>>> list(model.wv.vocab.keys())[:10]
['sparsely', '(', 'methods', 'their', 'typically', 'information', 'assessment', 'False', 'often', 'problems']
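
Once trained, the model can be queried directly. A minimal sketch, assuming the training above finished and that 'learning' and 'machine' made it into the vocabulary:

>>> model.wv.most_similar('learning', topn=3)  # nearest neighbours; exact output varies between runs
>>> model.wv['machine'][:5]                    # first 5 dimensions of the learned word vector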

But in general, an iterable of sentences in which each sentence is itself a generator of words also works, i.e.:

>>> from gensim.utils import tokenize
>>> paragraphs = map(tokenize, page.content.split('\n')) # paragraphs
>>> model = Word2Vec(paragraphs, min_count=2, size=50, window=10,  workers=4)
>>> list(model.wv.vocab.keys())[:10]
['sparsely', 'methods', 'their', 'typically', 'information', 'assessment', 'False', 'often', 'problems', 'symptoms']
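
Note that in Python 3, map() returns a one-shot iterator, and Word2Vec needs to iterate over the corpus more than once (once to build the vocabulary and again for each training epoch), so it is safer to materialize the paragraphs as a list. A minimal sketch of the same approach, assuming the page object from above:

>>> paragraphs = [list(tokenize(par)) for par in page.content.split('\n')]  # list of token lists, reusable across passes
>>> model = Word2Vec(paragraphs, min_count=2, size=50, window=10, workers=4)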
