
How Does Latent Semantic Analysis Handle Semantics?

I have read about the LSA method. It is said that LSA can be used for semantic analysis, but I cannot understand how this works. Can anyone please explain how LSA handles semantics?

Are you familiar with the vector space model (VSM)?

In LSA you can compute document similarity as well as type (i.e., word) similarity just as you would with the traditional VSM. That is, you compute the cosine between two type vectors or two document vectors (LSA actually also lets you compute type-document similarity).

The problem with the plain VSM is that the cosine similarity of two documents that do not share a single word is 0, even if the documents discuss the same topic in different vocabulary.
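A minimal sketch of this failure mode (the tiny vocabulary and counts are made up for illustration): two documents about the same topic, one using "car" and the other "auto", get zero cosine similarity in the raw vector space model because their term vectors have no overlapping nonzero entries.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Vocabulary: ["car", "auto", "flower"] (toy example).
# doc1 mentions only "car", doc2 only "auto" -- related topic, no shared words.
doc1 = np.array([2.0, 0.0, 0.0])
doc2 = np.array([0.0, 3.0, 0.0])

print(cosine(doc1, doc2))  # 0.0 -- the plain VSM sees no similarity at all
```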

In LSA, the singular value decomposition (SVD) of the term-document matrix reveals latent semantic dimensions. Projecting documents onto these dimensions lets you compute a nonzero cosine similarity between documents that have no words in common but share underlying characteristics, because terms that co-occur across the corpus are mapped onto the same latent dimensions.
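The following sketch (with a made-up three-document corpus) shows the effect. "car" and "auto" never co-occur in doc1 and doc2, but a third "bridge" document uses both, so a rank-1 truncated SVD places doc1 and doc2 near each other in the latent space:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Term-document matrix X: rows = terms ["car", "auto"], columns = documents.
# doc3 is a "bridge" document containing both terms.
X = np.array([[1.0, 0.0, 1.0],   # car
              [0.0, 1.0, 1.0]])  # auto

# Raw VSM: doc1 and doc2 share no words, so their cosine is 0.
print(cosine(X[:, 0], X[:, 1]))  # 0.0

# LSA: truncated SVD keeping k latent dimensions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 1
# Document coordinates in the latent space: rows of (S_k @ Vt_k)^T.
docs_latent = (np.diag(s[:k]) @ Vt[:k]).T

# In the latent space, doc1 and doc2 are now highly similar,
# because "car" and "auto" load on the same latent dimension.
print(cosine(docs_latent[0], docs_latent[1]))
```

With real corpora you would weight the counts (e.g., tf-idf) and keep a few hundred dimensions rather than one; the principle is the same.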
