
Report Reindex taking too long after destroy

I have a product report that lists all registered products. When I destroy (delete) one of the items from the product list, I need that item to be removed from the report list as well. I use Sunspot Solr with MySQL. I tried the following:

after_destroy { ProductsReport.reindex; Sunspot.commit }

But because of my gigantic list of products, it takes too long to execute. Is there a simpler or better-performing way to do it?

By the way, my system is built in Ruby on Rails. Thanks in advance.

You may very well be able to optimize this operation, but the details of how to do it depend on your data model and your Solr setup. I also question whether a full reindex is needed on each delete. Can you just delete the Solr document for the deleted record?
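A minimal sketch of that idea: in the `after_destroy` callback, remove only the deleted record's document instead of calling `ProductsReport.reindex`. The `FakeSunspot` module below is an in-memory stand-in so the sketch runs outside Rails; the real gem's `Sunspot.index`, `Sunspot.remove`, and `Sunspot.commit` calls have the same shape.

```ruby
# Tiny in-memory stand-in for the Sunspot client (illustrative only),
# so this sketch is runnable outside a Rails app.
module FakeSunspot
  INDEX = {}

  def self.index(doc)
    INDEX[doc.id] = doc
  end

  def self.remove(doc)
    INDEX.delete(doc.id)
  end

  def self.commit
    # a real commit flushes pending changes to Solr; a no-op here
  end
end

Product = Struct.new(:id, :name) do
  # What the after_destroy callback would do: drop only this record's
  # document instead of rebuilding the whole index.
  def destroy
    FakeSunspot.remove(self)
    FakeSunspot.commit
  end
end

a = Product.new(1, "widget")
b = Product.new(2, "gadget")
FakeSunspot.index(a)
FakeSunspot.index(b)

a.destroy
puts FakeSunspot::INDEX.keys.inspect  # => [2]
```

The cost of the callback is now proportional to one document, not to the size of the product catalog.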

Regardless, I recommend updating your search cluster asynchronously using a queueing service. Popular options for Rails apps include DelayedJob and Resque.
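The async pattern can be sketched with a toy job queue standing in for DelayedJob or Resque: the request only enqueues the work, and a separate worker performs the slow Solr update later. The queue and worker here are illustrative stand-ins, not the real libraries' APIs.

```ruby
# Toy job queue: in a real app this role is played by DelayedJob/Resque.
JOBS = Queue.new

# Fast: runs inside the request; just records what needs doing.
def enqueue_removal(product_id)
  JOBS << product_id
end

# Slow work happens out of band, in a worker process. The `removed_ids`
# array stands in for the side effect on Solr.
def run_worker(removed_ids)
  until JOBS.empty?
    id = JOBS.pop
    # A real job would call something like
    # Sunspot.remove_by_id('Product', id) followed by Sunspot.commit.
    removed_ids << id
  end
end

enqueue_removal(42)
enqueue_removal(7)

removed = []
run_worker(removed)
puts removed.inspect  # => [42, 7]
```

The user's request returns as soon as the job is enqueued; Solr consistency lags by however long the worker takes, which is usually an acceptable trade-off for a reporting index.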

The previous answer is correct: instead of reindexing, you should remove only the document in question from Solr. There is no need to reindex all the documents if only one document has changed.

In Sunspot you can do this with Sunspot.remove(doc).
