
Existing Postgres Database vs Solr

We have an app that uses a Postgres database with about 50 tables. Each table contains about 3 million records on average, and the tables get updated with new data every now and then. Now we want to implement a search feature in our app. The search needs to be performed on one table at a time (no joins needed).

I've read about Postgres's full-text search support, and it looks promising. But it seems that Solr is much faster in comparison. Can I use my existing Postgres database with Solr? If tables get updated, would I need to re-index everything again?

It is definitely worth giving Solr a try. We moved many MySQL queries involving JOINs on multiple tables with sorting on different fields to Solr. We are very happy with Solr's search speed, sort speed, faceting capabilities and highly configurable text analysis/tokenization options.

If tables get updated would I need to re-index everything again?

No, you can run delta imports to re-index only your new and updated documents. See https://wiki.apache.org/solr/DataImportHandler .
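As a rough sketch of what a delta import looks like: the DataImportHandler is configured in a `data-config.xml`, where a `deltaQuery` selects the primary keys of rows changed since the last import and a `deltaImportQuery` fetches those rows. The table, column, and connection details below are illustrative placeholders, not from your schema.

```xml
<!-- data-config.xml sketch; table/column/connection names are placeholders -->
<dataConfig>
  <dataSource driver="org.postgresql.Driver"
              url="jdbc:postgresql://localhost:5432/mydb"
              user="solr" password="secret"/>
  <document>
    <entity name="item" pk="id"
            query="SELECT id, title, body FROM items"
            deltaQuery="SELECT id FROM items
                        WHERE updated_at &gt; '${dataimporter.last_index_time}'"
            deltaImportQuery="SELECT id, title, body FROM items
                              WHERE id = '${dataimporter.delta.id}'">
      <field column="id"    name="id"/>
      <field column="title" name="title"/>
      <field column="body"  name="body"/>
    </entity>
  </document>
</dataConfig>
```

You then trigger it with `/dataimport?command=delta-import` (for example from a cron job), and only the changed rows are re-indexed.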

Get started with https://lucene.apache.org/solr/4_1_0/tutorial.html and all the links in there.

Since nobody has leapt in, I'll answer.

I'm afraid it all depends. It depends on (at least)

  • how big the text is in each "document"
  • how flexible you want your searching to be
  • how much integration you need between database and text-search
  • how fast is fast enough
  • how much experience you have with both

When I've had a database that needed some text searching, I've just used PG's built-in options. If I hadn't had superuser access to the db, or had already been running a big Java setup, then Solr might well have appealed.
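For a sense of what PG's built-in option looks like for your single-table case, here is a minimal sketch: precompute a `tsvector` column, index it with GIN, and query with `@@`. The table and column names (`items`, `title`, `body`) are assumptions for illustration.

```sql
-- Minimal Postgres full-text sketch; table/column names are illustrative.
ALTER TABLE items ADD COLUMN search_vec tsvector;

UPDATE items SET search_vec =
  to_tsvector('english', coalesce(title, '') || ' ' || coalesce(body, ''));

CREATE INDEX items_search_idx ON items USING GIN (search_vec);

-- Search one table at a time, ranked by relevance:
SELECT id, title, ts_rank(search_vec, q) AS rank
FROM items, to_tsquery('english', 'postgres & search') AS q
WHERE search_vec @@ q
ORDER BY rank DESC
LIMIT 20;
```

With a GIN index and a few million rows per table, this is often fast enough; you would also want a trigger (e.g. `tsvector_update_trigger`) to keep `search_vec` current as rows change.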
