
GCP Datastore vs Search API performance benchmarks?

Are there any existing benchmarks about GCP Datastore queries and Search queries performance?

I'm interested in how the performance changes as the data grows. For instance, if we have:

class Project(ndb.Model):
    members = ndb.StringProperty(repeated=True)

and we have a document in Search like:

search.Document(fields=[search.AtomField(name='member', value='value...'), ...])

I want to run a query to get the keys of all projects a given user is a member of. Something like:

Project.query(Project.members == 'This Member').fetch(keys_only=True)

in Datastore, and a similar query in Search.

How would the performance compare when there are 10, 100, ... 10 ** 6 objects?

I'm interested in whether there is a rule of thumb about the latency to expect for this simple kind of query. Of course I could go and try it myself, but I would like to get an intuitive idea of the performance beforehand, in case someone has already run similar benchmarks. I would also like to avoid spending money and time writing/reading data I would later need to delete, so if someone could share their experience, it would be much appreciated!

PS: I use Python, but I assume the answer would be the same or similar for all languages that have GCP support.

As of now, the Search API is only supported on Python 2. Unfortunately, that version of Python is no longer supported, so keep in mind that you will not be able to receive support for this service.

On the other hand, take a look at the code provided in this thread; it can give you an idea of how to run a benchmark test for Datastore using Python 3.
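As a rough starting point, here is a minimal sketch of such a benchmark: a generic timing helper plus the keys-only `members` filter expressed with the `google-cloud-datastore` client (Python 3). The project ID, kind, and member value are placeholders, and the percentiles reported are only illustrative of what you might measure.

```python
import statistics
import time


def benchmark(fn, runs=20):
    """Call fn() repeatedly and return latency percentiles in milliseconds."""
    latencies = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        latencies.append((time.perf_counter() - start) * 1000.0)
    latencies.sort()
    return {
        "p50": statistics.median(latencies),
        "p95": latencies[int(len(latencies) * 0.95) - 1],
        "max": latencies[-1],
    }


def keys_only_query():
    # Hypothetical setup: pip install google-cloud-datastore, and replace
    # "my-project" / "This Member" with your own project ID and value.
    from google.cloud import datastore

    client = datastore.Client(project="my-project")
    query = client.query(kind="Project")
    query.add_filter("members", "=", "This Member")
    query.keys_only()  # fetch keys only, mirroring ndb's keys_only=True
    return list(query.fetch())


# stats = benchmark(keys_only_query)
# print(stats)
```

Repeating this at each data size (10, 100, ... 10 ** 6 entities) gives the growth curve you are after; just remember that the first few calls include connection setup, so warm up before measuring.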
