
Google App Engine NDB Queries and exceeding memory

I have a Google App Engine datastore kind called "Obj" and it has close to 500K entities in production. I'm trying to query just 50 Obj entities, but even though I'm setting the limit argument to 50, the query eventually throws the error "Exceeded soft private memory limit".

Would this have something to do with the use of ndb.GenericProperty in the query? The attribute "trashed_date", which is a datetime type, is not normally an attribute of Obj. I've also manually created the correct index for status and trashed_date. Should "trashed_date" always be a property of that model?

Below is the code I'm using. What can I do so that querying just 50 entities doesn't exceed the memory limit?

from google.appengine.ext import ndb

# Query up to 50 Obj entities with status == 1 whose "trashed_date"
# (accessed via GenericProperty, since it is not declared on the model)
# is older than expire_date.
q = Obj.query(
    Obj.status == 1,
    ndb.GenericProperty('trashed_date') < expire_date
)
results = q.fetch(50)

Try using q.iter() with a counter to limit it to 50. I had a similar problem with fetch() and fixed it by switching to iter(). GAE is pretty strongly advising against fetch() now. YMMV. HTH. -stevep
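
For reference, here is a minimal sketch of the iter()-with-a-counter approach described above. The helper name fetch_trashed is hypothetical, and Obj and expire_date are assumed to be defined as in the question:

def fetch_trashed(limit=50):
    # Same filters as in the question; GenericProperty is used because
    # trashed_date is not a declared property of Obj.
    q = Obj.query(
        Obj.status == 1,
        ndb.GenericProperty('trashed_date') < expire_date
    )
    results = []
    # iter() streams entities in small batches instead of materializing
    # the whole result set at once, which keeps peak memory low even
    # when the kind has hundreds of thousands of entities.
    for entity in q.iter(batch_size=50):
        results.append(entity)
        if len(results) >= limit:
            break
    return results

Passing limit=50 to iter() directly should also work; the explicit counter simply mirrors the comment's suggestion.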
