Google App Engine: Efficient Datastore Read/Write Operations to Save Quota
I've created a Google App Engine application using Python. The application deals with a lot of user names.
It has a database of about 50K usernames. Each user name has a unique hash value, which is also stored in the datastore.
When an app user submits a user name, the application first checks whether that name already exists in the database.
If it's a new user name, the application calculates a new hash for it and stores both the name and the hash in the datastore.
If the user name already exists in the datastore, it retrieves the existing hash from the datastore.
Sample code:
class Names(db.Model):
    name = db.StringProperty(required=True)
    hash = db.StringProperty(required=True)

username = "debasish"
user_db = db.GqlQuery("SELECT * FROM Names WHERE name = :1", username)
user = user_db.get()
if user is None:
    # Doesn't exist in the DB, so calculate a new hash for the name and store it.
    e = Names(name=username, hash="badasdbashdbhasbdasbdbjasbdjbasjdbasbdbasjdbjasbd")
    e.put()
else:
    # Retrieve the existing hash.
    self.response.out.write('{"name":"' + user.name + '","hash":"' + user.hash + '"}')
The problem I'm facing is GAE's free datastore read-operation quota: it gets exceeded very quickly and my application stops working.
I've also tried to implement memcache by adding the entire database to memcache, like this, but that was also a failure; the results were even worse.
def get_fresh_all(self):
    all_names = db.GqlQuery("SELECT * FROM Names")
    memcache.add('full_db', all_names, 3600)
    return all_names
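One possible reason this attempt made things worse: `memcache.add('full_db', all_names, 3600)` stores the `GqlQuery` object rather than its results, so later accesses can still run the query against the datastore. Below is a minimal runnable sketch of the intended pattern, caching a plain `{name: hash}` dict; the two dicts standing in for the `Names` kind and for `google.appengine.api.memcache` are assumptions for illustration only.

```python
# Plain dicts stand in for the Names kind and for the memcache service.
fake_datastore = {"debasish": "hash-of-debasish"}
fake_memcache = {}

def get_fresh_all():
    """Return a cached {name: hash} dict, rebuilding it on a cache miss.

    Caching a picklable dict of fetched results (rather than the live
    query object) is what actually keeps later reads off the datastore.
    """
    mapping = fake_memcache.get('full_db')
    if mapping is None:
        # Real GAE code would be roughly:
        #   rows = db.GqlQuery("SELECT * FROM Names").fetch(1000)
        #   mapping = dict((r.name, r.hash) for r in rows)
        mapping = dict(fake_datastore)
        fake_memcache['full_db'] = mapping  # memcache.add('full_db', mapping, 3600)
    return mapping

print(get_fresh_all()["debasish"])  # → hash-of-debasish
```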
So, could you please suggest: am I doing something wrong? How can I make datastore read operations more efficient?
Thanks in advance.
You can:

- Use a keys-only query, e.g. SELECT __key__ FROM Names WHERE name = :1 — keys-only queries are billed as small datastore operations rather than full read operations.
- Use the username as the entity's key name and replace the query-then-put pattern with a single call:

    user = Names.get_or_insert("debasish", hash="badasdbashdbhasbd")
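A runnable sketch of why `get_or_insert` helps: it replaces the query-then-put pattern with one cheap get-by-key, inserting only on a miss. A plain dict stands in for the datastore here (an assumption for illustration); the real `db.Model.get_or_insert(key_name, ...)` does the same thing transactionally.

```python
datastore = {}  # dict stands in for the Names kind, keyed by key_name

def get_or_insert(key_name, **kwargs):
    """Mimic db.Model.get_or_insert: one get-by-key, insert on miss.

    Get-by-key avoids running a query entirely; kwargs are only used
    when the entity does not exist yet (as in the real API).
    """
    entity = datastore.get(key_name)
    if entity is None:
        entity = dict(name=key_name, **kwargs)
        datastore[key_name] = entity
    return entity

first = get_or_insert("debasish", hash="badasdbashdbhasbd")
again = get_or_insert("debasish", hash="would-be-ignored")
print(again["hash"])  # → badasdbashdbhasbd
```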
You should cache only the username = hash pair instead of everything. Also add an in-memory cache (this caches per instance only; it should help even more — just create a dict at global module level). That dict could grow quickly depending on your unique hits, but you can add logic to cap it at a certain size. Here is a sample:
cache = {}  # per-instance cache at module level

def get_user_hash(username):
    if username in cache:
        return cache[username]
    hash = memcache.get(username)
    if not hash:
        user = db.GqlQuery("SELECT * FROM Names WHERE name = :1", username).get()
        if user:
            hash = user.hash
        else:
            # compute_hash() is a stand-in for however the app derives a hash.
            hash = compute_hash(username)
            Names(name=username, hash=hash).put()
    cache[username] = hash
    memcache.set(username, hash)
    return hash
@Faisal's method should work well; it adds two levels of caching in front of the query.
Another option is to store the username and hash in the session. Check the database only once per session, then retrieve the values from the session variables.
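A runnable sketch of the session idea, with a plain dict standing in for whatever session store the app uses (e.g. webapp2 sessions); the stand-in session and the `lookup_hash_in_db` helper are assumptions for illustration.

```python
reads = {"count": 0}  # counts simulated datastore read operations

def lookup_hash_in_db(username):
    """Stand-in for the real datastore lookup (one billed read op)."""
    reads["count"] += 1
    return "hash-of-" + username  # placeholder hash value

def get_hash(session, username):
    """Hit the datastore at most once per session for a given name."""
    key = 'hash:' + username
    if key not in session:
        session[key] = lookup_hash_in_db(username)
    return session[key]

session = {}  # dict stands in for the session object
h1 = get_hash(session, "debasish")
h2 = get_hash(session, "debasish")
print(reads["count"])  # → 1
```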