
Google App Engine: Efficient Datastore Read/Write Operations to Save Quota

I've created a Google App Engine application using Python. The application deals with a lot of usernames.

It has a database of 50K usernames. Each username has a unique hash value, which is also stored in the datastore.

When an app user submits a username, the application first checks whether that username exists in the DB.

If it's a new username, the application calculates a new hash for the name and stores both the name and the hash in the datastore.

If the username already exists in the datastore, it retrieves the old hash.

Sample code:

class Names(db.Model):
    name = db.StringProperty(required=True)
    hash = db.StringProperty(required=True)

username = "debasish"
user_db = db.GqlQuery("SELECT * FROM Names WHERE name=:1", username)
user = user_db.get()
if user is None:
    # doesn't exist in DB, so calculate a new hash for the name and store it
    e = Names(name=username, hash="badasdbashdbhasbdasbdbjasbdjbasjdbasbdbasjdbjasbd")
    e.put()
else:
    #retrieve the old hash.
    self.response.out.write('{"name":"'+user.name+'","hash":"'+user.hash+'"}')            

The problem I'm facing is GAE's free datastore read-operation quota. It gets exceeded too quickly and my application stops working.

I've also tried to implement memcache, like this, adding the entire DB to memcache. But that was also a failure; the result was even worse.

def get_fresh_all(self):
    all_names = db.GqlQuery("SELECT * FROM Names")
    # Note: this caches the lazy GqlQuery object, not its results,
    # so every later access still hits the datastore.
    memcache.add('full_db', all_names, 3600)
    return all_names

So, could you please suggest: am I doing something wrong? How can I make datastore read operations more efficient?

Thanks in advance.

You can:

  • switch to NDB, where caching is automatic
  • query for keys instead of entities: SELECT __key__ FROM ...
  • reduce the related indexes (this surely decreases write ops, and perhaps even read ops)
  • rewrite all your entities with the username as key_name and use the method get_or_insert()
user = Names.get_or_insert("debasish", hash="badasdbashdbhasbd")
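For illustration: get_or_insert() does an atomic get by key name and inserts the entity only on a miss, so the existence check needs no query. A minimal pure-Python sketch of those semantics (the `store` dict stands in for the datastore; this is not GAE API code):

```python
def get_or_insert(store, key_name, **defaults):
    """Mimic db.Model.get_or_insert(): return the existing record for
    key_name, or insert one built from defaults and return it.
    (In GAE this runs in a transaction; a plain dict needs no locking.)"""
    record = store.get(key_name)
    if record is None:
        record = dict(defaults, name=key_name)
        store[key_name] = record
    return record

store = {}
u1 = get_or_insert(store, "debasish", hash="badasdbashdbhasbd")
u2 = get_or_insert(store, "debasish", hash="ignored-on-second-call")
# the second call returns the already-stored record unchanged
```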

You should cache only the username = hash pair instead of everything. Also add an in-memory cache (this is a per-instance-only cache; it should help even more, just create a dict at the global module level). It could grow quickly depending on your unique hits, but you can add logic to hold only a certain number of entries. Here is a sample:

cache = {}  # per-instance, in-memory cache (module-level dict)

def get_user_hash(username):
    if username in cache:
        return cache[username]
    hash = memcache.get(username)
    if hash is None:
        # assumes entities use the username as key_name (see get_or_insert above)
        user = Names.get_by_key_name(username)
        if user is None:
            hash = compute_hash(username)  # compute_hash: your hashing function
            Names(key_name=username, name=username, hash=hash).put()
        else:
            hash = user.hash
        memcache.set(username, hash)
    cache[username] = hash
    return hash
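The "hold only a certain number of entries" idea can be sketched as a small LRU bound on the per-instance dict. This is a pure-Python illustration; `BoundedCache` and `MAX_ENTRIES` are names invented here, not part of any GAE API:

```python
from collections import OrderedDict

MAX_ENTRIES = 10000  # assumed cap; tune to your instance's memory

class BoundedCache(object):
    """Per-instance LRU cache: evicts the least recently used entry
    once max_entries is reached."""

    def __init__(self, max_entries=MAX_ENTRIES):
        self.max_entries = max_entries
        self._data = OrderedDict()

    def get(self, key):
        try:
            value = self._data.pop(key)
        except KeyError:
            return None
        self._data[key] = value  # re-insert as most recently used
        return value

    def put(self, key, value):
        if key in self._data:
            self._data.pop(key)
        elif len(self._data) >= self.max_entries:
            self._data.popitem(last=False)  # evict least recently used
        self._data[key] = value
```

Using this in place of the bare module-level dict keeps instance memory bounded while still serving hot usernames without any datastore or memcache round trip.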

@Faisal's method should work well; it adds two levels of caching to the query.

Another option is to store the username and hash in the session. Check the database only once per session, and then retrieve the values from the session variables.
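As a sketch of that per-session idea, assuming a dict-like session object (as provided by, e.g., webapp2_extras.sessions) and a hypothetical `lookup_hash_in_db` callable standing in for the datastore read:

```python
def get_hash_for_session(session, username, lookup_hash_in_db):
    """Return the user's hash, hitting the datastore at most once per session.

    session           -- dict-like per-user session object
    username          -- the submitted username
    lookup_hash_in_db -- callable(username) -> hash string; a hypothetical
                         stand-in for the real datastore lookup
    """
    key = 'hash:' + username
    if key not in session:
        session[key] = lookup_hash_in_db(username)  # the one DB read
    return session[key]
```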

