
How much data can memcached handle efficiently?

How large values can I store in and retrieve from memcached without degrading its performance? I am using memcached with python-memcached in a Django-based web application.
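For context on the size question, one concrete server-side fact: memcached rejects individual items larger than 1 MB by default (the server's `-I` flag raises that limit). A minimal sketch of guarding against oversized values before calling `set()`; the `FakeMC` stand-in, key names, and helper are illustrative assumptions so the sketch runs without a live memcached server, while the `get`/`set` calls mirror python-memcached's client API:

```python
import pickle

# memcached's default per-item limit; values above this are rejected by the
# server unless it was started with a larger -I setting.
MAX_ITEM_SIZE = 1024 * 1024

class FakeMC:
    """Dict-backed stand-in for memcache.Client, so this runs standalone."""
    def __init__(self):
        self._store = {}
    def set(self, key, value, time=0):
        self._store[key] = value
        return True
    def get(self, key):
        return self._store.get(key)

def cache_if_small(mc, key, value, time=0):
    """Skip caching values whose serialized size exceeds the item limit,
    rather than letting set() fail on the server side."""
    if len(pickle.dumps(value)) > MAX_ITEM_SIZE:
        return False
    return mc.set(key, value, time)

mc = FakeMC()
ok_small = cache_if_small(mc, "small", "hello")               # fits: cached
ok_big = cache_if_small(mc, "big", "x" * (2 * 1024 * 1024))   # 2 MB: skipped
```

The design choice here is to skip caching oversized values entirely; storing them would either fail or crowd out many smaller, more useful items.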

Read this one:

https://groups.google.com/forum/?fromgroups=#!topic/memcached/IaMLUeOGxWk

You should not "store" anything in memcached.

Memcached is more or less limited only by the available (free) memory across the servers you run it on. The more memory, the more data fits, and since it uses fairly efficient in-memory indexes, you won't see performance degrade in any significant way as the number of objects grows.

Remember, though, that it is a cache, and there is no guarantee that you will be able to retrieve what you put in. More memory lets memcached keep more data, but there is no guarantee it won't discard data even while memory is still available, if it somehow decides that is a better idea.
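Because memcached may evict anything at any time, every read must tolerate a miss and fall back to the real data source. A minimal sketch of that cache-aside pattern; the `FakeMC` stand-in, key format, and 300-second TTL are illustrative assumptions (not from the original answer), while `get`/`set` mirror python-memcached's client API:

```python
class FakeMC:
    """Dict-backed stand-in for memcache.Client(['127.0.0.1:11211']),
    so the sketch runs without a live memcached server."""
    def __init__(self):
        self._store = {}
    def set(self, key, value, time=0):
        self._store[key] = value
        return True
    def get(self, key):
        return self._store.get(key)  # None on a miss, just like memcached

def get_profile(mc, user_id, load_from_db):
    """Try the cache first; on a miss (never set, or evicted), reload and re-cache."""
    key = "profile:%d" % user_id
    profile = mc.get(key)
    if profile is None:                    # miss: fall back to the database
        profile = load_from_db(user_id)
        mc.set(key, profile, time=300)     # re-cache for five minutes
    return profile

mc = FakeMC()
first = get_profile(mc, 42, lambda uid: {"id": uid})    # miss: loads from "db"
second = get_profile(mc, 42, lambda uid: {"id": -1})    # hit: cached value wins
```

The point is that the application stays correct even if memcached returns `None` for every key; the cache only makes it faster.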

