Memory Usage Tests for Several Redis Database Design Schemes
I've recently been working on a project that uses Redis as its database. While designing the data structures, I wasn't sure which implementation would be optimal, so I ran some tests. The test environment was as follows:
OS X 10.8.3
Redis 2.6.12
Python 2.7.4
redis-py 2.7.2
hiredis 0.1.1
ujson 1.30
MessagePack 0.3.0
Note:
- Because the tests were run from Python, the results may not fully apply to other languages.
- The test data is specific to this case, so the results may not fully apply to much smaller or larger data.
I won't list the raw test results here; let's go straight to the conclusions.
- The worst storage scheme is using one hash per entity (i.e., one hash per record). It is 1 to 2 times slower than the other schemes and uses relatively more memory.
Worse, every field comes back as a string, so you have to convert the types yourself (see the first sketch after this list).
The only benefit is being able to operate on a single field independently.
- Storing each entity as a string is not recommended either, though it is slightly better than the previous scheme. When individual entities are small, the memory taken up by the keys themselves becomes a noticeable overhead.
- Storing all entities of one type in a single hash (i.e., one hash per table) is simple to implement, and its memory usage is acceptable.
- Storing all entities of one type across multiple hashes (i.e., sharding the table) is slightly more complex to implement, but uses the least memory (see the sharding sketch after this list).
When individual field values are small (the default threshold is 64 bytes) and each hash holds only a moderate number of fields (the default threshold is 512), Redis stores the hash in its compact zipmap/ziplist encoding, which reduces memory usage significantly.
The number of fields per hash should preferably be a power of two, e.g. 1024. Exceeding that value even slightly increases both memory usage and latency.
Instagram's engineers consider roughly 1000 fields per hash optimal when the zipmap encoding is used. In my tests, however, operations mostly just got slower as the field count grew, while the change in memory usage from 128 up to 1024 fields was basically negligible.
- Storing entities as JSON is a good choice. For content containing Chinese, setting ensure_ascii=False saves a lot of memory (see the size comparison after this list).
ujson performs much better than json; the latter slows down dramatically once ensure_ascii=False is set.
- cPickle performs worse than ujson, but it supports more types (such as datetime).
- MessagePack has a slight, barely noticeable performance edge over ujson, but you lose readability, and the unicode values you get back have to be decoded yourself (see the msgpack sketch after this list).
Its claim of being 4x faster than Protocol Buffers can probably be ignored; at least its Python library shows no obvious advantage.
- Compressing with zlib saves even more memory, but it is 1 to 2 times slower.
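
To make the type-conversion point concrete, here is a minimal sketch, assuming a local Redis instance and redis-py; the key name and fields are purely illustrative. HGETALL returns every field as a string, so numbers and booleans have to be converted back by hand:

# -*- coding: utf-8 -*-
# Minimal sketch: one hash per entity; every field comes back as a string.
# Assumes a local Redis instance; 'article:1' and its fields are illustrative.
import redis

client = redis.Redis()
client.hmset('article:1', {'id': 1, 'is_public': True, 'title': u'标题'})

raw = client.hgetall('article:1')
print repr(raw['id'])         # '1'    -- a string, not an int
print repr(raw['is_public'])  # 'True' -- a string, not a bool
article = {                   # manual type conversion
    'id': int(raw['id']),
    'is_public': raw['is_public'] == 'True',
    'title': raw['title'].decode('utf-8'),
}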
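
The sharding scheme itself is simple: bucket the entities by id so that each hash stays small. Below is a minimal sketch under a few assumptions: a local Redis instance, a shard size of 1024, and hypothetical helper names save_article/load_article. Whether the compact encoding actually applies also depends on hash-max-ziplist-value (hash-max-zipmap-value on older versions); the serialized articles are far longer than the 64-byte default, so that threshold would have to be raised as well.

# -*- coding: utf-8 -*-
# Minimal sketch of sharding one "table" across many small hashes so each hash
# can stay within Redis's compact (zipmap/ziplist) hash encoding.
# Assumes a local Redis instance; SHARD_SIZE and the helper names are illustrative.
import redis
import ujson

client = redis.Redis()

# The compact encoding only applies while a hash stays below these thresholds.
# On Redis 2.6 they are hash-max-ziplist-entries / hash-max-ziplist-value
# (hash-max-zipmap-* on older versions); adjust to taste:
client.config_set('hash-max-ziplist-entries', 1024)
client.config_set('hash-max-ziplist-value', 4096)  # serialized articles exceed the 64-byte default

SHARD_SIZE = 1024  # a power of two, per the conclusion above

def save_article(article):
    article_id = article['id']
    key = 'article:%d' % (article_id / SHARD_SIZE)   # which shard (integer division)
    field = article_id % SHARD_SIZE                  # slot inside that shard
    client.hset(key, field, ujson.dumps(article, ensure_ascii=False))

def load_article(article_id):
    key = 'article:%d' % (article_id / SHARD_SIZE)
    raw = client.hget(key, article_id % SHARD_SIZE)
    return ujson.loads(raw) if raw is not None else None

save_article({'id': 12345, 'title': u'标题', 'is_public': True})
print load_article(12345)['is_public']   # True -- types survive the JSON round trip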
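
The saving from ensure_ascii=False is easy to see from the serialized sizes: with the default escaping, every Chinese character becomes a 6-byte \uXXXX sequence, versus 3 bytes in UTF-8. A quick sketch with the standard library json (the sample string is illustrative):

# -*- coding: utf-8 -*-
# Compare serialized sizes with and without ensure_ascii.
import json

text = u'团结全世界正义力量'   # 9 Chinese characters

escaped = json.dumps({'title': text})                    # default: \uXXXX escapes, ASCII only
raw = json.dumps({'title': text}, ensure_ascii=False)    # keeps the characters as-is
if isinstance(raw, unicode):                             # json may return unicode here
    raw = raw.encode('utf-8')
print len(escaped), len(raw)   # 67 vs 40 bytes for this sample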
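
And the msgpack point about decoding: with the msgpack-python version used here, unpacking returns UTF-8 byte strings rather than unicode objects, so a manual decode is needed. A small sketch:

# -*- coding: utf-8 -*-
# msgpack round trip: unicode goes in, UTF-8 bytes come back out.
import msgpack

packed = msgpack.packb({'title': u'标题'})
unpacked = msgpack.unpackb(packed)
print repr(unpacked['title'])                  # '\xe6\xa0\x87\xe9\xa2\x98' (a byte string)
print repr(unpacked['title'].decode('utf-8'))  # u'\u6807\u9898', i.e. u'标题'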
Finally, here is the test code:
# -*- coding: utf-8 -*-
import cPickle
import json
import time
import zlib

import msgpack
import redis
import ujson


class Timer:
    """Context manager that records the wall-clock time of a block."""

    def __enter__(self):
        self.start = time.time()
        return self

    def __exit__(self, *args):
        self.end = time.time()
        self.interval = self.end - self.start


def test(function):
    """Decorator: flush Redis, run the test, then report time and memory usage."""

    def wrapper(*args, **kwargs):
        args_list = []
        if args:
            args_list.append(','.join((str(arg) for arg in args)))
        if kwargs:
            args_list.append(','.join('%s=%s' % (key, value)
                                      for key, value in kwargs.iteritems()))
        print 'call %s(%s):' % (function.func_name, ', '.join(args_list))
        redis_client.flushall()
        print 'memory:', redis_client.info()['used_memory_human']
        with Timer() as timer:
            result = function(*args, **kwargs)
        print 'time:', timer.interval
        print 'memory:', redis_client.info()['used_memory_human']
        print
        return result

    return wrapper


redis_client = redis.Redis()
pipe = redis_client.pipeline(transaction=False)

# 10000 sample articles that differ only in their id.
articles = [{
    'id': i,
    'title': u'团结全世界正义力量痛击日本',
    'content': u'近期日本社会有四种感觉极度高涨,即二战期间日本军国主义扩张战争的惨败在日本右翼势力内心留下的耻辱感;被美国长期占领和控制的压抑感;经济长期停滞不前的焦虑感;对中国快速崛起引发的失落感。为此,日本为了找到一个发泄口,对中国采取了一系列挑衅行为,我们不能听之任之。现在全国13亿人要万众一心,团结起来,拿出决心、意志和能力,果断实施对等反击。在这场反击日本右翼势力的反攻倒算中,中国不是孤立的,我们要团结全世界一切反法西斯战争的正义力量,痛击日本对国际正义的挑战。',
    'source_text': u'环球时报',
    'source_url': 'http://opinion.huanqiu.com/column/mjzl/2012-09/3174337.html',
    'time': '2012-09-13 09:23',
    'is_public': True
} for i in xrange(10000)]


@test
def test_hash():
    # One hash per entity.
    for article in articles:
        pipe.hmset('article:%d' % article['id'], article)
    pipe.execute()


@test
def test_json_hash():
    # One hash per type, values serialized with json.
    for article in articles:
        pipe.hset('article', article['id'], json.dumps(article))
    pipe.execute()


@test
def test_ujson_hash():
    for article in articles:
        pipe.hset('article', article['id'], ujson.dumps(article))
    pipe.execute()


@test
def test_ujson_string():
    # One string key per entity.
    for article in articles:
        pipe.set('article:%d' % article['id'], ujson.dumps(article))
    pipe.execute()


@test
def test_zlib_ujson_string():
    for article in articles:
        pipe.set('article:%d' % article['id'],
                 zlib.compress(ujson.dumps(article, ensure_ascii=False)))
    pipe.execute()


@test
def test_msgpack():
    for article in articles:
        pipe.hset('article', article['id'], msgpack.packb(article))
    pipe.execute()


@test
def test_pickle_string():
    for article in articles:
        pipe.set('article:%d' % article['id'], cPickle.dumps(article))
    pipe.execute()


@test
def test_json_without_ensure_ascii():
    for article in articles:
        pipe.hset('article', article['id'], json.dumps(article, ensure_ascii=False))
    pipe.execute()


@test
def test_ujson_without_ensure_ascii():
    for article in articles:
        pipe.hset('article', article['id'], ujson.dumps(article, ensure_ascii=False))
    pipe.execute()


def test_ujson_shard_id():
    # Shard the entities across multiple hashes of `size` fields each.
    @test
    def test_ujson_shard_id_of_size(size):
        for article in articles:
            article_id = article['id']
            pipe.hset('article:%d' % (article_id / size), article_id % size,
                      ujson.dumps(article, ensure_ascii=False))
        pipe.execute()

    for size in (2, 4, 8, 10, 16, 32, 64, 100, 128, 256, 500, 512, 513,
                 1000, 1024, 1025, 2048, 4096, 8092):
        test_ujson_shard_id_of_size(size)
    test_ujson_shard_id_of_size(512)


# Run every test_* function defined above, in alphabetical order.
for key, value in sorted(globals().copy().iteritems(), key=lambda x: x[0]):
    if key.startswith('test_'):
        value()
Original article: Memory Usage Tests for Several Redis Database Design Schemes. Thanks to the original author for sharing.
