Serializers

Serializers can be attached to backends in order to serialize/deserialize the data sent to and retrieved from the backend. This lets you apply transformations when you want data to be saved in a specific format in your cache backend. For example, imagine you have a Model and want to serialize it to something that Redis can understand (Redis can’t store Python objects). That is the task of a serializer.

To use a specific serializer:

>>> from aiocache import Cache
>>> from aiocache.serializers import PickleSerializer
>>> cache = Cache(Cache.MEMORY, serializer=PickleSerializer())

Currently the following are built in:

NullSerializer

class aiocache.serializers.NullSerializer(*args, encoding=<object object>, **kwargs)[source]

This serializer does nothing: it stores data as is. It is recommended only for aiocache.SimpleMemoryCache, because with any other backend it will produce incompatible data unless you work exclusively with str values.

DISCLAIMER: Be careful with mutable types and memory storage. The following behavior is considered normal (same as functools.lru_cache):

cache = Cache()
my_list = [1]
await cache.set("key", my_list)
my_list.append(2)
await cache.get("key")  # Will return [1, 2]
dumps(value)[source]

Returns the same value

loads(value)[source]

Returns the same value

StringSerializer

class aiocache.serializers.StringSerializer(*args, encoding=<object object>, **kwargs)[source]

Converts all input values to str, and all return values are also str. Be careful: this means that if you store int(1), you will get back '1'.

The transformation is done by just casting to str in the dumps method.

If you want to keep Python types, use PickleSerializer. JsonSerializer may also be useful to keep the type of simple Python values.

dumps(value)[source]

Serialize the received value by casting it to str.

Parameters:value – obj. Anything that can be cast to str
Returns:str
loads(value)[source]

Returns value back without transformations
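
For illustration, a minimal sketch of the behavior described above, assuming the in-memory backend (the serializer behaves the same with any backend):

import asyncio

from aiocache import Cache
from aiocache.serializers import StringSerializer


async def main():
    cache = Cache(Cache.MEMORY, serializer=StringSerializer())
    await cache.set("key", 1)        # stored as the string "1"
    value = await cache.get("key")
    assert value == "1"              # the original int type is lost
    await cache.close()


if __name__ == "__main__":
    asyncio.run(main())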

PickleSerializer

class aiocache.serializers.PickleSerializer(*args, protocol=3, **kwargs)[source]

Transforms data to bytes using pickle.dumps and retrieves it back with pickle.loads.

dumps(value)[source]

Serialize the received value using pickle.dumps.

Parameters:value – obj
Returns:bytes
loads(value)[source]

Deserialize value using pickle.loads.

Parameters:value – bytes
Returns:obj
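
As a quick illustration (a sketch only, assuming the in-memory backend), any picklable object keeps its Python type on the way back:

import asyncio

from aiocache import Cache
from aiocache.serializers import PickleSerializer


async def main():
    cache = Cache(Cache.MEMORY, serializer=PickleSerializer())
    data = {"numbers": [1, 2, 3], "flag": True}
    await cache.set("key", data)     # stored as pickled bytes
    value = await cache.get("key")
    assert value == data             # Python types are preserved
    await cache.close()


if __name__ == "__main__":
    asyncio.run(main())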

JsonSerializer

class aiocache.serializers.JsonSerializer(*args, encoding=<object object>, **kwargs)[source]

Transforms data to a JSON string with json.dumps and retrieves it back with json.loads. Check https://docs.python.org/3/library/json.html#py-to-json-table for how types are converted.

ujson will be used by default if available. Be careful with the differences between the built-in json module and ujson:

  • ujson dumps supports bytes while json doesn’t
  • ujson and json outputs may differ sometimes
dumps(value)[source]

Serialize the received value using json.dumps.

Parameters:value – dict
Returns:str
loads(value)[source]

Deserialize value using json.loads.

Parameters:value – str
Returns:output of json.loads.
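
A small sketch of the type conversions mentioned above, assuming the in-memory backend; JSON has no tuple type, so tuples come back as lists:

import asyncio

from aiocache import Cache
from aiocache.serializers import JsonSerializer


async def main():
    cache = Cache(Cache.MEMORY, serializer=JsonSerializer())
    await cache.set("key", {"point": (1, 2)})
    value = await cache.get("key")
    assert value == {"point": [1, 2]}    # the tuple became a list via JSON
    await cache.close()


if __name__ == "__main__":
    asyncio.run(main())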

MsgPackSerializer

class aiocache.serializers.MsgPackSerializer(*args, use_list=True, **kwargs)[source]

Transforms data to bytes using msgpack.dumps and retrieves it back with msgpack.loads. You need to have msgpack installed to use this serializer.

Parameters:
  • encoding – str. Can be used to change the encoding param of the msgpack.loads method. Default is utf-8.
  • use_list – bool. Can be used to change use_list param for msgpack.loads method. Default is True.
dumps(value)[source]

Serialize the received value using msgpack.dumps.

Parameters:value – obj
Returns:bytes
loads(value)[source]

Deserialize value using msgpack.loads.

Parameters:value – bytes
Returns:obj
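
For example, a short sketch of the use_list parameter in action (assuming msgpack is installed and using the in-memory backend):

import asyncio

from aiocache import Cache
from aiocache.serializers import MsgPackSerializer


async def main():
    # With use_list=False, msgpack.loads returns tuples instead of lists
    cache = Cache(Cache.MEMORY, serializer=MsgPackSerializer(use_list=False))
    await cache.set("key", [1, 2, 3])
    value = await cache.get("key")
    assert value == (1, 2, 3)
    await cache.close()


if __name__ == "__main__":
    asyncio.run(main())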

If the built-in serializers don’t cover your needs, you can always define a custom serializer, as shown in examples/serializer_class.py:

import asyncio
import zlib

from aiocache import Cache
from aiocache.serializers import BaseSerializer


class CompressionSerializer(BaseSerializer):

    # This is needed because zlib works with bytes.
    # this way the underlying backend knows how to
    # store/retrieve values
    DEFAULT_ENCODING = None

    def dumps(self, value):
        print("I've received:\n{}".format(value))
        compressed = zlib.compress(value.encode())
        print("But I'm storing:\n{}".format(compressed))
        return compressed

    def loads(self, value):
        print("I've retrieved:\n{}".format(value))
        decompressed = zlib.decompress(value).decode()
        print("But I'm returning:\n{}".format(decompressed))
        return decompressed


cache = Cache(Cache.REDIS, serializer=CompressionSerializer(), namespace="main")


async def serializer():
    text = (
        "Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt"
        "ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation"
        "ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in"
        "reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur"
        "sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit"
        "anim id est laborum.")
    await cache.set("key", text)
    print("-----------------------------------")
    real_value = await cache.get("key")
    compressed_value = await cache.raw("get", "main:key")
    assert len(compressed_value) < len(real_value.encode())


async def test_serializer():
    await serializer()
    await cache.delete("key")
    await cache.close()


if __name__ == "__main__":
    asyncio.run(test_serializer())

You can also use marshmallow as your serializer (examples/marshmallow_serializer_class.py):

import random
import string
import asyncio

from marshmallow import fields, Schema, post_load

from aiocache import Cache
from aiocache.serializers import BaseSerializer


class RandomModel:
    MY_CONSTANT = "CONSTANT"

    def __init__(self, int_type=None, str_type=None, dict_type=None, list_type=None):
        self.int_type = int_type or random.randint(1, 10)
        self.str_type = str_type or random.choice(string.ascii_lowercase)
        self.dict_type = dict_type or {}
        self.list_type = list_type or []

    def __eq__(self, obj):
        return self.__dict__ == obj.__dict__


class MarshmallowSerializer(Schema, BaseSerializer):  # type: ignore[misc]
    int_type = fields.Integer()
    str_type = fields.String()
    dict_type = fields.Dict()
    list_type = fields.List(fields.Integer())

    # marshmallow Schema class doesn't play nicely with multiple inheritance and won't call
    # BaseSerializer.__init__
    encoding = 'utf-8'

    @post_load
    def build_my_type(self, data, **kwargs):
        return RandomModel(**data)

    class Meta:
        strict = True


cache = Cache(serializer=MarshmallowSerializer(), namespace="main")


async def serializer():
    model = RandomModel()
    await cache.set("key", model)

    result = await cache.get("key")

    assert result.int_type == model.int_type
    assert result.str_type == model.str_type
    assert result.dict_type == model.dict_type
    assert result.list_type == model.list_type


async def test_serializer():
    await serializer()
    await cache.delete("key")


if __name__ == "__main__":
    asyncio.run(test_serializer())

By default, cache backends assume they are working with str types. If your custom implementation transforms data to bytes, you will need to set the class attribute encoding to None.
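
For instance, a minimal sketch of such a bytes-based serializer (a hypothetical BytesSerializer, following the same DEFAULT_ENCODING = None pattern as the CompressionSerializer example above):

from aiocache.serializers import BaseSerializer


class BytesSerializer(BaseSerializer):
    # None tells the backend not to decode stored values into str
    DEFAULT_ENCODING = None

    def dumps(self, value):
        # Assumes callers always pass bytes
        return value

    def loads(self, value):
        return value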