
cachelib's Introduction

CacheLib

A collection of cache libraries in the same API interface. Extracted from Werkzeug.

Installing

Install and update using pip:

$ pip install -U cachelib

Donate

The Pallets organization develops and supports Flask and the libraries it uses. In order to grow the community of contributors and users, and allow the maintainers to devote more time to the projects, please donate today.

cachelib's People

Contributors

andrewm89, davidism, dependabot[bot], esadek, etiennepelletier, hkcomori, jayvdb, lepture, northernsage, pablogamboa, pre-commit-ci[bot], snjypl, wangsha

cachelib's Issues

Better error message when passing None as key for FileSystemCache

Accidentally passing None as the key results in the rather mysterious error message
"UnboundLocalError: local variable 'bkey_hash' referenced before assignment"
because the type check in _get_filename has no else branch.

A better way to handle this would be to raise an exception such as "Key must be a string, received type [whichever type was received]" or something similar. Thank you!
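
A minimal sketch of the suggested check, assuming a _get_filename-style helper (the class shell and md5 hashing here are illustrative, not the actual cachelib code):

import hashlib
import os


class FileSystemCacheSketch:
    def __init__(self, cache_dir: str) -> None:
        self._path = cache_dir

    def _get_filename(self, key) -> str:
        # Proposed: fail loudly with a clear message instead of letting the
        # missing else branch surface later as an UnboundLocalError.
        if not isinstance(key, str):
            raise TypeError(
                f"Key must be a string, received type {type(key).__name__}"
            )
        bkey_hash = hashlib.md5(key.encode("utf-8")).hexdigest()
        return os.path.join(self._path, bkey_hash)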

python-cachelib-0.13.0 fails to build with Python 3.13: pytest.PytestUnraisableExceptionWarning: Exception ignored in PyMapping_HasKeyString(); consider using PyMapping_HasKeyStringWithError(), PyMapping_GetOptionalItemString() or PyMapping_GetItemString(): None

Some pytest output
+ /usr/bin/pytest -v -r s -k 'not Uwsgi and not DynamoDb and not MongoDb'
============================= test session starts ==============================
platform linux -- Python 3.13.0b2, pytest-7.4.3, pluggy-1.3.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /builddir/build/BUILD/python-cachelib-0.13.0-build/cachelib-0.13.0
configfile: setup.cfg
testpaths: tests
plugins: xprocess-1.0.2
collecting ... collected 160 items / 22 deselected / 1 skipped / 138 selected

tests/test_base_cache.py::TestBaseCache::test_get PASSED                 [  0%]
tests/test_base_cache.py::TestBaseCache::test_delete PASSED              [  1%]
tests/test_base_cache.py::TestBaseCache::test_get_many PASSED            [  2%]
tests/test_base_cache.py::TestBaseCache::test_get_dict PASSED            [  2%]
tests/test_base_cache.py::TestBaseCache::test_set PASSED                 [  3%]
tests/test_base_cache.py::TestBaseCache::test_add PASSED                 [  4%]
tests/test_base_cache.py::TestBaseCache::test_set_many PASSED            [  5%]
tests/test_base_cache.py::TestBaseCache::test_delete_many PASSED         [  5%]
tests/test_base_cache.py::TestBaseCache::test_has PASSED                 [  6%]
tests/test_base_cache.py::TestBaseCache::test_clear PASSED               [  7%]
tests/test_base_cache.py::TestBaseCache::test_inc PASSED                 [  7%]
tests/test_base_cache.py::TestBaseCache::test_dec PASSED                 [  8%]
tests/test_file_system_cache.py::TestFileSystemCache::test_has[FileSystemCache] PASSED [  9%]
tests/test_file_system_cache.py::TestFileSystemCache::test_has[CustomSerializerCache] PASSED [ 10%]
tests/test_file_system_cache.py::TestFileSystemCache::test_has[CustomHashingMethodCache] PASSED [ 10%]
tests/test_file_system_cache.py::TestFileSystemCache::test_has[CustomDefaultHashingMethodCache] PASSED [ 11%]
tests/test_file_system_cache.py::TestFileSystemCache::test_clear[FileSystemCache] PASSED [ 12%]
tests/test_file_system_cache.py::TestFileSystemCache::test_clear[CustomSerializerCache] PASSED [ 13%]
tests/test_file_system_cache.py::TestFileSystemCache::test_clear[CustomHashingMethodCache] PASSED [ 13%]
tests/test_file_system_cache.py::TestFileSystemCache::test_clear[CustomDefaultHashingMethodCache] PASSED [ 14%]
tests/test_file_system_cache.py::TestFileSystemCache::test_set_get[FileSystemCache] PASSED [ 15%]
tests/test_file_system_cache.py::TestFileSystemCache::test_set_get[CustomSerializerCache] PASSED [ 15%]
tests/test_file_system_cache.py::TestFileSystemCache::test_set_get[CustomHashingMethodCache] PASSED [ 16%]
tests/test_file_system_cache.py::TestFileSystemCache::test_set_get[CustomDefaultHashingMethodCache] PASSED [ 17%]
tests/test_file_system_cache.py::TestFileSystemCache::test_set_get_many[FileSystemCache] PASSED [ 18%]
tests/test_file_system_cache.py::TestFileSystemCache::test_set_get_many[CustomSerializerCache] PASSED [ 18%]
tests/test_file_system_cache.py::TestFileSystemCache::test_set_get_many[CustomHashingMethodCache] PASSED [ 19%]
tests/test_file_system_cache.py::TestFileSystemCache::test_set_get_many[CustomDefaultHashingMethodCache] PASSED [ 20%]
tests/test_file_system_cache.py::TestFileSystemCache::test_get_dict[FileSystemCache] PASSED [ 21%]
tests/test_file_system_cache.py::TestFileSystemCache::test_get_dict[CustomSerializerCache] PASSED [ 21%]
tests/test_file_system_cache.py::TestFileSystemCache::test_get_dict[CustomHashingMethodCache] PASSED [ 22%]
tests/test_file_system_cache.py::TestFileSystemCache::test_get_dict[CustomDefaultHashingMethodCache] PASSED [ 23%]
tests/test_file_system_cache.py::TestFileSystemCache::test_delete[FileSystemCache] PASSED [ 23%]
tests/test_file_system_cache.py::TestFileSystemCache::test_delete[CustomSerializerCache] PASSED [ 24%]
tests/test_file_system_cache.py::TestFileSystemCache::test_delete[CustomHashingMethodCache] PASSED [ 25%]
tests/test_file_system_cache.py::TestFileSystemCache::test_delete[CustomDefaultHashingMethodCache] PASSED [ 26%]
tests/test_file_system_cache.py::TestFileSystemCache::test_delete_many[FileSystemCache] PASSED [ 26%]
tests/test_file_system_cache.py::TestFileSystemCache::test_delete_many[CustomSerializerCache] PASSED [ 27%]
tests/test_file_system_cache.py::TestFileSystemCache::test_delete_many[CustomHashingMethodCache] PASSED [ 28%]
tests/test_file_system_cache.py::TestFileSystemCache::test_delete_many[CustomDefaultHashingMethodCache] PASSED [ 28%]
tests/test_file_system_cache.py::TestFileSystemCache::test_delete_many_ignore_errors[FileSystemCache] PASSED [ 29%]
tests/test_file_system_cache.py::TestFileSystemCache::test_delete_many_ignore_errors[CustomSerializerCache] PASSED [ 30%]
tests/test_file_system_cache.py::TestFileSystemCache::test_delete_many_ignore_errors[CustomHashingMethodCache] PASSED [ 31%]
tests/test_file_system_cache.py::TestFileSystemCache::test_delete_many_ignore_errors[CustomDefaultHashingMethodCache] PASSED [ 31%]
tests/test_file_system_cache.py::TestFileSystemCache::test_add[FileSystemCache] PASSED [ 32%]
tests/test_file_system_cache.py::TestFileSystemCache::test_add[CustomSerializerCache] PASSED [ 33%]
tests/test_file_system_cache.py::TestFileSystemCache::test_add[CustomHashingMethodCache] PASSED [ 34%]
tests/test_file_system_cache.py::TestFileSystemCache::test_add[CustomDefaultHashingMethodCache] PASSED [ 34%]
tests/test_file_system_cache.py::TestFileSystemCache::test_inc_dec[FileSystemCache] PASSED [ 35%]
tests/test_file_system_cache.py::TestFileSystemCache::test_inc_dec[CustomSerializerCache] PASSED [ 36%]
tests/test_file_system_cache.py::TestFileSystemCache::test_inc_dec[CustomHashingMethodCache] PASSED [ 36%]
tests/test_file_system_cache.py::TestFileSystemCache::test_inc_dec[CustomDefaultHashingMethodCache] PASSED [ 37%]
tests/test_file_system_cache.py::TestFileSystemCache::test_expiration[FileSystemCache] PASSED [ 38%]
tests/test_file_system_cache.py::TestFileSystemCache::test_expiration[CustomSerializerCache] PASSED [ 39%]
tests/test_file_system_cache.py::TestFileSystemCache::test_expiration[CustomHashingMethodCache] PASSED [ 39%]
tests/test_file_system_cache.py::TestFileSystemCache::test_expiration[CustomDefaultHashingMethodCache] PASSED [ 40%]
tests/test_file_system_cache.py::TestFileSystemCache::test_EOFError[FileSystemCache] PASSED [ 41%]
tests/test_file_system_cache.py::TestFileSystemCache::test_EOFError[CustomSerializerCache] PASSED [ 42%]
tests/test_file_system_cache.py::TestFileSystemCache::test_EOFError[CustomHashingMethodCache] PASSED [ 42%]
tests/test_file_system_cache.py::TestFileSystemCache::test_EOFError[CustomDefaultHashingMethodCache] PASSED [ 43%]
tests/test_file_system_cache.py::TestFileSystemCache::test_threshold[FileSystemCache] PASSED [ 44%]
tests/test_file_system_cache.py::TestFileSystemCache::test_threshold[CustomSerializerCache] PASSED [ 44%]
tests/test_file_system_cache.py::TestFileSystemCache::test_threshold[CustomHashingMethodCache] PASSED [ 45%]
tests/test_file_system_cache.py::TestFileSystemCache::test_threshold[CustomDefaultHashingMethodCache] PASSED [ 46%]
tests/test_file_system_cache.py::TestFileSystemCache::test_file_counting[FileSystemCache] PASSED [ 47%]
tests/test_file_system_cache.py::TestFileSystemCache::test_file_counting[CustomSerializerCache] PASSED [ 47%]
tests/test_file_system_cache.py::TestFileSystemCache::test_file_counting[CustomHashingMethodCache] PASSED [ 48%]
tests/test_file_system_cache.py::TestFileSystemCache::test_file_counting[CustomDefaultHashingMethodCache] PASSED [ 49%]
tests/test_file_system_cache.py::TestFileSystemCache::test_file_counting_on_override[FileSystemCache] PASSED [ 50%]
tests/test_file_system_cache.py::TestFileSystemCache::test_file_counting_on_override[CustomSerializerCache] PASSED [ 50%]
tests/test_file_system_cache.py::TestFileSystemCache::test_file_counting_on_override[CustomHashingMethodCache] PASSED [ 51%]
tests/test_file_system_cache.py::TestFileSystemCache::test_file_counting_on_override[CustomDefaultHashingMethodCache] PASSED [ 52%]
tests/test_file_system_cache.py::TestFileSystemCache::test_prune_old_entries[FileSystemCache] PASSED [ 52%]
tests/test_file_system_cache.py::TestFileSystemCache::test_prune_old_entries[CustomSerializerCache] PASSED [ 53%]
tests/test_file_system_cache.py::TestFileSystemCache::test_prune_old_entries[CustomHashingMethodCache] PASSED [ 54%]
tests/test_file_system_cache.py::TestFileSystemCache::test_prune_old_entries[CustomDefaultHashingMethodCache] PASSED [ 55%]
tests/test_interface_uniformity.py::TestInterfaceUniformity::test_types_have_all_base_methods ERROR [ 55%]
tests/test_memcached_cache.py::TestMemcachedCache::test_has FAILED       [ 56%]
tests/test_memcached_cache.py::TestMemcachedCache::test_clear FAILED     [ 57%]
tests/test_memcached_cache.py::TestMemcachedCache::test_set_get FAILED   [ 57%]
tests/test_memcached_cache.py::TestMemcachedCache::test_set_get_many FAILED [ 58%]
tests/test_memcached_cache.py::TestMemcachedCache::test_get_dict FAILED  [ 59%]
tests/test_memcached_cache.py::TestMemcachedCache::test_delete FAILED    [ 60%]
tests/test_memcached_cache.py::TestMemcachedCache::test_delete_many FAILED [ 60%]
tests/test_memcached_cache.py::TestMemcachedCache::test_delete_many_ignore_errors FAILED [ 61%]
tests/test_memcached_cache.py::TestMemcachedCache::test_add FAILED       [ 62%]
tests/test_memcached_cache.py::TestMemcachedCache::test_inc_dec FAILED   [ 63%]
tests/test_memcached_cache.py::TestMemcachedCache::test_expiration FAILED [ 63%]
tests/test_redis_cache.py::TestRedisCache::test_has[RedisCache] PASSED   [ 64%]
tests/test_redis_cache.py::TestRedisCache::test_has[CustomCache] PASSED  [ 65%]
tests/test_redis_cache.py::TestRedisCache::test_clear[RedisCache] PASSED [ 65%]
tests/test_redis_cache.py::TestRedisCache::test_clear[CustomCache] PASSED [ 66%]
tests/test_redis_cache.py::TestRedisCache::test_set_get[RedisCache] PASSED [ 67%]
tests/test_redis_cache.py::TestRedisCache::test_set_get[CustomCache] PASSED [ 68%]
tests/test_redis_cache.py::TestRedisCache::test_set_get_many[RedisCache] PASSED [ 68%]
tests/test_redis_cache.py::TestRedisCache::test_set_get_many[CustomCache] PASSED [ 69%]
tests/test_redis_cache.py::TestRedisCache::test_get_dict[RedisCache] PASSED [ 70%]
tests/test_redis_cache.py::TestRedisCache::test_get_dict[CustomCache] PASSED [ 71%]
tests/test_redis_cache.py::TestRedisCache::test_delete[RedisCache] PASSED [ 71%]
tests/test_redis_cache.py::TestRedisCache::test_delete[CustomCache] PASSED [ 72%]
tests/test_redis_cache.py::TestRedisCache::test_delete_many[RedisCache] PASSED [ 73%]
tests/test_redis_cache.py::TestRedisCache::test_delete_many[CustomCache] PASSED [ 73%]
tests/test_redis_cache.py::TestRedisCache::test_delete_many_ignore_errors[RedisCache] PASSED [ 74%]
tests/test_redis_cache.py::TestRedisCache::test_delete_many_ignore_errors[CustomCache] PASSED [ 75%]
tests/test_redis_cache.py::TestRedisCache::test_add[RedisCache] PASSED   [ 76%]
tests/test_redis_cache.py::TestRedisCache::test_add[CustomCache] PASSED  [ 76%]
tests/test_redis_cache.py::TestRedisCache::test_inc_dec[RedisCache] PASSED [ 77%]
tests/test_redis_cache.py::TestRedisCache::test_inc_dec[CustomCache] PASSED [ 78%]
tests/test_redis_cache.py::TestRedisCache::test_expiration[RedisCache] PASSED [ 78%]
tests/test_redis_cache.py::TestRedisCache::test_expiration[CustomCache] PASSED [ 79%]
tests/test_redis_cache.py::TestRedisCache::test_callable_key[RedisCache] PASSED [ 80%]
tests/test_redis_cache.py::TestRedisCache::test_callable_key[CustomCache] PASSED [ 81%]
tests/test_simple_cache.py::TestSimpleCache::test_clear[SimpleCache] PASSED [ 81%]
tests/test_simple_cache.py::TestSimpleCache::test_clear[CustomCache] PASSED [ 82%]
tests/test_simple_cache.py::TestSimpleCache::test_has[SimpleCache] PASSED [ 83%]
tests/test_simple_cache.py::TestSimpleCache::test_has[CustomCache] PASSED [ 84%]
tests/test_simple_cache.py::TestSimpleCache::test_set_get[SimpleCache] PASSED [ 84%]
tests/test_simple_cache.py::TestSimpleCache::test_set_get[CustomCache] PASSED [ 85%]
tests/test_simple_cache.py::TestSimpleCache::test_set_get_many[SimpleCache] PASSED [ 86%]
tests/test_simple_cache.py::TestSimpleCache::test_set_get_many[CustomCache] PASSED [ 86%]
tests/test_simple_cache.py::TestSimpleCache::test_get_dict[SimpleCache] PASSED [ 87%]
tests/test_simple_cache.py::TestSimpleCache::test_get_dict[CustomCache] PASSED [ 88%]
tests/test_simple_cache.py::TestSimpleCache::test_delete[SimpleCache] PASSED [ 89%]
tests/test_simple_cache.py::TestSimpleCache::test_delete[CustomCache] PASSED [ 89%]
tests/test_simple_cache.py::TestSimpleCache::test_delete_many[SimpleCache] PASSED [ 90%]
tests/test_simple_cache.py::TestSimpleCache::test_delete_many[CustomCache] PASSED [ 91%]
tests/test_simple_cache.py::TestSimpleCache::test_delete_many_ignore_errors[SimpleCache] PASSED [ 92%]
tests/test_simple_cache.py::TestSimpleCache::test_delete_many_ignore_errors[CustomCache] PASSED [ 92%]
tests/test_simple_cache.py::TestSimpleCache::test_add[SimpleCache] PASSED [ 93%]
tests/test_simple_cache.py::TestSimpleCache::test_add[CustomCache] PASSED [ 94%]
tests/test_simple_cache.py::TestSimpleCache::test_inc_dec[SimpleCache] PASSED [ 94%]
tests/test_simple_cache.py::TestSimpleCache::test_inc_dec[CustomCache] PASSED [ 95%]
tests/test_simple_cache.py::TestSimpleCache::test_expiration[SimpleCache] PASSED [ 96%]
tests/test_simple_cache.py::TestSimpleCache::test_expiration[CustomCache] PASSED [ 97%]
tests/test_simple_cache.py::TestSimpleCache::test_threshold[SimpleCache] PASSED [ 97%]
tests/test_simple_cache.py::TestSimpleCache::test_threshold[CustomCache] PASSED [ 98%]
tests/test_simple_cache.py::TestSimpleCache::test_prune_old_entries[SimpleCache] PASSED [ 99%]
tests/test_simple_cache.py::TestSimpleCache::test_prune_old_entries[CustomCache] PASSED [100%]

==================================== ERRORS ====================================
__ ERROR at setup of TestInterfaceUniformity.test_types_have_all_base_methods __

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7fb3569d1e40>
when = 'setup'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        """Call func, wrapping the result in a CallInfo.
    
        :param func:
            The function to call. Called without arguments.
        :param when:
            The phase in which the function is called.
        :param reraise:
            Exception or exceptions that shall propagate if raised by the
            function, instead of being wrapped in the CallInfo.
        """
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.13/site-packages/_pytest/runner.py:341: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/site-packages/_pytest/runner.py:262: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.13/site-packages/pluggy/_hooks.py:493: in __call__
    return self._hookexec(self.name, self._hookimpls, kwargs, firstresult)
/usr/lib/python3.13/site-packages/pluggy/_manager.py:115: in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
/usr/lib/python3.13/site-packages/_pytest/unraisableexception.py:83: in pytest_runtest_setup
    yield from unraisable_exception_runtest_hook()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

    def unraisable_exception_runtest_hook() -> Generator[None, None, None]:
        with catch_unraisable_exception() as cm:
            yield
            if cm.unraisable:
                if cm.unraisable.err_msg is not None:
                    err_msg = cm.unraisable.err_msg
                else:
                    err_msg = "Exception ignored in"
                msg = f"{err_msg}: {cm.unraisable.object!r}\n\n"
                msg += "".join(
                    traceback.format_exception(
                        cm.unraisable.exc_type,
                        cm.unraisable.exc_value,
                        cm.unraisable.exc_traceback,
                    )
                )
>               warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E               pytest.PytestUnraisableExceptionWarning: Exception ignored in PyMapping_HasKeyString(); consider using PyMapping_HasKeyStringWithError(), PyMapping_GetOptionalItemString() or PyMapping_GetItemString(): None
E               
E               Traceback (most recent call last):
E                 File "/usr/lib64/python3.13/site-packages/pylibmc/client.py", line 142, in __init__
E                   super().__init__(servers=translate_server_specs(servers),
E                   ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                                    binary=binary,
E                                    ^^^^^^^^^^^^^^
E                                    username=username, password=password,
E                                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                                    behaviors=_behaviors_numeric(behaviors))
E                                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E               TypeError: 'NoneType' object is not subscriptable

/usr/lib/python3.13/site-packages/_pytest/unraisableexception.py:78: PytestUnraisableExceptionWarning
---------------------------- Captured stdout setup -----------------------------
/builddir/build/BUILD/python-cachelib-0.13.0-build/cachelib-0.13.0/.pytest_cache/d/.xprocess/redis$ redis-server --port 6360
process 'redis' started pid=292
292:C 06 Jun 2024 21:25:21.994 # WARNING Memory overcommit must be enabled! Without it, a background save or replication may fail under low memory condition. Being disabled, it can also cause failures without low memory condition, see https://github.com/jemalloc/jemalloc/issues/1328. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.

292:C 06 Jun 2024 21:25:21.994 * oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo

292:C 06 Jun 2024 21:25:21.994 * Redis version=7.2.5, bits=64, commit=00000000, modified=0, pid=292, just started

292:C 06 Jun 2024 21:25:21.994 * Configuration loaded

292:M 06 Jun 2024 21:25:21.995 * monotonic clock: POSIX clock_gettime

292:M 06 Jun 2024 21:25:21.995 * Running mode=standalone, port=6360.

292:M 06 Jun 2024 21:25:21.995 * Server initialized

292:M 06 Jun 2024 21:25:21.995 * Ready to accept connections tcp

redis process startup detected
/builddir/build/BUILD/python-cachelib-0.13.0-build/cachelib-0.13.0/.pytest_cache/d/.xprocess/pylibmc$ memcached -vv
process 'pylibmc' started pid=298
slab class   1: chunk size        96 perslab   10922

slab class   2: chunk size       120 perslab    8738

slab class   3: chunk size       152 perslab    6898

slab class   4: chunk size       192 perslab    5461

slab class   5: chunk size       240 perslab    4369

slab class   6: chunk size       304 perslab    3449

slab class   7: chunk size       384 perslab    2730

slab class   8: chunk size       480 perslab    2184

slab class   9: chunk size       600 perslab    1747

slab class  10: chunk size       752 perslab    1394

slab class  11: chunk size       944 perslab    1110

slab class  12: chunk size      1184 perslab     885

slab class  13: chunk size      1480 perslab     708

slab class  14: chunk size      1856 perslab     564

slab class  15: chunk size      2320 perslab     451

slab class  16: chunk size      2904 perslab     361

slab class  17: chunk size      3632 perslab     288

slab class  18: chunk size      4544 perslab     230

slab class  19: chunk size      5680 perslab     184

slab class  20: chunk size      7104 perslab     147

slab class  21: chunk size      8880 perslab     118

slab class  22: chunk size     11104 perslab      94

slab class  23: chunk size     13880 perslab      75

slab class  24: chunk size     17352 perslab      60

slab class  25: chunk size     21696 perslab      48

slab class  26: chunk size     27120 perslab      38

slab class  27: chunk size     33904 perslab      30

slab class  28: chunk size     42384 perslab      24

slab class  29: chunk size     52984 perslab      19

slab class  30: chunk size     66232 perslab      15

slab class  31: chunk size     82792 perslab      12

slab class  32: chunk size    103496 perslab      10

slab class  33: chunk size    129376 perslab       8

slab class  34: chunk size    161720 perslab       6

slab class  35: chunk size    202152 perslab       5

slab class  36: chunk size    252696 perslab       4

slab class  37: chunk size    315872 perslab       3

slab class  38: chunk size    394840 perslab       2

slab class  39: chunk size    524288 perslab       2

<26 server listening (auto-negotiate)

pylibmc process startup detected
=================================== FAILURES ===================================
_________________________ TestMemcachedCache.test_has __________________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7fb35634b920>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], TResult]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: Optional[
            Union[Type[BaseException], Tuple[Type[BaseException], ...]]
        ] = None,
    ) -> "CallInfo[TResult]":
        """Call func, wrapping the result in a CallInfo.
    
        :param func:
            The function to call. Called without arguments.
        :param when:
            The phase in which the function is called.
        :param reraise:
            Exception or exceptions that shall propagate if raised by the
            function, instead of being wrapped in the CallInfo.
        """
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result: Optional[TResult] = func()

/usr/lib/python3.13/site-packages/_pytest/runner.py:341: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/site-packages/_pytest/runner.py:262: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/usr/lib/python3.13/site-packages/pluggy/_hooks.py:493: in __call__
    return self._hookexec(self.name, self._hookimpls, kwargs, firstresult)
/usr/lib/python3.13/site-packages/pluggy/_manager.py:115: in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
/usr/lib/python3.13/site-packages/_pytest/unraisableexception.py:88: in pytest_runtest_call
    yield from unraisable_exception_runtest_hook()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

    def unraisable_exception_runtest_hook() -> Generator[None, None, None]:
        with catch_unraisable_exception() as cm:
            yield
            if cm.unraisable:
                if cm.unraisable.err_msg is not None:
                    err_msg = cm.unraisable.err_msg
                else:
                    err_msg = "Exception ignored in"
                msg = f"{err_msg}: {cm.unraisable.object!r}\n\n"
                msg += "".join(
                    traceback.format_exception(
                        cm.unraisable.exc_type,
                        cm.unraisable.exc_value,
                        cm.unraisable.exc_traceback,
                    )
                )
>               warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
E               pytest.PytestUnraisableExceptionWarning: Exception ignored in PyMapping_HasKeyString(); consider using PyMapping_HasKeyStringWithError(), PyMapping_GetOptionalItemString() or PyMapping_GetItemString(): None
E               
E               Traceback (most recent call last):
E                 File "/usr/lib64/python3.13/site-packages/pylibmc/client.py", line 142, in __init__
E                   super().__init__(servers=translate_server_specs(servers),
E                   ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                                    binary=binary,
E                                    ^^^^^^^^^^^^^^
E                                    username=username, password=password,
E                                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                                    behaviors=_behaviors_numeric(behaviors))
E                                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E               TypeError: 'NoneType' object is not subscriptable

/usr/lib/python3.13/site-packages/_pytest/unraisableexception.py:78: PytestUnraisableExceptionWarning
---------------------------- Captured stdout setup -----------------------------
/builddir/build/BUILD/python-cachelib-0.13.0-build/cachelib-0.13.0/.pytest_cache/d/.xprocess/pylibmc$ memcached -vv
process 'pylibmc' started pid=318
slab class   1: chunk size        96 perslab   10922

slab class   2: chunk size       120 perslab    8738

slab class   3: chunk size       152 perslab    6898

slab class   4: chunk size       192 perslab    5461

slab class   5: chunk size       240 perslab    4369

slab class   6: chunk size       304 perslab    3449

slab class   7: chunk size       384 perslab    2730

slab class   8: chunk size       480 perslab    2184

slab class   9: chunk size       600 perslab    1747

slab class  10: chunk size       752 perslab    1394

slab class  11: chunk size       944 perslab    1110

slab class  12: chunk size      1184 perslab     885

slab class  13: chunk size      1480 perslab     708

slab class  14: chunk size      1856 perslab     564

slab class  15: chunk size      2320 perslab     451

slab class  16: chunk size      2904 perslab     361

slab class  17: chunk size      3632 perslab     288

slab class  18: chunk size      4544 perslab     230

slab class  19: chunk size      5680 perslab     184

slab class  20: chunk size      7104 perslab     147

slab class  21: chunk size      8880 perslab     118

slab class  22: chunk size     11104 perslab      94

slab class  23: chunk size     13880 perslab      75

slab class  24: chunk size     17352 perslab      60

slab class  25: chunk size     21696 perslab      48

slab class  26: chunk size     27120 perslab      38

slab class  27: chunk size     33904 perslab      30

slab class  28: chunk size     42384 perslab      24

slab class  29: chunk size     52984 perslab      19

slab class  30: chunk size     66232 perslab      15

slab class  31: chunk size     82792 perslab      12

slab class  32: chunk size    103496 perslab      10

slab class  33: chunk size    129376 perslab       8

slab class  34: chunk size    161720 perslab       6

slab class  35: chunk size    202152 perslab       5

slab class  36: chunk size    252696 perslab       4

slab class  37: chunk size    315872 perslab       3

slab class  38: chunk size    394840 perslab       2

slab class  39: chunk size    524288 perslab       2

<26 server listening (auto-negotiate)

pylibmc process startup detected

detailed build log: https://copr-be.cloud.fedoraproject.org/results/%40python/python3.13/fedora-rawhide-x86_64/07545359-python-cachelib/builder-live.log.gz

https://bugzilla.redhat.com/show_bug.cgi?id=2251780

Environment:

  • Python version: Python 3.13
  • CacheLib version: 0.13.0

Using set() of FileSystemCache raises errors on Ubuntu 20.04

WARNING:root:Exception raised while handling cache file '/home/site/wwwroot/deal_sourcing/flask_session/5651fc15999c50354843e09982ff80ed'
Traceback (most recent call last):
  File "/antenv/lib/python3.8/site-packages/cachelib/file.py", line 238, in set
    self._run_safely(os.replace, tmp, filename)
  File "/antenv/lib/python3.8/site-packages/cachelib/file.py", line 299, in _run_safely
    output = fn(*args, **kwargs)
FileNotFoundError: [Errno 2] No such file or directory: '/home/site/wwwroot/deal_sourcing/flask_session/tmp8qwf4_ww.__wz_cache' -> '/home/site/wwwroot/deal_sourcing/flask_session/5651fc15999c50354843e09982ff80ed'

We are running a Flask/Dash app as an Azure web service on Ubuntu 20.04, which uses MSAL and AAD to authenticate.
The Flask app repeatedly tries to re-authenticate and does not allow the user to navigate the app as desired.
The above errors appear in the Azure Application Logs.

Environment:

  • Python version: 3.8
  • CacheLib version: 0.6.0

We can mitigate the problem by editing _run_safely as follows (see the commented # lines):

def _run_safely(self, fn: _t.Callable, *args: _t.Any, **kwargs: _t.Any) -> _t.Any:
    """On Windows os.replace, os.chmod and open can yield
    permission errors if executed by two different processes."""
    # if platform.system() == "Windows":
    if True:
        output = None
        wait_step = 0.001
        max_sleep_time = 10.0
        total_sleep_time = 0.0

        while total_sleep_time < max_sleep_time:
            try:
                output = fn(*args, **kwargs)
            # except PermissionError:
            except OSError:
                sleep(wait_step)
                total_sleep_time += wait_step
                wait_step *= 2
            else:
                break
    else:
        output = fn(*args, **kwargs)

    return output

configurable serializer

I would like to be able to choose my serialization strategy when configuring caching. This would have benefits both for performance and for cache sharing.
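
A rough sketch of what one such strategy could look like, assuming the backend only ever calls dumps/loads on the configured object (the class below is illustrative, not an existing cachelib serializer):

import json
import typing as _t


class JsonSerializer:
    """Illustrative serialization strategy: human-readable and shareable
    across languages, but limited to JSON-compatible values."""

    def dumps(self, value: _t.Any) -> bytes:
        return json.dumps(value).encode("utf-8")

    def loads(self, data: bytes) -> _t.Any:
        return json.loads(data.decode("utf-8"))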

Universal timestamp storage in FileSystemCache

FileSystemCache implies that the serializer is framed, but some serializers (like json) are not framed and don't support multiple load/dump calls on a single file descriptor, since the serializer can't determine message borders. This complicates serializer integration.
https://github.com/pallets/cachelib/blob/3777a15cc01d55544bd63d6ffe7e680a823b58fa/src/cachelib/file.py#L231-L237
https://github.com/pallets/cachelib/blob/3777a15cc01d55544bd63d6ffe7e680a823b58fa/src/cachelib/file.py#L192-L195

Sure, a framing proxy could be implemented, but this would lead to storage and processing overhead. Since other backends don't use framed serialization, maybe we could implement serialization in a universal (non-framed) way?
For example, via struct:

with os.fdopen(fd, "wb") as f:
    f.write(struct.pack("I", timeout))
    self.serializer.dump(value, f)

with self._safe_stream_open(filename, "rb") as f:
    pickle_time = struct.unpack("I", f.read(4))[0]
    if pickle_time == 0 or pickle_time >= time():
        return self.serializer.load(f)

struct also reduces storage overhead (4 bytes vs. 17 bytes via pickle).
Another solution is to store the metadata in extended attributes (xattr on Linux/Darwin, EA on Windows). Modern filesystems (like XFS or ext4) have a large inode size (256 bytes by default), so 4 bytes won't be a problem; if we don't fit into the current inode, a new inode will be created. Theoretically, this should also reduce file operations for expired keys, since we work only with inodes, but additional research and benchmarks are required and I'm not sure about portability.

DynamoDb backend is polluting logging handlers

Hi guys,

could you please remove the logging.warning calls? They use the basic logging configuration, which automatically creates undesired logging handlers and causes duplicate logs in projects that configure their own logging. This issue existed before, but it was conditional; now it happens for everyone who does not have the boto3 library installed.

Replication:

Simply import the library and observe the handlers on the root logger.

Expectation:

There should be no handlers on the root logger.

Environment:

  • Python version: 3.8.16
  • CacheLib version: 0.10.0

Thank you for considering fixing this.
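
A sketch of one common way to avoid the implicit handler creation described above, assuming the backend switches from module-level logging.warning() to a named logger (this is not the actual cachelib code):

import logging

# A module-level logger keeps messages out of the root logger's configuration:
# emitting through it never triggers logging.basicConfig(), so no handlers are
# added behind the application's back.
logger = logging.getLogger(__name__)


def warn_boto3_missing() -> None:
    logger.warning("boto3 is required for the DynamoDb backend but is not installed.")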

File Cache gets EOFError

I'm seeing a case where sometimes I get:

  File "/.../site-packages/cachelib/file.py", line 147, in set
    self._prune()
  File "/.../site-packages/cachelib/file.py", line 96, in _prune
    expires = pickle.load(f)
EOFError: Ran out of input

Should this error be in the try/except block?
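
A sketch of the kind of guard being asked about, assuming a _prune-style loop over cache files (not the actual cachelib code):

import logging
import os
import pickle
from time import time


def remove_expired(filenames):
    """Prune-style loop that tolerates truncated or concurrently written files."""
    now = time()
    for fname in filenames:
        try:
            with open(fname, "rb") as f:
                expires = pickle.load(f)
        except (OSError, EOFError):
            # Reading inside try/except so a half-written file is skipped
            # instead of raising EOFError out of the prune step.
            logging.warning("Could not read cache file '%s'", fname, exc_info=True)
            continue
        if expires != 0 and expires <= now:
            try:
                os.remove(fname)
            except OSError:
                pass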

Fix readthedocs broken build

Creating this just for the sake of documentation: the docs build is currently breaking. I'll fix it along with the next release.

Request: multi-process support

cachelib currently supports a single process only.
If multiple processes handle the same cache directory, errors occur in the file count values.
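
A sketch of one way a shared count file could be protected across processes on POSIX systems (a dedicated counter file and advisory locking are assumptions, not how cachelib currently works):

import fcntl
import os


def update_count(count_path: str, delta: int) -> None:
    """Adjust a shared entry counter under an exclusive advisory lock."""
    fd = os.open(count_path, os.O_RDWR | os.O_CREAT, 0o600)
    try:
        fcntl.flock(fd, fcntl.LOCK_EX)  # blocks until other processes release
        raw = os.read(fd, 32)
        count = int(raw) if raw else 0
        os.lseek(fd, 0, os.SEEK_SET)
        os.ftruncate(fd, 0)
        os.write(fd, str(count + delta).encode("ascii"))
    finally:
        fcntl.flock(fd, fcntl.LOCK_UN)
        os.close(fd)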

Use github service containers in CI

Following #35, using service containers as an alternative to manually installing cachelib's external dependencies (e.g. memcached, redis, pylibmc headers) in the tests.yml workflow seems like an interesting option.

Some thoughts:

  • Two separate services will be necessary, one for Redis and one for memcached.
  • The pytest-xprocess fixtures should not start local instances of Redis and memcached when running under CI, since the containers will already be up. One way of going about this is checking one of the many environment variables set by the CI and having xprocess start (or not) based on the result (see the sketch after this list).
  • The GitHub macOS runners do not have Docker installed by default, so it will need to be set up.
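
A sketch of the environment-variable check from the second bullet; the fixture shape and variable choice are assumptions, not the current conftest code:

import os

import pytest


def running_in_ci() -> bool:
    # GitHub Actions sets GITHUB_ACTIONS=true (and CI=true) for every job.
    return os.environ.get("GITHUB_ACTIONS") == "true"


@pytest.fixture(scope="session")
def redis_server(xprocess):
    if running_in_ci():
        # The redis service container is already listening; start nothing locally.
        yield
        return
    # ... otherwise start a local redis-server through xprocess as today ...
    yield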

File Cache gets EOFError

The same bug described for cachelib version 0.1.1 in #21 seems to appear also in version 0.2.0.

I faced the same issue with 0.1.1:

[Tue Jul 06 09:57:39.680042 2021] [wsgi:error] [pid 172159] [client 127.0.0.1:38240] self._prune()
[Tue Jul 06 09:57:39.680048 2021] [wsgi:error] [pid 172159] [client 127.0.0.1:38240] File "/lib/python3.6/site-packages/flask_app/cachelib/file.py", line 96, in _prune
[Tue Jul 06 09:57:39.680053 2021] [wsgi:error] [pid 172159] [client 127.0.0.1:38240] expires = pickle.load(f)
[Tue Jul 06 09:57:39.680059 2021] [wsgi:error] [pid 172159] [client 127.0.0.1:38240] EOFError: Ran out of input

After replacing cachelib with the new version 0.2.0, this error appears again:

[Thu Jul 08 12:41:55.193613 2021] [wsgi:error] [pid 111282] [client 127.0.0.1:51574] self._prune()
[Thu Jul 08 12:41:55.193626 2021] [wsgi:error] [pid 111282] [client 127.0.0.1:51574] File "/lib/python3.6/site-packages/flask_app/cachelib/file.py", line 122, in _prune
[Thu Jul 08 12:41:55.193639 2021] [wsgi:error] [pid 111282] [client 127.0.0.1:51574] self._remove_expired(now)
[Thu Jul 08 12:41:55.193652 2021] [wsgi:error] [pid 111282] [client 127.0.0.1:51574] File "/lib/python3.6/site-packages/flask_app/cachelib/file.py", line 91, in _remove_expired
[Thu Jul 08 12:41:55.193666 2021] [wsgi:error] [pid 111282] [client 127.0.0.1:51574] expires = pickle.load(f)
[Thu Jul 08 12:41:55.193679 2021] [wsgi:error] [pid 111282] [client 127.0.0.1:51574] EOFError: Ran out of input

In both cases, the error is in pickle.load(f).

Environment:

  • Python version: python3.6

Allow passing serializer as a class parameter.

Following the discussion in #11 and the improvement done in #63, where custom serializers were added for each cache backend, I think it would be very nice to have a few generic serializers (Pickle, JSON, etc.) that could be passed as a parameter when initialising the cache backends.

Something that would look like:

from cachelib import FileSystemCache, RedisCache
from cachelib.serializers import JsonSerializer, PickleSerializer

redis_cache = RedisCache(serializer=JsonSerializer)
file_cache = FileSystemCache(serializer=PickleSerializer)

Each backend cache would obviously have a default serializer for backwards compatibility.

This would allow using more secure serialization alternatives than Pickle.
The ultimate goal that I would like to achieve would be to be able to use a custom serializer with the Flask-Caching library.

I could try to work out a solution for this and submit a PR if you think this approach would make sense.

PermissionError [WinError 5] [Errno 13] with FileSystemCache on Windows 10

Using set() and get() of FileSystemCache raises exceptions on Windows 10 when called very fast.

PermissionError: [Errno 13] Permission denied ...
PermissionError: [WinError 5] Access is denied ...

There are no errors when running the same script on Ubuntu 20.04.

How to replicate (optional increase range):

import cachelib
import threading

fsc = cachelib.file.FileSystemCache('.')

def set_get(i):
    fsc.set('key', 'val')
    val = fsc.get('key')

for i in range(10):
    t = threading.Thread(target=set_get, args=(i,))
    t.start()

Randomly generates tracebacks like:

WARNING:root:Exception raised while handling cache file '.\3c6e0b8a9c15224a8228b9a98ca1531d'
Traceback (most recent call last):
File "C:\Users\User...\lib\site-packages\cachelib\file.py", line 183, in get
with open(filename, "rb") as f:
PermissionError: [Errno 13] Permission denied: '.\3c6e0b8a9c15224a8228b9a98ca1531d'

WARNING:root:Exception raised while handling cache file '.\3c6e0b8a9c15224a8228b9a98ca1531d'
Traceback (most recent call last):
File "C:\Users\User...\lib\site-packages\cachelib\file.py", line 228, in set
os.replace(tmp, filename)
PermissionError: [WinError 5] Access is denied: 'C:\Users\User\python3.8\venvs\imapclient\cachelib_test\tmpop5jfhp8.__wz_cache' -> '.\3c6e0b8a9c15224a8228b9a98ca1531d'

Expected behavior is no errors. I bounced into this with Flask-Session using Cachelib FileSystemCache on Windows 10 when sessions got lost/not updated. Maybe it has to do with the fact that Windows 10 is running in a VirtualBox(?). I did not try this on a native Windows 10 system. Maybe it has to do with the implementation of os.replace on Windows(?).

Environment:

  • CacheLib version: 0.4.1
  • Windows 10 Pro is running in VirtualBox on Ubuntu 20.04.
  • Python version: Python 3.8.10 (tags/v3.8.10:3d8993a, May 3 2021, 11:48:03) [MSC v.1928 64 bit (AMD64)] on win32.
  • The script is running in a virtual environment.

Pickled integer

Integer values are pickled in RedisCache.
In the Redis DB, the value looks like the byte string !\x80\x04\x95\x04\x00\x00\x00\x00\x00\x00\x00M\xC2z. rather than 31426.

Environment:

  • Python version: 3.8
  • CacheLib version: 0.3.0

Add new function to fetch all existing cache entries

Function

When I use FileSystemCache, I could not find a good way to get information about all cache entries and manage them.

Could you please add a function that returns all cache entries so they can be managed?

Suggestion About Achieving This Function

One Way To Achieve It

    def get_all(self) -> _t.Any:
        infos = []
        for fname in self._list_dir():
            try:
                with self._safe_stream_open(fname, "rb") as f:
                    pickle_time = struct.unpack("I", f.read(4))[0]
                    if pickle_time == 0 or pickle_time >= time():
                        infos.append((fname, self.serializer.load(f)))
            except FileNotFoundError:
                pass
            except (OSError, EOFError, struct.error):
                logging.warning(
                    "Exception raised while handling cache file '%s'",
                    fname,
                    exc_info=True,
                )
        return infos

    def remove_from_fname(self, fname: str) -> None:
        try:
            os.remove(fname)
            self._update_count(delta=-1)
        except FileNotFoundError:
            pass
        except (OSError, EOFError, struct.error):
            logging.warning(
                "Exception raised while handling cache file '%s'",
                fname,
                exc_info=True,
            )

It would be nice to provide functions for other cache management methods.

cachelib writing string values with odd encoding

I'm using cachelib from Python 3.7 to write simple string values into Redis, with the set() method.

from cachelib import RedisCache

cache = RedisCache()
cache.set('FirstName', 'Trevor')

Using the redis-cli, you can see the value when it's set from cachelib (first), versus set with the redis-cli itself (second).

127.0.0.1:6379> get FirstName
"!\x80\x03X\b\x00\x00\x00Trevorq\x00."
127.0.0.1:6379> set FirstName trevor
OK
127.0.0.1:6379> get FirstName
"trevor"

Is there an explanation for why cachelib is writing data in this format?
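
The leading "!" plus pickle bytes matches the RedisSerializer.dumps shown in a later issue, which pickles every value and prefixes it with b"!". A sketch of reading such a value back from Python, assuming that format:

import pickle

import redis

r = redis.Redis()
raw = r.get("FirstName")          # e.g. b'!\x80\x03X\x08\x00\x00\x00Trevorq\x00.'
if raw is not None and raw.startswith(b"!"):
    print(pickle.loads(raw[1:]))  # -> 'Trevor'
else:
    print(raw)                    # value was set by something else, e.g. redis-cli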

FileSystemCache warns about FileNotFoundError exception

FileSystemCache logs a warning with FileNotFoundError exception if working with files that do not exist.

To reproduce, simply call FileSystemCache.get on non-existing file (a new key).

I think there should be no such logs, because this is the most common case - when you work with cache, you always try to get the value first and set the value only if it does not exist. There can be a lot of such logs and they pollute the log.

Also, looking at the code, I don't think this affects only FileSystemCache.get. I would reconsider the other places with these logs as well; for example, consider the situation where two processes start pruning cached files and collide, trying to delete the same file twice.

Environment:

  • Python version: 3.8
  • CacheLib version: 0.3.0

Thank you for your opinion on this.
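
A sketch of the requested behaviour on the read path, assuming a get-style helper (not the actual cachelib code): a missing file is reported as an ordinary miss, and only unexpected errors are logged.

import logging
import pickle


def read_cache_file(filename):
    """Return the stored value, or None for a miss (including a missing file)."""
    try:
        with open(filename, "rb") as f:
            return pickle.load(f)
    except FileNotFoundError:
        # Most common path for a fresh key: silently report a miss.
        return None
    except (OSError, EOFError, pickle.PickleError):
        logging.warning(
            "Exception raised while handling cache file '%s'", filename, exc_info=True
        )
        return None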

Tests Missing

The project currently has no tests. It would be good to have tests early on, as this would make for a safer and more robust development going forward.

Diverging API

Quoting from #48:

An interesting thing is how the api "uniformity" is reflected in BaseCache as type hints are added to the codebase.

For example:

# BaseCache

def delete_many(...) -> _t.Union[bool, _t.Optional[int]]

def set_many(...) -> _t.Union[bool, _t.List[_t.Any]]

def has(self, key: str) -> _t.Union[bool, int]

...we can see how delete_many, set_many and has methods have different return types across different cache clients. This means our cache types are diverging from the common interface, which is bad AFAIK since it makes the cache types not interchangeable (code written for a given cache type might not work for others) and it's also less intuitive for the user ("set_many returns a boolean for cache X but a list for Y (?)")...

Being able to swap between different cache types without the need to change any code with a 100% guarantee that it will work is something I would like to see in cachelib. For that, I plan on writing a PR to minimize as much as possible the differences in our API (described above) and finally turn BaseCache into a formal interface with something like python abc, thus enforcing it for all supported cache types (and the ones possibly yet to come). This would also ensure the project grows in a uniform way, always abiding to BaseCache.

Since this is a fairly large change and would touch the public API I would like to hear what people have to say. Any thoughts are welcome :bowtie:
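
A minimal sketch of the formal interface idea using abc, with illustrative signatures (the final return types would be whatever the uniformity work settles on):

import abc
import typing as _t


class BaseCache(abc.ABC):
    """Formal interface: every backend must expose the same methods and types."""

    @abc.abstractmethod
    def has(self, key: str) -> bool:
        ...

    @abc.abstractmethod
    def set_many(self, mapping: _t.Dict[str, _t.Any]) -> _t.List[_t.Any]:
        ...

    @abc.abstractmethod
    def delete_many(self, *keys: str) -> _t.List[_t.Any]:
        ...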

Tests corresponding to test_memcached_cache.py file are failing on the Linux platform.

I am building and testing cachelib on the Linux platform. I have followed the below steps to build and test cachelib:

BUILD STEPS:

$ git clone https://github.com/pallets/cachelib
$ cd cachelib
$ python3 -m venv env
$ . env/bin/activate
$ pip install -e . -r requirements/dev.txt
$ pre-commit install

TEST STEPS:

$ pytest

Build is successful, but tests corresponding to test_memcached_cache.py file are failing. Below is the short test summary after pytest:

E  TimeoutError: The provided start pattern server listening could not be matched within the specified time interval of 120 seconds

env/lib/python3.9/site-packages/xprocess.py:284: TimeoutError
======================================================== short test summary info ========================================================
ERROR tests/test_memcached_cache.py::TestMemcachedCache::test_clear - TimeoutError: The provided start pattern server listening could ...
ERROR tests/test_memcached_cache.py::TestMemcachedCache::test_set_get - TimeoutError: The provided start pattern server listening coul...
ERROR tests/test_memcached_cache.py::TestMemcachedCache::test_set_get_many - TimeoutError: The provided start pattern server listening...
ERROR tests/test_memcached_cache.py::TestMemcachedCache::test_get_dict - TimeoutError: The provided start pattern server listening cou...
ERROR tests/test_memcached_cache.py::TestMemcachedCache::test_delete - TimeoutError: The provided start pattern server listening could...
ERROR tests/test_memcached_cache.py::TestMemcachedCache::test_delete_many - TimeoutError: The provided start pattern server listening ...
ERROR tests/test_memcached_cache.py::TestMemcachedCache::test_add - TimeoutError: The provided start pattern server listening could no...
ERROR tests/test_memcached_cache.py::TestMemcachedCache::test_inc_dec - TimeoutError: The provided start pattern server listening coul...
ERROR tests/test_memcached_cache.py::TestMemcachedCache::test_expiration - TimeoutError: The provided start pattern server listening c...
ERROR tests/test_memcached_cache.py::TestMemcachedCache::test_has - TimeoutError: The provided start pattern server listening could no...
========================================= 34 passed, 1 skipped, 10 errors in 137.85s (0:02:17) ==========================================

Please find the detailed error logs here: cachelib_memcache_failing_tests_logs.txt

Memcached was not installed on my system, so these tests were getting skipped.
I then installed memcached, libmemcached-dev, zlib1g-dev, and libmemcached-tools from the apt repo, and installed pylibmc, pymemcache, and python-memcached using pip. But the issue is always the same: the tests are failing.
I have restarted and checked the memcached service using service memcached restart; the tests are failing even with memcached in an active state.

Can you please provide some pointers on how to get these failing tests to pass?
Please let me know if more information is required.

Environment:

  • Python version: 3.8
  • Cachelib version: master branch

FIPS Compliance

Using this library in a FIPS-enforced environment causes the application using it to be halted. Although the use of md5 here is valid, it is caught by the FIPS-enforced environment.

hash_method: _t.Any = md5,
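
Since hash_method is exposed as a constructor parameter (its md5 default is the line quoted above), one possible mitigation is passing a FIPS-approved hash when creating the cache; a sketch, assuming FileSystemCache accepts any hashlib-style callable there:

from hashlib import sha256

from cachelib import FileSystemCache

# md5 is only used to map keys to file names, but FIPS mode rejects it anyway;
# sha256 keeps the same behaviour with an approved algorithm.
cache = FileSystemCache("/tmp/fips-cache", hash_method=sha256)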

Cachelib on heavy load

Flask-Session uses CacheLib for filesystem session storage.

A Flask-based webapp was tested with the Apache Benchmark tool, sending 1000 requests with concurrency 100.
After approx. 500 requests, the Apache logs start being flooded with warnings like this:

[Wed Mar 20 19:03:07.814067 2024] [wsgi:error] [pid 23776] [client 10.0.11.184:45501] Traceback (most recent call last):
[Wed Mar 20 19:03:07.814068 2024] [wsgi:error] [pid 23776] [client 10.0.11.184:45501]   File "/usr/local/lib/python3.8/site-packages/cachelib/file.py", line 122, in _remove_expired
[Wed Mar 20 19:03:07.814069 2024] [wsgi:error] [pid 23776] [client 10.0.11.184:45501]     expires = struct.unpack("I", f.read(4))[0]
[Wed Mar 20 19:03:07.814071 2024] [wsgi:error] [pid 23776] [client 10.0.11.184:45501] struct.error: unpack requires a buffer of 4 bytes

To reproduce, use Flask-Session in a web app, then test it with

ab -n 1000 -c 100 http://web-app.addr

and check the error logs.

Environment:

  • Python version: 3.8
  • CacheLib version: 0.12.0
  • Centos version 7.7
  • Flask-Session 0.3.2

Not sure, but maybe the ulimit for open files is a factor:

su - apache -s /bin/bash  -c 'ulimit -Ha' | grep 'open files' 
open files                      (-n) 4096

su - apache -s /bin/bash  -c 'ulimit -Sa' | grep 'open files' 
open files                      (-n) 1024
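
A sketch of a defensive read that would turn the truncated header above into a skipped entry instead of a struct.error, assuming the 4-byte expiry prefix that FileSystemCache writes:

import struct


def read_expiry(f):
    """Return the 4-byte expiry header, or None if the file is truncated
    (e.g. still being written by another worker under heavy load)."""
    header = f.read(4)
    if len(header) != 4:
        return None
    return struct.unpack("I", header)[0]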

sign cache values

I have a suggestion: it should be possible to sign (apply an HMAC to) cache values, in the same way werkzeug.contrib.securecookie already does.

pickle is used as the serializer for the content. While this is absolutely fine as long as nobody can access the underlying cache backend (Redis, FS, Memcached), it may allow privilege escalation once an attacker gains access to it, as pickle allows storing arbitrary code.

Proposal:

  1. Add a warning to the documentation.
  2. Add the option to pass a signing key to sign the results, and raise a warning if no signing key is passed at initialization.
  3. Deprecate not using a signing key and ultimately enforce using one.

In practice, Pallets' ItsDangerous could be used here.
If wanted, I can create a pull request implementing my proposal.
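
A sketch of what the proposal could look like as a serializer wrapper built on ItsDangerous (names and wiring are illustrative, not an existing cachelib API):

import pickle
import typing as _t

from itsdangerous import BadSignature, Signer


class SignedPickleSerializer:
    """Pickle values, but refuse to unpickle anything that is not signed with
    our key, so a compromised cache backend cannot inject pickle payloads."""

    def __init__(self, secret_key: bytes) -> None:
        self._signer = Signer(secret_key)

    def dumps(self, value: _t.Any) -> bytes:
        return self._signer.sign(pickle.dumps(value))

    def loads(self, data: bytes) -> _t.Any:
        try:
            payload = self._signer.unsign(data)
        except BadSignature:
            return None  # treat tampered entries as cache misses
        return pickle.loads(payload)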

Failing tests (and isolating network tests to be skipped)

Sometimes it's useful to have the tests that use the network marked so they can be skipped easily when we know the network is not available.

This is useful for example on SUSE and openSUSE's build servers. When building our packages the network is disabled so we can assure reproducible builds (among other benefits). With this mark, it's easier to skip tests that can not succeed.

The %check section of our SPEC file is:

%check
# set up working directory
export BASETEMP=$(mktemp -d -t cachelib_test.XXXXXX)
trap "rm -rf ${BASETEMP}" EXIT
# Allow finding memcached
export PATH="%{_sbindir}/:$PATH"
export PYTEST_ADDOPTS="--capture=tee-sys --tb=short --basetemp=${BASETEMP}"
%pytest -rs

(%pytest means basically pytest -v with some variables set to take into consideration building environment).

When running only plain pytest, I get result “11 failed, 117 passed, 1 skipped, 3 errors”.

Complete build log in this situation

Obviously the hot candidates are the tests using DynamoDB, which is completely inaccessible, so I have created this patch to mark those tests as network-requiring so they can be easily skipped:

---
 setup.cfg                          |    3 +++
 tests/test_dynamodb_cache.py       |    1 +
 tests/test_interface_uniformity.py |    1 +
 tests/test_redis_cache.py          |    2 +-
 4 files changed, 6 insertions(+), 1 deletion(-)

--- a/setup.cfg
+++ b/setup.cfg
@@ -34,11 +34,14 @@ python_requires = >= 3.7
 where = src
 
 [tool:pytest]
+addopts = --strict-markers
 testpaths = tests
 filterwarnings = 
        error
        default::DeprecationWarning:cachelib.uwsgi
        default::DeprecationWarning:cachelib.redis
+markers =
+       network: mark a test which requires net access
 
 [coverage:run]
 branch = True
--- a/tests/test_dynamodb_cache.py
+++ b/tests/test_dynamodb_cache.py
@@ -29,5 +29,6 @@ def cache_factory(request):
         request.cls.cache_factory = _factory
 
 
+@pytest.mark.network
 class TestDynamoDbCache(CommonTests, ClearTests, HasTests):
     pass
--- a/tests/test_interface_uniformity.py
+++ b/tests/test_interface_uniformity.py
@@ -19,6 +19,7 @@ def create_cache_list(request, tmpdir):
     request.cls.cache_list = [FileSystemCache(tmpdir), mc, rc, SimpleCache()]
 
 
+@pytest.mark.network
 @pytest.mark.usefixtures("redis_server", "memcached_server")
 class TestInterfaceUniformity:
     def test_types_have_all_base_methods(self):

Just by applying this patch (and adding -k "not network" to my pytest invocation) I get much better results: “117 passed, 1 skipped, 12 deselected, 2 errors”. Again, complete build log in this situation.

Unfortunately, I don’t know how to skip those remaining two erroring tests. Both of them use such complicated constructs that I don’t know where to put @pytest.mark.skip or @pytest.mark.network, and none of my attempts made any difference. The only method which actually works (but I really don’t like it) is --ignore=tests/test_redis_cache.py --ignore=tests/test_memcached_cache.py, which truly makes the test suite pass.

Any ideas, how to make the test suite working even without network access? Do I do something wrong in arranging my test environment?

Environment:

  • Python version: various versions, this particular errors are from “Python 3.8.16”
  • CacheLib version: 0.10.2 from the tarball from PyPI.

RedisSerializer.dumps() implementation contradicts with its doc string

The docstring of RedisSerializer.dumps() suggests an integer value will be serialized as a string, but the actual implementation serializes any value with pickle regardless of its type.

The code is duplicated here for reference.

def dumps(self, value: _t.Any, protocol: int = pickle.HIGHEST_PROTOCOL) -> bytes:
    """Dumps an object into a string for redis. By default it serializes
    integers as regular string and pickle dumps everything else.
    """
    return b"!" + pickle.dumps(value, protocol)

Admittedly, this won't cause a runtime error, as RedisSerializer.loads() can still handle this case. However, to avoid confusion, either the docstring or the code should be updated to bring them into agreement. Preferably, the code should be updated, because serializing an int as a string is the legacy behaviour from when the Redis serialization code was still contained in the flask-caching repo.
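
For comparison, a dumps that matched the docstring would look roughly like this (a sketch of the described legacy behaviour, not code from either repository):

import pickle
import typing as _t


def dumps(value: _t.Any, protocol: int = pickle.HIGHEST_PROTOCOL) -> bytes:
    # Integers become plain ASCII strings, as the docstring describes;
    # everything else is prefixed and pickled.
    if isinstance(value, int) and not isinstance(value, bool):
        return str(value).encode("ascii")
    return b"!" + pickle.dumps(value, protocol)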

Environment:

  • Python version: irrelevant
  • CacheLib version: v2.0.1

Breaking changes on 0.4.0

There is a type hint on line 31 of redis.py requiring the host to be a string, and if host is not a string (as in the rediscluster case) then self.__client is not available at line 51 of redis.py, because the "else:" branch was removed. For everyone who is passing a rediscluster client as host, this change will break their application.

Environment:

  • Python version: 3.7
  • CacheLib version: 0.4.0

Add typing with mypy

Type hints help with maintaining a clear project architecture, debugging, code documentation, linting (due to static analysis) and several other benefits. Following other pallets projects I plan on adding typing to cachelib too. The issue is to open this for discussion in case anyone has suggestions 🚀
