Comments (8)
@adamchainz good to know, will take a look!
@mahyard the current main branch contains significant breaking changes, including a rename of the module from ratelimit to django_ratelimit. That's one of the reasons a new release isn't immediately forthcoming. There are other significant breaks (like changing the default value of the block= kwarg) under consideration, and I'd rather not have to do two major releases back to back.
Also, regardless of how far away mid-Oct is, I cannot guarantee my free time between now and then. This is a project I care about but not one that I am paid to work on. Please be understanding of that.
from django-ratelimit.
OK I made a couple of clean-up changes to get to 3.2 first: #248 and #249.
Basically I think there are three ways we can go about this:
- Don't parametrize builds over cache backend. Treat additional memcached backends like we treat memcached/redis right now, i.e. install them all, define different caches in the settings, and set RATELIMIT_USE_CACHE explicitly for certain tests. We might need to write some additional tests to ensure certain behavior works across different cache backends. We'd probably want to spin up memcached and redis and do some real integration tests. This would probably push me toward "break up the test file into its own folder/module", too.
- Parametrize the builds across cache backends and run the full test suite against each one. I'd say the set would be loc-mem, redis, python-memcached, pymemcache, and pylibmc. We could target the build matrix so that pymemcache only ran on Django 3.2. This would be more work, but would be much more thorough. It would likely mean dropping tox and saying GH Actions are just how we do it.
- Don't worry too much about this, ignore pylibmc, and just configure the build to switch from python-memcached to pymemcache on Django 3.2.
I don't have a strong opinion between 1 and 2, but I am slightly leaning toward number 2, "parametrize". I don't like 3: it's the easiest but would likely mean that pylibmc was broken and we wouldn't know it.
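For option 2, the build matrix could be sketched roughly like this in the Actions workflow. This is only an illustration: the Django versions, the matrix key names, and the CACHE environment variable are assumptions, not the project's actual configuration.

```yaml
# Illustrative GitHub Actions matrix for option 2 (names are assumptions)
strategy:
  matrix:
    django: ['2.2', '3.1', '3.2']
    cache: [locmem, redis, python-memcached, pymemcache, pylibmc]
    exclude:
      # pymemcache support only exists on Django 3.2
      - { cache: pymemcache, django: '2.2' }
      - { cache: pymemcache, django: '3.1' }
env:
  CACHE: ${{ matrix.cache }}
```

The test settings would then read the CACHE variable to pick the backend, as sketched further below.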
Some initial thoughts about what 2 might look like:
- Remove the dummy cache test; it's not doing anything important.
- Add redis and memcached service containers to the action definition. Ideally there'd be a way to condition that on this parametrization, but I'm happy to make that a follow-up.
- In the test settings, make the CACHES setting depend on an environment variable specifying the backend/client we're using. It would define something like default and default-connection-error as below.
- Rewrite the existing connection error tests to define the behavior, regardless of cache backend.
# in test_settings.py
import os

from django.core.exceptions import ImproperlyConfigured

cache_type = os.getenv('CACHE')

if cache_type == 'python-memcached':
    CACHES = {
        'default': {
            'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
            'LOCATION': 'service-container:11211',
        },
        'default-connection-error': {
            'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
            'LOCATION': 'invalid-host:1234',
        },
    }
elif cache_type == 'pymemcache':
    CACHES = {
        'default': {
            'BACKEND': 'django.core.cache.backends.memcached.PyMemcacheCache',
            # ... etc ...
        },
        # ... etc ...
    }
elif cache_type == 'pylibmc':
    ...  # as above, with PyLibMCCache
elif cache_type == 'redis':
    ...
elif cache_type == 'locmem':
    ...
else:
    raise ImproperlyConfigured('unknown cache backend specified for tests')
and so on. I'm not 100% sure how to handle "connection error" tests for LocMem, but if we're going to special-case anything in the tests, I'd rather it be LocMem, which is primarily a development tool and can't fail to connect, than one of the real backends.
This is also an eventual look at test_settings.py. I would not want to try to add pymemcache and pylibmc at the same time; I think that introduces too much risk and uncertainty. At first it would just be python-memcached, redis, and locmem. Then we could add pylibmc (it's been around longer) and then pymemcache separately, and deal with questions about how to manage exceptions from each library in the actual code independently of getting the test runner into shape to handle it.
Hi @jsocol,
The more time you put into the planning phase, the better the software you'll have at the end, so I totally agree with you about taking a step back and reviewing the test procedure.
However, I have some notes I hope you'll consider:
- Actually, I feel much more comfortable with tox than with GitHub Actions, because GH needs a commit/push, has a delay, and runs all the tests each time. On the other hand, tox works offline and can test a single environment on demand. I don't know how much I can help you here in the future, but I want to ask you to keep this feature, because many developers may be like me.
- As you may know, we at edX are on a deadline to bump our Django to version 3.2 by mid-October. Considering that django-ratelimit, without any changes, supports Django 3.2 using python-memcached/locmem as its cache backend, can we expect a release tag before attempting to support other backends? I believe there have been some improvements since v3.0.1.
I like option 2. Why drop tox, though? Check out the configuration at django-cachalot; we can keep tox and still use GitHub Actions.
I agree with not needing to test pylibmc for now (though the errors generated there are exactly the same as pymemcache's). I think we should just skip the connection-error tests for locmem, since it's just a Python dictionary.
Also, please don't feel pressured to release soon. If someone needs to use this software, they should vendor it and manage it themselves.
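Skipping the locmem connection-error tests could look something like this unittest-based sketch. The CACHE environment variable, the decorator name, and the test class are all illustrative assumptions, not existing code in the project.

```python
# Hypothetical sketch: skip connection-error tests when locmem is the
# selected backend, since an in-process dict cache cannot fail to connect.
import os
import unittest

# Assumed convention: CI sets CACHE to pick the backend; default to locmem.
CACHE_TYPE = os.environ.get('CACHE', 'locmem')

# Decorator that skips a test unless the backend can raise connection errors.
requires_real_cache = unittest.skipIf(
    CACHE_TYPE == 'locmem',
    'locmem cannot raise connection errors',
)

class ConnectionErrorTests(unittest.TestCase):
    @requires_real_cache
    def test_ratelimit_behavior_on_connection_error(self):
        # Would exercise the 'default-connection-error' cache alias here.
        pass
```

With CACHE unset (or set to locmem) the test is reported as skipped rather than failing, so the same suite runs unmodified against every backend in the matrix.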
I don't really care about dropping tox specifically; I just want one way to manage the tests. But I am worried about the lack of ability to define dependencies like memcached and redis if we go that way. I'll check out the cachalot test config!
As for releasing soon, it's probably OK to say 3.2 is supported and that, while pylibmc and pymemcache are considered experimental, issues with them are bugs that we'll work to fix. Maybe 4.0 is just the current set of changes, and they can come in 4.1 or 4.2.
However, I can't make promises about my time and availability. I try to make progress here, but I also need to spend time away from coding outside of work. If you're on a deadline @mahyard, I'd recommend vendoring the current snapshot or another known-working commit, or installing directly from git pinned to a specific commit.
FYI on tox + github actions, you can check my projects, e.g. https://github.com/adamchainz/django-htmx or https://github.com/adamchainz/django-mysql/ . I run one github actions container per python version and then use a little tox plugin I wrote to run all tox environments matching that python version.
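That setup typically pairs a factor-style envlist with a plugin-provided filter so each Actions job only runs the environments for its Python version. A rough sketch, with illustrative env names and version pins that are assumptions rather than this project's actual matrix:

```ini
; tox.ini sketch (env names and pins are illustrative)
[tox]
envlist =
    py{36,37,38,39}-django{22,31,32}

[testenv]
deps =
    django22: Django>=2.2,<3.0
    django31: Django>=3.1,<3.2
    django32: Django>=3.2,<3.3
commands = python -m pytest {posargs}
```

Each CI job would then invoke tox filtered to the interpreter it was started with, while local runs can still target a single environment, e.g. `tox -e py39-django32`.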
Indeed, I didn't mean to ask for a major release. What I was looking for is a minor version containing just the current changes, under the assumption that the latest commit on the master branch doesn't break anything relative to v3.0.1. It could be named v3.0.2, for example.
BTW, there is a long time until mid-October. Besides, while we do not intend to vendor an active project, pinning to a specific commit ID could be an option, I guess (though it's a not-recommended approach).
Absolutely, there is no rush and no guarantee asked; I didn't want you to do anything more than what you are already doing. I'm thankful for your efforts, and what you and the other contributors here have provided for the IT world is really precious. Maybe my bad English caused you to think that I wanted to push you to do more.
Anyway, I apologize.