Comments (4)
Hi @karolzlot
I don't think slowapi will do what you want. If a request takes more than 1 second to complete, then when the second request comes in, say 1.1 s after the first one started, the limiter backend would be queried and return nothing (because the time window has expired), so the second request would go ahead, and I don't think that is what you want.
You could probably hack the code to replace time-based windows with something that expires when your request completes, but at that point I think what you need is something like a mutex or semaphore, which will be less cumbersome.
I'd be curious to know what your use case is, if you can share?
from slowapi.
I would use "1 per second" granularity.
But I wonder if concurrency will still be limited if requests take more than 1 second in this case?
Thank you @laurentS
I was thinking about using file locks, but I will also check your suggestions about mutex and semaphore.
My use case:
I have many endpoints which work like this:
- Database is queried to check for new tasks to do. (For example take rows which have A column filled, but B column empty)
- Do all tasks (in a `for` loop)
- Return "successfully finished /endpoint_abc" or a similar message
Those endpoints are usually fired soon after new tasks become available (and also every 10 minutes, just in case). If two or more tasks become available at the same time, it could cause race conditions (each task would be executed two or more times, which could result in, for example, the same email being sent twice or the same invoice being issued twice).
To avoid worrying about race conditions, the easiest solution is to limit concurrency to 1. The speedup from concurrency wouldn't matter in this case anyway, and I could always use async inside the endpoint if needed.
Those endpoints don't take any REST parameters (by design). They are just fired, they know what to do, and they only return a short message to confirm that everything went OK.
Because I need to limit concurrency to exactly 1, I don't need a solution that also works in a multi-server setup (I don't need multiple servers for endpoints that are limited to a concurrency of 1 anyway).
In the past I used two solutions for limiting concurrency:
- Flask server in debug mode (it is limited by design)
- Run app in Google Cloud Run and set concurrency to 1
Now I just want to do the same, but with FastAPI and without Google Cloud Run. Also, those solutions limit concurrency per server, whereas I would prefer to limit concurrency per endpoint.
Also, the best solution would be if an endpoint could just wait a bit when the same endpoint is already executing.
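The per-endpoint wait-and-queue behavior described above can be sketched with one `asyncio.Lock` per endpoint name (a sketch under assumptions: the `run_exclusively` helper and the endpoint names are hypothetical, not slowapi or FastAPI APIs):

```python
import asyncio
from collections import defaultdict

# One asyncio.Lock per endpoint name, created lazily on first use.
_locks = defaultdict(asyncio.Lock)

async def run_exclusively(endpoint_name, work):
    """Queue behind any in-flight call to the same endpoint, then run."""
    async with _locks[endpoint_name]:
        return await work()

async def job():
    await asyncio.sleep(0.01)  # stand-in for the endpoint's task loop
    return "ok"

async def main():
    # The two /endpoint_abc calls serialize with each other,
    # but not with the /endpoint_xyz call.
    return await asyncio.gather(
        run_exclusively("/endpoint_abc", job),
        run_exclusively("/endpoint_abc", job),
        run_exclusively("/endpoint_xyz", job),
    )

out = asyncio.run(main())
print(out)  # ['ok', 'ok', 'ok']
```

A caller to a busy endpoint simply waits for the lock, which matches the "just wait a bit" behavior, while different endpoints stay independent.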
Now that I have read more about it, I think some kind of database lock may suit me better.
It's not that I need to limit concurrency per endpoint; it's more that I need to limit concurrency per row in the database.
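The per-row idea can also be done without an explicit lock by claiming rows atomically: a single `UPDATE` that only succeeds while the row is still unclaimed, so two workers can never process the same row twice. A sketch using `sqlite3` (the `tasks` table and column names are made up for illustration; on PostgreSQL, `SELECT ... FOR UPDATE SKIP LOCKED` serves the same purpose):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, a TEXT, b TEXT)")
conn.execute("INSERT INTO tasks (a, b) VALUES ('payload', NULL)")
conn.commit()

def claim_task(conn, task_id):
    """Atomically claim a row: the UPDATE matches only while b is still empty,
    so exactly one caller sees rowcount == 1."""
    cur = conn.execute(
        "UPDATE tasks SET b = 'claimed' WHERE id = ? AND b IS NULL",
        (task_id,),
    )
    conn.commit()
    return cur.rowcount == 1  # True only for the first claimer

first = claim_task(conn, 1)
second = claim_task(conn, 1)
print(first, second)  # True False
```

Whichever caller loses the race gets `False` and skips the row, so duplicate emails or invoices cannot happen even with concurrent endpoint runs.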