
Comments (4)

laurentS commented on May 20, 2024

Hi @karolzlot
I don't think slowapi will do what you want. If a request takes more than 1 second to complete, then when the second request comes in, say 1.1 s after the first one started, the limiter backend would be queried and return nothing (because the time window has expired), so the second request could go ahead, and I don't think this is what you want.
You could probably hack the code to replace time-based windows with something that expires when your request completes, but at that point I think what you need is something like a mutex or semaphore, which would be less cumbersome.

I'd be curious to know what your use case is, if you can share?


karolzlot commented on May 20, 2024

I would use '1 per second' granularity.
But I wonder whether concurrency would still be limited in this case if requests take more than 1 second?


karolzlot commented on May 20, 2024

Thank you @laurentS

I was thinking about using file locks, but I will also check your suggestions about mutex and semaphore.


My use case:

I have many endpoints which work like this:

  1. The database is queried to check for new tasks to do (for example, take rows where column A is filled but column B is empty).
  2. Do all tasks (in a for loop).
  3. Return "successfully finished /endpoint_abc" or similar message

Those endpoints are usually fired soon after new tasks become available (and also every 10 minutes, just in case). If two or more tasks become available at the same time, this could cause race conditions: each task would be executed two or more times, resulting in, for example, the same email being sent twice or the same invoice being issued twice.

To avoid worrying about race conditions, the easiest solution is to limit concurrency to 1. The speedup from concurrency wouldn't matter much in this case anyway, and I could always use async inside the endpoint if needed.

Those endpoints don't take any REST parameters (by design). They are simply fired, know what to do, and only return a short message to confirm that everything went OK.

Because I need to limit concurrency to exactly 1, I don't need a solution that also works in a multi-server setup (I don't need multiple servers for endpoints that are limited to a concurrency of 1 anyway 😄).

In the past I used two solutions for limiting concurrency:

  1. Flask server in debug mode (which is limited by design)
  2. Run app in Google Cloud Run and set concurrency to 1

Now I want to do the same, but with FastAPI and without Google Cloud Run. Also, those solutions limit concurrency per server, whereas I would prefer to limit concurrency per endpoint.

Ideally, an endpoint would just wait a bit if the same endpoint is already executing.


karolzlot commented on May 20, 2024

Now that I've read more about it, I think some kind of database lock may suit me better.

It's not that I need to limit concurrency per endpoint, it's more that I need to limit concurrency per row in database.
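(Per-row limiting can be sketched with an atomic "claim" update. This uses sqlite3 purely for illustration; on a server database you would reach for row locks such as PostgreSQL's SELECT ... FOR UPDATE or advisory locks. The table and column names are hypothetical.)

```python
# Hedged sketch: claim a row atomically so only one worker processes it.
# The UPDATE's WHERE clause matches only while the row is still unclaimed,
# so two workers racing on the same row can't both succeed.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, a TEXT, b TEXT)")
conn.execute("INSERT INTO tasks (a, b) VALUES ('new invoice', NULL)")
conn.commit()

def claim_task(conn, task_id):
    cur = conn.execute(
        "UPDATE tasks SET b = 'in_progress' WHERE id = ? AND b IS NULL",
        (task_id,),
    )
    conn.commit()
    return cur.rowcount == 1  # True only for the worker that won the race

first = claim_task(conn, 1)   # claims the row
second = claim_task(conn, 1)  # same row, already claimed, so this fails
print(first, second)
```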

