
Comments (15)

thentgesMindee commented on June 16, 2024

@RohitPShetty

Oh ok, thanks for the insight @laurentS. So, in cases where multiple container instances of my service (internally using FastAPI) are spawned, these limits based on client_id will depend on which container the requests reach?

Exactly, if you want to share those limits you have to use some kind of persistent storage such as Redis for example.

Also, just curious: how do I write a limiter using SlowAPI that imposes a blanket limit of 50 requests per minute on an endpoint like "/query"? I'm unclear on what key_func should return, as I'm not really imposing limits based on a key such as the remote address or a client ID.

global_limiter = Limiter(key_func=lambda ?????)

Returning any string from key_func would work, as long as key_func returns the exact same string for each call.

from slowapi.

laurentS commented on June 16, 2024

You can absolutely use memory storage, but you need to be aware of the limitations:

  • if you have multiple FastAPI processes (often the case in production), each one will use a separate store, which could lead to the problem you're seeing.
  • the stored data only lives as long as the process owning the memory storage, so when you restart your server the limits will be reset, which could be a problem depending on your use case.

If the above is not acceptable for your needs, you probably want to switch to something like Redis instead.
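As a sketch, slowapi can be pointed at Redis via the `storage_uri` constructor argument (the URI below assumes a local Redis on the default port, and a Redis client package must be installed since slowapi delegates storage to the limits library):

```python
from slowapi import Limiter
from slowapi.util import get_remote_address

# Counters live in Redis, so they are shared across workers/containers
# and survive application restarts.
limiter = Limiter(
    key_func=get_remote_address,
    storage_uri="redis://localhost:6379",
)
```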


laurentS commented on June 16, 2024

@RohitPShetty I suspect your code has a small bug which we missed:

client_id_limiter = Limiter(
    key_func=lambda request: request.headers.get("client-uuid"),
)

global_limiter = Limiter(
    key_func=lambda request: request.url.path,
)

# global_limiter is never attached to the app
app.state.limiter = client_id_limiter
# instead use a single `Limiter`
app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)

@router.post("/query")
@client_id_limiter.limit("3/minute")
@global_limiter.limit("5/minute")
async def user_query(request: Request):
    ...

The tests actually have an example of exactly what you want to do:

def test_multiple_decorators(self, build_fastapi_app):
    limiter = Limiter()

    @router.post("/query")
    @limiter.limit("3/minute", key_func=lambda request: request.headers.get("client-uuid"))
    @limiter.limit("5/minute", key_func=lambda: "shared_by_everyone")
    async def user_query(request: Request):
        ...


laurentS commented on June 16, 2024

Hi @RohitPShetty, I don't have time to dig into this right now, but have you tried looking into the storage backend to see what keys are in there? It might give you some hints as to what is going wrong. You could also log what goes into hit here:

if not self.limiter.hit(lim.limit, *args, cost=cost):

What storage backend do you use?


RohitPShetty commented on June 16, 2024

Hi @laurentS, I currently do not use any explicit storage backend. I read that slowapi uses in-memory storage by default. Is that true, or do I need to explicitly configure a storage backend?


RohitPShetty commented on June 16, 2024

Oh ok, thanks for the insight @laurentS. So, in cases where multiple container instances of my service (internally using FastAPI) are spawned, these limits based on client_id will depend on which container the requests reach?

Also, just curious: how do I write a limiter using SlowAPI that imposes a blanket limit of 50 requests per minute on an endpoint like "/query"? I'm unclear on what key_func should return, as I'm not really imposing limits based on a key such as the remote address or a client ID.

global_limiter = Limiter(key_func=lambda ?????)


RohitPShetty commented on June 16, 2024

@thentgesMindee @laurentS Thank you for the explanation.

Going back to my original question, how can we achieve something like this even though we are using in-memory storage?

Adding multiple limiters on a single endpoint "/query".

The 1st limiter will utilise a client ID extracted from the request and impose a restriction of 3 requests per minute per client on the "/query" endpoint.

The 2nd limiter will be a global limiter which imposes a restriction of 5 requests per minute, irrespective of the client, on the "/query" endpoint.

The ideal behaviour should be that when I hit the "/query" endpoint with 5 requests each from 2 different clients, 3 requests from one client and 2 requests from the other should be accepted. However, in reality 3 requests from each client are accepted, which means the global limit is not working.


thentgesMindee commented on June 16, 2024

The ideal behaviour should be that when I hit the "/query" endpoint with 5 requests each from 2 different clients, 3 requests from one client and 2 requests from the other should be accepted. However, in reality 3 requests from each client are accepted, which means the global limit is not working.

It should work already.
Since you're using in-memory storage, you should try again with a single worker of your app running.
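For example, assuming the app is served with uvicorn (the module path app.main:app is a placeholder for your own):

```shell
# Run a single worker so all requests share the same in-memory counters
uvicorn app.main:app --workers 1 --port 8000
```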


RohitPShetty commented on June 16, 2024

Yes, I have verified that I am running a single worker of my app, but the global limit still doesn't work alongside the client_id limit.


thentgesMindee commented on June 16, 2024

Can you paste the code you use for testing?


RohitPShetty commented on June 16, 2024

Using a simple curl command with a for loop across 2 terminals, with a slight difference in client-uuid, to simulate 2 clients:

for i in {1..7}; do
  curl -X POST \
    -H "Authorization: Bearer your_access_token" \
    -H "client-uuid: 3fa85f64-5717-4562-b3fc-2c963f66afa6" \
    -H "Content-Type: application/json" \
    -d '{"query": "Valid user query"}' \
    http://localhost:8000/user-query &
done


thentgesMindee commented on June 16, 2024

Good catch @laurentS
Another option, if you really need two limiters defined differently (for example, if you want different storage configurations), could be to use the global_limiter without any decorator, relying on its default limit.
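A sketch of that option, assuming slowapi's `default_limits` constructor argument (the limit values and key string below are illustrative):

```python
from slowapi import Limiter

# The default limit applies to every route handled by this limiter,
# without needing a @global_limiter.limit(...) decorator on each endpoint.
global_limiter = Limiter(
    key_func=lambda *_: "shared_by_everyone",
    default_limits=["5/minute"],
)
```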


RohitPShetty commented on June 16, 2024

@laurentS Thanks for looking into this. I had also tried attaching the global_limiter to the app, but had not posted that code here. Even that hadn't worked.

I am currently trying out the single-limiter approach you mentioned, and I have a doubt about the creation of the limiter in the code you posted above:

limiter = Limiter()

I keep getting a missing key_func error in __init__ for this, as below:

TypeError: Limiter.__init__() missing 1 required positional argument: 'key_func'

These are my imports:

from slowapi import Limiter, _rate_limit_exceeded_handler
from slowapi.errors import RateLimitExceeded

@laurentS @thentgesMindee Could you pls help me with this bit?


RohitPShetty commented on June 16, 2024

@laurentS @thentgesMindee Any pointers on tackling the above error while creating the limiter?


laurentS commented on June 16, 2024

@RohitPShetty this is a Python error, not something specific to slowapi. Python is telling you that an argument (key_func) is missing in your call to Limiter(). You can just add one here, which will act as the default value for key_func. If you keep your code as we discussed above, specifying key_func for each .limit(), then the default value does not really matter; you can pass anything.

You could use the default value for your shared "global limiter" and do something like:

limiter = Limiter(key_func=lambda: "shared_by_everyone")

@router.post("/query")
@limiter.limit("3/minute", key_func=lambda request: request.headers.get("client-uuid"))
@limiter.limit("5/minute")  # omit key_func here and the default will be used
async def user_query(request: Request):
    ...

Closing this issue as resolved now.
