
fastapi-redis-cache's Introduction

a-luna's github stats

👨🏽‍💻 What Type of Code Did I Write This Week?

From: 02 August 2024 - To: 09 August 2024

Python   6 hrs 26 mins   ███████████████████████▒░   93.16 %
Bash     26 mins         █▓░░░░░░░░░░░░░░░░░░░░░░░   06.31 %
JSON     1 min           ░░░░░░░░░░░░░░░░░░░░░░░░░   00.35 %
CSV      0 secs          ░░░░░░░░░░░░░░░░░░░░░░░░░   00.13 %
Text     0 secs          ░░░░░░░░░░░░░░░░░░░░░░░░░   00.04 %

Coding metrics are powered by Wakatime

  • Add Search to Your Static Site with Lunr.js (Hugo, Vanilla JS) Jun 30 2020 - I decided to document how I implemented a search feature with Hugo and Lunr.js on my personal blog site. Since this is a static site, the search functionality is performed entirely within the client's browser. My solution uses vanilla JS DOM manipulation to render the search results. I believe that my approach includes features that are markedly different from the implementations I encountered while researching this task, features which enhance the overall search UX.

  • An Introduction to Decorators in Python Feb 27 2020 - Decorators can be a daunting topic when first encountered. While the Zen of Python states "There should be one-- and preferably only one --obvious way to do it", there are many, equally valid ways to implement the same decorator. These different methods can be categorized as either function-based, class-based, or a hybrid of both. In this post I will explain the design and behavior of Python decorators and provide examples of decorators that I frequently use in my own code.

  • Hugo: Add Copy-to-Clipboard Button to Code Blocks with Vanilla JS Nov 13 2019 - Hugo includes a built-in syntax highlighter called Chroma. Chroma is extremely fast since it is written in pure Go (like Hugo) and supports every language I can think of. Chroma's speed is especially important since syntax highlighters are notorious for causing slow page loads. However, it lacks one vital feature: an easy way to copy a code block to the clipboard. I decided to document my implementation using only vanilla JS, since every blog post I found for this issue relied on jQuery to parse the DOM, which is completely unnecessary at this point.

fastapi-redis-cache's People

Contributors

a-luna


fastapi-redis-cache's Issues

add support for if-modified-since directive

the project which led me to create this plugin doesn't serve any assets (e.g., files) that make sense to validate with the last-mod time, since the response data in my project is always retrieved from a database.

However, it would be pretty silly to assume that all websites are the same as mine, so supporting the if-modified-since directive is definitely required.

cache-control: no-cache behavior is wrong

per MDN, the HTTP 1.1 spec, and every other authoritative resource on this subject, the cache-control no-cache directive is NOT used to indicate that a response should not be cached. That is what no-store is for; no-cache means a stored response must be revalidated with the origin before it is reused.

This is rather unintuitive, but I probably should have known this before I went and built a caching plugin for FastAPI. I need to figure out exactly how to fix this, which will probably reveal even more places where I have violated the most basic of caching rules.
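
For my own reference while fixing this, here is a minimal sketch (not the plugin's current code) of how a request's Cache-Control directives should be interpreted per RFC 7234:

def parse_cache_control(header_value: str) -> set[str]:
    """Split a Cache-Control header into its individual directives."""
    return {d.strip().lower() for d in header_value.split(",") if d.strip()}

directives = parse_cache_control("no-cache, max-age=0")

# "no-cache": a stored response MAY be reused, but only after it has been
# revalidated with the origin (e.g., via If-None-Match / If-Modified-Since).
revalidate_before_use = "no-cache" in directives

# "no-store": the response must not be written to the cache at all; this is
# the directive that actually forbids caching.
skip_storage = "no-store" in directives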

A Rebuild, Maybe?!

Feedback on Your Package

I came across your package on the internet and was initially excited, thinking it was going to be great: a new kind of feature that hadn't been released yet.

However, my enthusiasm turned into disappointment when I realized it didn't meet my requirements. I wondered why it wasn't working, so I checked your source.

To me it looked like a simple caching system that just caches a piece of data. I remembered something someone said on a blog about choosing between React and Svelte: why not build a package that, in the future, someone will look at, say "wow, this is so bad", and build a better one from it.

I have to ask: why did you use init instead of __init__, which forces an extra call?

I just wanted to say that I rebuilt your package and it's better now.

Cyrus-Kit on PyPI

Of course, it still has some bugs, but I hope someone finds this in the future and makes a better version of it XD

add async support for redis

Since redis-py supports async natively, is there any plan to add async support? That would integrate even more powerfully with FastAPI.
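
For reference, a minimal sketch of redis-py's native asyncio client (available since redis-py 4.2); this is only an illustration of the client itself, not something fastapi-redis-cache currently uses:

import asyncio

from redis import asyncio as aioredis

async def main():
    client = aioredis.Redis.from_url("redis://127.0.0.1:6379")
    await client.set("greeting", "hello", ex=30)  # key expires after 30 seconds
    print(await client.get("greeting"))
    await client.close()

asyncio.run(main())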

Add support for authentication to a remote redis

Hello,

I am trying to connect my FastAPI application to a Redis server instance hosted on Redis Labs, and it requires me to provide my credentials (username and password) before I can connect to the server. Unfortunately, I can't find a way to provide my credentials when instantiating the FastApiRedisCache() class.

Is there a workaround for this?
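
One possible workaround (an assumption on my part: that FastApiRedisCache passes host_url straight through to the Redis client) is to encode the credentials in the Redis URL itself; the hostname below is just a placeholder:

import os

from fastapi_redis_cache import FastApiRedisCache

# redis:// URLs accept "username:password@host:port". Redis Labs instances
# typically use the "default" username unless ACL users are configured.
REDIS_URL = "redis://default:{pw}@my-instance.redislabs.example.com:12345".format(
    pw=os.environ["REDIS_PASSWORD"]
)

redis_cache = FastApiRedisCache()
redis_cache.init(host_url=REDIS_URL, prefix="myapi-cache")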

@cache Decorator Not Working

Hi Team,
I have followed the steps given in https://pypi.org/project/fastapi-redis-cache/.

I have added the following methods in main.py and my observations are as follows (please note I am using the "Try it out" option of the FastAPI docs to test the endpoint). Please help to resolve this issue. I am using Python 3.11.
Observation:

INFO: 127.0.0.1:62749 - "GET /dynamic_data HTTP/1.1" 200 OK
INFO:fastapi_redis_cache.client: 05/18/2023 06:53:13 PM | KEY_ADDED_TO_CACHE: key=myapi-cache:main.get_dynamic_data()

Response Header:

cache-control: max-age=30
content-length: 72
content-type: application/json
date: Thu,18 May 2023 10:53:13 GMT
etag: W/2847227004069749289
expires: Thu,18 May 2023 10:53:43 GMT
server: uvicorn
x-myapi-cache: Miss

Note:

When the endpoint is executed a second time, the value of x-myapi-cache should be Hit, as shown below, but that is not happening:

x-myapi-cache: Hit

{
"message": "this data should only be cached temporarily",
"success": true
}

Method:

@app.get("/dynamic_data")
@cache(expire=30)
def get_dynamic_data(request: Request, response: Response):
    """Will be cached for thirty seconds"""
    return {"success": True, "message": "this data should only be cached temporarily"}

main.py

import os

from fastapi import FastAPI, Request, Response
from fastapi_redis_cache import FastApiRedisCache, cache

LOCAL_REDIS_URL = "redis://143.42.77.29:6379"

app = FastAPI(title="FastAPI Redis Cache Example")

@app.on_event("startup")
def startup():
    redis_cache = FastApiRedisCache()
    redis_cache.init(
        host_url=os.environ.get("REDIS_URL", LOCAL_REDIS_URL),
        prefix="myapi-cache",
        response_header="X-MyAPI-Cache",
        ignore_arg_types=[Request, Response]
    )

@app.get("/dynamic_data")
@cache(expire=30)
def get_dynamic_data(request: Request, response: Response):
    """Will be cached for thirty seconds"""
    return {"success": True, "message": "this data should only be cached temporarily"}

Use cache() decorator outside of fastapi

Hi, I'm trying to use the @cache() decorator in a worker like Celery/Dramatiq.
Is there a way to use the decorator outside of FastAPI?

@cache()
def long_function_query(word):
     pass # do long running stuff.
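
If the fastapi-redis-cache decorator turns out to be tied to FastAPI's request/response cycle, a fallback (plainly a different technique, sketched under my own assumptions) would be a small Redis-backed memoization decorator for worker tasks:

import functools
import json

from redis import Redis

redis_client = Redis.from_url("redis://127.0.0.1:6379")

def redis_memoize(expire: int = 300):
    """Cache a function's JSON-serializable return value in Redis."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Build a cache key from the function name and its arguments.
            key = f"worker-cache:{func.__name__}:{json.dumps([args, kwargs], sort_keys=True, default=str)}"
            cached = redis_client.get(key)
            if cached is not None:
                return json.loads(cached)
            result = func(*args, **kwargs)
            redis_client.set(key, json.dumps(result), ex=expire)
            return result
        return wrapper
    return decorator

@redis_memoize(expire=600)
def long_function_query(word):
    pass  # do long running stuff.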

Feature request - async Redis client

Hi,
Thanks for the great library.

I was looking at the code and saw that it supports caching async endpoints, but it does not yet support the use of an async Redis client.

redis-py (https://github.com/redis/redis-py/) has supported async connections to Redis since v4.2.x, so it should be possible to take advantage of that and add it as an option alongside the current sync Redis client.
I believe this would be a great addition and would make this library perfect.

P.S.: If you want, I could make a PR for it once I have the time.

handling dirty cache entries

Hi,

Do you have any recommendation for how to handle cache entries that may have become dirty?
E.g., if you have a PUT endpoint modifying a resource that may be in my cache, I guess the caching mechanism in fastapi-redis-cache's code will not become aware by pure magic that the cache entry has become dirty. Do I have to handle (update or delete) the cache entry explicitly within the PUT code? This doesn't look very elegant to me, compared to the clean way the rest of the caching is hidden from the developer.

cheers
j.
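
For what it's worth, this is the kind of explicit invalidation I have in mind: a minimal sketch, assuming direct access to the same Redis instance and that the key format matches what the plugin logs (e.g. myapi-cache:main.get_dynamic_data()); the GET endpoint name below is hypothetical and none of this is a documented fastapi-redis-cache API.

from fastapi import FastAPI
from redis import Redis

app = FastAPI()
redis_client = Redis.from_url("redis://127.0.0.1:6379")
items = {}  # stand-in for the real data store

@app.put("/items/{item_id}")
async def update_item(item_id: int, payload: dict):
    items[item_id] = payload
    # Delete the key the plugin would have created for the corresponding
    # (hypothetical) GET endpoint, so the next read repopulates it with fresh data.
    redis_client.delete(f"myapi-cache:main.get_item(id={item_id})")
    return {"updated": item_id}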

ttl expired -> request contains eTag/last-mod -> current behavior is wrong

when a browser sends a request where the time indicated by the value of the max-age directive or expires header is in the past, the correct behavior is to:

  • revalidate the client's copy
  • if the ETag has changed or the last-mod time is different from the client's copy:
    • send a 200 response, including the new version of the content
  • if the ETag has not changed or the last-mod time is the same as the client's copy:
    • send a 304 response with no body content, and header values that indicate the max-age/expires time

currently, when the ttl has elapsed and a request is received for the expired data, no revalidation is performed and a 200 response is always sent that includes the entire response data.
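
A minimal sketch of the revalidation flow described above (my own illustration, not the plugin's actual code), handling only the ETag case for brevity:

from fastapi import Request, Response

def respond_with_revalidation(request: Request, etag: str, body: bytes, max_age: int) -> Response:
    """Return 304 if the client's copy is still current, otherwise 200 with the new body."""
    headers = {"etag": etag, "cache-control": f"max-age={max_age}"}
    if request.headers.get("if-none-match") == etag:
        # Client already holds the current version: no body, just refreshed caching headers.
        return Response(status_code=304, headers=headers)
    # ETag differs (or was not sent at all): return the full, up-to-date representation.
    return Response(content=body, status_code=200, headers=headers)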

Can't Cache List of serialized data

Hi
thank you for this great module. I just have an issue: I can't cache a list of serialized dict data. I am using Tortoise ORM with a Pydantic model. My query looks like this:
@app.get("/forcasts")
@cache(expire=120)
async def forcasts():
    today = datetime.today().strftime("%Y-%m-%d")
    return await Forcast_pydantic.from_queryset(Forcast.filter(date=today))

and I get this:
INFO:fastapi_redis_cache.client: 12/12/2022 07:13:18 PM | FAILED_TO_CACHE_KEY: Object of type <class 'list'> is not JSON-serializable: key=myapi-cache:main.forcasts()

My response that is not being cached correctly:
[ { "id": 1, "date": "2022-12-12", "company_ar": "ุงู„ุฑูŠุงุถ", "company_en": "RIBL", "code": 1010, "market_cap": 931500, "negative_positive": "negative", "duration": 5, "profit": null, "capital_protection": "-10.2601156069375" }, { "id": 2, "date": "2022-12-12", "company_ar": "ุงู„ุฌุฒูŠุฑุฉ", "company_en": "BJAZ", "code": 1020, "market_cap": 15957, "negative_positive": "negative", "duration": 7, "profit": null, "capital_protection": "-15.9395248380128" }]

Can you help me with that?
Thanks
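
A possible workaround (just an assumption, reusing the names from my snippet above): run the queryset through FastAPI's jsonable_encoder so the cache only ever sees plain lists and dicts:

from fastapi.encoders import jsonable_encoder

@app.get("/forcasts")
@cache(expire=120)
async def forcasts():
    today = datetime.today().strftime("%Y-%m-%d")
    queryset = await Forcast_pydantic.from_queryset(Forcast.filter(date=today))
    # jsonable_encoder converts the pydantic models into JSON-compatible
    # primitives, which the cache can serialize with json.dumps.
    return jsonable_encoder(queryset)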

"Too much data for declared Content-Length" error when activating caching

The same exact code is raising exceptions when using the @cache decorator:

import logging

from fastapi import FastAPI, Request, Response
from fastapi_redis_cache import FastApiRedisCache, cache

import base_server_info
import prometheus
import settings

logger = logging.getLogger(__name__)


app = FastAPI(title="Base Server Info", version="0.1.0")


@app.on_event("startup")
def startup_event():
    logger.info("starting up")
    redis_cache = FastApiRedisCache()
    redis_cache.init(
        host_url=f"redis://{settings.REDIS_HOST}:{settings.REDIS_PORT}",
        prefix="base-server-info",
        response_header="X-base-server-info-Cache",
        ignore_arg_types=[Request, Response],
    )


@app.get("/servers")
@cache(expire=60)
async def get_servers():
    logger.info("getting servers")

    servers = base_server_info.get_base_servers_list()
    k8sNodes = base_server_info.get_kubernetes_nodes()

    all = base_server_info.consolidate_servers_info(servers, k8sNodes)

    return_value = prometheus.prepare_for_prometheus(all)

    return return_value

This code raises the exception: "Too much data for declared Content-Length"

The same code runs well when I remove the @cache decorator (no surprise here), and it also works well when the Redis server is unavailable (which forces the caching mechanism to be bypassed so the endpoint code runs normally).

Thanks in advance for any insight...

The stack trace is as below:

poc_prometheus_http_service_discovery-app-1    | WARNING:  StatReload detected changes in 'main.py'. Reloading...
poc_prometheus_http_service_discovery-app-1    | INFO:     Shutting down
poc_prometheus_http_service_discovery-app-1    | INFO:     Waiting for application shutdown.
poc_prometheus_http_service_discovery-app-1    | INFO:     Application shutdown complete.
poc_prometheus_http_service_discovery-app-1    | INFO:     Finished server process [8]
poc_prometheus_http_service_discovery-app-1    | INFO:     Started server process [9]
poc_prometheus_http_service_discovery-app-1    | INFO:     Waiting for application startup.
poc_prometheus_http_service_discovery-app-1    | INFO:fastapi_redis_cache.client: 10/25/2022 10:53:57 AM | CONNECT_BEGIN: Attempting to connect to Redis server...
poc_prometheus_http_service_discovery-app-1    | INFO:fastapi_redis_cache.client: 10/25/2022 10:53:57 AM | CONNECT_SUCCESS: Redis client is connected to server.
poc_prometheus_http_service_discovery-app-1    | INFO:     Application startup complete.
poc_prometheus_http_service_discovery-app-1    | INFO:     172.19.0.1:42710 - "GET /servers HTTP/1.1" 200 OK
poc_prometheus_http_service_discovery-app-1    | WARNING:  StatReload detected changes in 'main.py'. Reloading...
poc_prometheus_http_service_discovery-app-1    | INFO:     Shutting down
poc_prometheus_http_service_discovery-app-1    | INFO:     Waiting for application shutdown.
poc_prometheus_http_service_discovery-app-1    | INFO:     Application shutdown complete.
poc_prometheus_http_service_discovery-app-1    | INFO:     Finished server process [9]
poc_prometheus_http_service_discovery-app-1    | INFO:     Started server process [19]
poc_prometheus_http_service_discovery-app-1    | INFO:     Waiting for application startup.
poc_prometheus_http_service_discovery-app-1    | INFO:fastapi_redis_cache.client: 10/25/2022 10:57:05 AM | CONNECT_BEGIN: Attempting to connect to Redis server...
poc_prometheus_http_service_discovery-app-1    | INFO:fastapi_redis_cache.client: 10/25/2022 10:57:05 AM | CONNECT_SUCCESS: Redis client is connected to server.
poc_prometheus_http_service_discovery-app-1    | INFO:     Application startup complete.
poc_prometheus_http_service_discovery-app-1    | INFO:fastapi_redis_cache.client: 10/25/2022 10:57:11 AM | KEY_ADDED_TO_CACHE: key=base-server-info:main.get_servers()
poc_prometheus_http_service_discovery-app-1    | INFO:     172.19.0.1:54082 - "GET /servers HTTP/1.1" 200 OK
poc_prometheus_http_service_discovery-app-1    | ERROR:    Exception in ASGI application
poc_prometheus_http_service_discovery-app-1    | Traceback (most recent call last):
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 407, in run_asgi
poc_prometheus_http_service_discovery-app-1    |     result = await app(  # type: ignore[func-returns-value]
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
poc_prometheus_http_service_discovery-app-1    |     return await self.app(scope, receive, send)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/fastapi/applications.py", line 270, in __call__
poc_prometheus_http_service_discovery-app-1    |     await super().__call__(scope, receive, send)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/starlette/applications.py", line 124, in __call__
poc_prometheus_http_service_discovery-app-1    |     await self.middleware_stack(scope, receive, send)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
poc_prometheus_http_service_discovery-app-1    |     raise exc
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
poc_prometheus_http_service_discovery-app-1    |     await self.app(scope, receive, _send)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 75, in __call__
poc_prometheus_http_service_discovery-app-1    |     raise exc
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 64, in __call__
poc_prometheus_http_service_discovery-app-1    |     await self.app(scope, receive, sender)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
poc_prometheus_http_service_discovery-app-1    |     raise e
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
poc_prometheus_http_service_discovery-app-1    |     await self.app(scope, receive, send)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 680, in __call__
poc_prometheus_http_service_discovery-app-1    |     await route.handle(scope, receive, send)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 275, in handle
poc_prometheus_http_service_discovery-app-1    |     await self.app(scope, receive, send)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 68, in app
poc_prometheus_http_service_discovery-app-1    |     await response(scope, receive, send)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/starlette/responses.py", line 167, in __call__
poc_prometheus_http_service_discovery-app-1    |     await send({"type": "http.response.body", "body": self.body})
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 61, in sender
poc_prometheus_http_service_discovery-app-1    |     await send(message)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 159, in _send
poc_prometheus_http_service_discovery-app-1    |     await send(message)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 512, in send
poc_prometheus_http_service_discovery-app-1    |     output = self.conn.send(event)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/h11/_connection.py", line 512, in send
poc_prometheus_http_service_discovery-app-1    |     data_list = self.send_with_data_passthrough(event)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/h11/_connection.py", line 545, in send_with_data_passthrough
poc_prometheus_http_service_discovery-app-1    |     writer(event, data_list.append)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/h11/_writers.py", line 65, in __call__
poc_prometheus_http_service_discovery-app-1    |     self.send_data(event.data, write)
poc_prometheus_http_service_discovery-app-1    |   File "/usr/local/lib/python3.10/site-packages/h11/_writers.py", line 91, in send_data
poc_prometheus_http_service_discovery-app-1    |     raise LocalProtocolError("Too much data for declared Content-Length")
poc_prometheus_http_service_discovery-app-1    | h11._util.LocalProtocolError: Too much data for declared Content-Length

Nothing being cached

Description

Sorry for the very apt title. I'm working on this codebase: https://github.com/joeflack4/ccdh-terminology-service/tree/feature_cache

I have an app.py file where I'm importing FastApiRedisCache() and setting it up in startup().

I'm adding @cache() to several routers, one of which is in models.py. I'm using this endpoint as a test case to make sure things are working.

There's not much there at the moment, but it returns something. Here's what you can get from that endpoint as seen on our production server:
https://terminology.ccdh.io/models/
https://terminology.ccdh.io/docs#/CRDC-H%20and%20CRDC%20Node%20Models/get_models

However when I check server logs or the Redis monitor, I'm not seeing anything being cached at this or any other endpoint.

To see all the changes I've made to my codebase to implement this feature, perhaps looking at this diff in my draft pull request might help: https://github.com/cancerDHC/ccdh-terminology-service/pull/53/files

Note that this setup uses docker.

Where I've checked

1. Redis monitor

docker exec -it docker_ccdh-redis_1 sh

I don't see anything coming up as I'm checking my routes on localhost.

/data # redis-cli FLUSHALL
OK
/data # redis-cli MONITOR
OK

2. Server logs

I can't see any of the tell-tale signs of caching as per the fastapi-redis-cache documentation. I do see that the endpoints are getting hit. But they're not caching.

INFO: Application startup complete.
INFO:uvicorn.error:Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
INFO:uvicorn.error:Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)

INFO: 172.19.0.1:57602 - "GET / HTTP/1.1" 307 Temporary Redirect
INFO: 172.19.0.1:57602 - "GET /docs HTTP/1.1" 200 OK
INFO: 172.19.0.1:57602 - "GET /openapi.json HTTP/1.1" 200 OK
INFO: 172.19.0.1:57864 - "GET /models/ HTTP/1.1" 200 OK
INFO: 172.19.0.1:57864 - "GET /models/ HTTP/1.1" 200 OK
INFO: 172.19.0.1:57880 - "GET /conceptreferences?key=uri&value=1&modifier=equals HTTP/1.1" 404 Not Found
INFO: 172.19.0.1:57880 - "GET /conceptreferences?key=uri&value=1&modifier=equals HTTP/1.1" 404 Not Found
INFO: 172.19.0.1:57890 - "GET / HTTP/1.1" 307 Temporary Redirect
INFO: 172.19.0.1:57890 - "GET /docs HTTP/1.1" 200 OK
INFO: 172.19.0.1:57896 - "GET /models/ HTTP/1.1" 200 OK
INFO: 172.19.0.1:57940 - "GET / HTTP/1.1" 307 Temporary Redirect
INFO: 172.19.0.1:57940 - "GET /docs HTTP/1.1" 200 OK
INFO: 172.19.0.1:57940 - "GET /openapi.json HTTP/1.1" 200 OK
INFO: 172.19.0.1:57944 - "GET /models HTTP/1.1" 307 Temporary Redirect
INFO: 172.19.0.1:57944 - "GET /models/ HTTP/1.1" 200 OK
INFO: 172.19.0.1:57944 - "GET /favicon.ico HTTP/1.1" 404 Not Found
INFO: 172.19.0.1:57948 - "GET /models/GDC HTTP/1.1" 200 OK
INFO: 172.19.0.1:57948 - "GET / HTTP/1.1" 307 Temporary Redirect
INFO: 172.19.0.1:57948 - "GET /docs HTTP/1.1" 200 OK
INFO: 172.19.0.1:57948 - "GET /openapi.json HTTP/1.1" 200 OK
INFO: 172.19.0.1:57954 - "GET /models/GDC/entities HTTP/1.1" 200 OK

What I've tried

I tried checking that the 'redis_url' is actually getting set.
In app.py, I set up FastApiRedisCache() as follows:

    redis_cache = FastApiRedisCache()
    redis_cache.init(host_url=get_settings().redis_url)

I then entered the docker container and ran python and just checked to make sure that get_settings().redis_url existed and was correct.

# ls
Pipfile  Pipfile.lock  README.md  ccdh	crdc-nodes  data  docker  docs	env  output  pytest.ini  tests
# python
Python 3.8.11 (default, Jul 22 2021, 15:32:17)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from ccdh.config import get_settings
>>> get_settings()
Settings(app_name='TCCM API', neo4j_username='neo4j', neo4j_password='nFDqqgkNqzt', neo4j_host='ccdh-neo4j', neo4j_bolt_port='7687', redis_url='redis://127.0.0.1:6379', ccdhmodel_branch='main')

Possible solutions

This actually may be a problem on my docker config end. I'm investigating that now, trying out at least these 3 things:

  1. Make sure my redis url is correct, given docker setup. (currently FastAPIRedisCache says it can't connect!)
  2. Add 'link' to redis container
  3. Add depends_on health check to redis container

I'm not sure what else to try. Maybe there is something wrong with my setup in my Python code? Maybe something wrong with my docker setup?
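
One thing I'm now double-checking, since the settings above show redis_url='redis://127.0.0.1:6379': inside the app container, 127.0.0.1 is the container itself, not the Redis container, so that URL can never reach Redis. The URL should point at the compose service name instead (the name below is my guess based on the docker_ccdh-redis_1 container name):

# Hypothetical fix in the env/settings consumed by get_settings():
redis_url = "redis://ccdh-redis:6379"  # compose service name instead of 127.0.0.1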

How can I cache POST requests?

Hi, thank you for the library. Is there any way I can cache POST requests as well? My use case is that we do not change any resource per se on the server, but we use POST so the request body can carry some parameters. There is a high chance of repeated requests with the same combination. Can you please suggest how I can proceed with this using your library?

I also don't mind caching manually if the above solution is not possible. If manual caching is possible with your library, please let me know how I can do that.
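
In case the decorator really is GET-only, here is the sort of manual caching I had in mind, sketched with plain redis-py rather than this library (all names below are my own placeholders):

import hashlib
import json

from fastapi import FastAPI
from redis import Redis

app = FastAPI()
redis_client = Redis.from_url("redis://127.0.0.1:6379")

def run_expensive_search(params: dict) -> dict:
    """Placeholder for the real query logic."""
    return {"query": params, "results": []}

@app.post("/search")
async def search(params: dict):
    # Key the cache on a hash of the (order-insensitive) request body.
    key = "post-cache:search:" + hashlib.sha256(
        json.dumps(params, sort_keys=True).encode()
    ).hexdigest()
    cached = redis_client.get(key)
    if cached is not None:
        return json.loads(cached)
    result = run_expensive_search(params)
    redis_client.set(key, json.dumps(result), ex=300)  # cache for 5 minutes
    return result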

Object of type <class '...'> is not JSON-serializable

Hello!

Just tried to use your package to cache frequent GET requests in my FastAPI app.

Endpoint looks like this:

@router.get("/{id}", response_model=Malt)
@cache_one_hour()
def get_malt_by_id(*, db: Session = Depends(get_db), id: UUID4):
    # actions returns a SQLAlchemy 'Malt' model which is converted to the Pydantic 'Malt'
    # schema after that; also tried returning Malt.from_orm(actions...) - same result
    return actions.malt.get(db=db, id=id)

When I try to make requests I can see following in the logs:

FAILED_TO_CACHE_KEY: Object of type <class '....db.models.dictionaries.malt.Malt'> is not JSON-serializable: key=cache:mypkg.api.routes.dictionaries.malts.get_malt_by_id(id=7af9090c-dcb5-4778-b090-5b6890618566)

Pydantic schema looks like following (basically it is much more complex but I removed everything but name and still get same result.

class Malt(BaseModel):
    name: str

    class Config:
        orm_mode = True

My example seems to be essentially the same as the example with the User model from your documentation. So why doesn't it work?

@cache decorator is getting skipped

Hello,

I am facing an error where the @cache decorator does not seem to be producing any output or writing any keys.

I started the Redis client according to the docs:

def startup():
    redis_cache = FastApiRedisCache()
    redis_cache.init(
        host_url = os.environ.get("REDIS_URL", LOCAL_REDIS_URL),
        prefix="multitenant_cache"
    )

and applied the decorator as follows:

@cache()
def GetNewOrder(request: Request):
    ...
    return Response(status_code=resp.status_code, content=resp.content, headers=cleanupRespHeaders(resp.headers))

When I check my server output, I see where redis_cache successfully connects to my Redis server, but I'm not getting any output that indicates that it's caching, and I'm not getting any new keys in my Redis database. Any idea if I'm missing some setup step or doing something else wrong here?

coroutine '<method that is cached>' was never awaited

Hey,
I am using your library and everything works fine for "normal" requests.

The problem arises when I want to call the cached method from within the same service.

My route:

@router.get('/get-fiat-currency-choices')
@cache()
def get_fiat_currency_choices():
    response_data = requests.get(f'{CURRENCY_EXCHANGE_API_URL}/currencies').json()

    choices = []
    for symbol, name in response_data.items():
        choices.append(
            {
                'symbol': symbol,
                'name': name,
            }
        )

    return choices

I am trying to call it with:

    @validator('currency')
    def validate_currency(cls, val):
        allowed_currencies = [currency['symbol'] for currency in get_fiat_currency_choices()]
        if val not in allowed_currencies:
            raise ValidationError('Currency is not supported.')
        return val

Error I am getting:

RuntimeWarning: coroutine 'get_fiat_currency_choices' was never awaited
2021-08-14T17:24:37.540579531Z   allowed_currencies = [currency['symbol'] for currency in get_fiat_currency_choices()]
2021-08-14T17:24:37.540585575Z RuntimeWarning: Enable tracemalloc to get the object allocation traceback

Would appreciate any help, thanks!

P.S. The same thing occurs when I use the similar library 'fastapi-cache2' (https://github.com/long2ice/fastapi-cache).
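
A workaround I'm considering (only a sketch under my own assumptions): keep the uncached logic in a plain helper and apply @cache() only to the route, so validators can call the helper directly without getting back a coroutine:

import requests
from fastapi import APIRouter
from fastapi_redis_cache import cache

CURRENCY_EXCHANGE_API_URL = "https://api.example.com"  # placeholder for the real URL
router = APIRouter()

def fetch_fiat_currency_choices():
    """Plain synchronous function: safe to call from validators or other services."""
    response_data = requests.get(f"{CURRENCY_EXCHANGE_API_URL}/currencies").json()
    return [{"symbol": symbol, "name": name} for symbol, name in response_data.items()]

@router.get("/get-fiat-currency-choices")
@cache()
def get_fiat_currency_choices():
    return fetch_fiat_currency_choices()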
