fastapi-production-boilerplate's People

Contributors

dependabot[bot] · dka58 · iam-abbas


fastapi-production-boilerplate's Issues

async Select error

register user

TypeError: object Select can't be used in 'await' expression

It's working if I remove await in this line:
query = await self._query(join_)

or if I change the _query function to async.
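The root cause can be illustrated without SQLAlchemy: a builder like _query only constructs a statement object, does no I/O, and so should stay synchronous; only the call that actually hits the database is awaited. A minimal stand-in (names mirror the boilerplate, but the classes are dummies, not its actual code):

```python
import asyncio

class Select:
    """Stand-in for SQLAlchemy's Select: a plain statement object, not awaitable."""
    def __init__(self, table):
        self.table = table

def _query(join_=None):
    # Building the statement does no I/O, so this stays synchronous.
    return Select("users")

async def get_all(execute, join_=None):
    query = _query(join_)            # no `await`: _query returns a Select, not a coroutine
    return await execute(query)      # only the database round-trip is awaited

async def fake_execute(stmt):        # dummy in place of session.execute
    return [f"row from {stmt.table}"]

rows = asyncio.run(get_all(fake_execute))
```

Making _query async also works, but then every caller must await it; keeping builders synchronous and awaiting only the execute step is the more common convention.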

What do you suggest?

Also, do you think the code and structure are still production-ready? What do you think of a monolithic folder structure like this:

https://github.com/zhanymkanov/fastapi-best-practices

Also, I'd love to know when to use Transactional and Propagation REQUIRED_NEW.

Celery Task Did Not Register

I made a small example to test out Celery workers. It should fetch universities for the country I query. However, it does not seem to register my task. Do you have any idea why that would be?

I created a test_celery.py in the tasks directory:

from typing import List, Optional

import httpx
from pydantic import BaseModel

from worker import celery_app

URL = 'http://universities.hipolabs.com/search'


class University(BaseModel):
    """Model representing a university."""

    country: Optional[str] = None  # Optional country name
    web_pages: List[str] = []  # List of web page URLs
    name: Optional[str] = None  # Optional university name
    alpha_two_code: Optional[str] = None  # Optional alpha two-code
    domains: List[str] = []  # List of domain names


def get_all_universities_for_country(country: str) -> dict:
    params = {'country': country}
    response = httpx.get(URL, params=params)
    universities = [University.parse_obj(u) for u in response.json()]
    return {country: universities}


@celery_app.task(name="get_all_universities_task", bind=True)
def get_all_universities_task(self, country):
    return get_all_universities_for_country(country)

And I created an API endpoint in the api/v1 directory. That endpoint triggers the task:

from celery.result import AsyncResult
from fastapi import APIRouter

from worker.tasks.test_celery import get_all_universities_task


test_celery_router = APIRouter()

@test_celery_router.get("/", status_code=200)
async def test_celery(country):
    task = get_all_universities_task.delay(country)
    return {"task_id": task.id, "task_status": task.status}

@test_celery_router.get("/status/{task_id}", status_code=200)
async def test_celery_status(task_id):
    async_result = AsyncResult(task_id)
    if async_result.ready():
        result_value = async_result.get()
        return {"task_id": task_id, "task_status": async_result.status, "task_result": result_value}
    else:
        return {"task_id": task_id, "task_status": async_result.status, "task_result": "Not ready yet!"}

When I ran make celery-worker, it seems it did not register the task "get_all_universities_task":

❯ make celery-worker
poetry run celery -A worker worker -l info


[email protected] v5.3.1 (emerald-rush)

macOS-10.16-x86_64-i386-64bit 2023-06-20 17:58:19

[config]
.> app:         worker:0x10cb337d0
.> transport:   amqp://guest:**@localhost:5672//
.> results:     redis://localhost:6379/0
.> concurrency: 10 (prefork)
.> task events: OFF (enable -E to monitor tasks in this worker)

[queues]
.> celery           exchange=celery(direct) key=celery


[tasks]


[2023-06-20 17:58:19,735: WARNING/MainProcess] /Users/quankhuc/anaconda3/envs/TradingBot/lib/python3.11/site-packages/celery/worker/consumer/consumer.py:498: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(

[2023-06-20 17:58:19,807: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2023-06-20 17:58:19,808: WARNING/MainProcess] /Users/quankhuc/anaconda3/envs/TradingBot/lib/python3.11/site-packages/celery/worker/consumer/consumer.py:498: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(

[2023-06-20 17:58:19,813: INFO/MainProcess] mingle: searching for neighbors
[2023-06-20 17:58:20,840: INFO/MainProcess] mingle: all alone
[2023-06-20 17:58:20,864: INFO/MainProcess] [email protected] ready.
[2023-06-20 17:58:21,142: INFO/MainProcess] Events of group {task} enabled by remote.
[2023-06-20 18:05:38,090: ERROR/MainProcess] Received unregistered task of type 'get_all_universities_task'.

The message has been ignored and discarded.

Did you remember to import the module containing this task?
Or maybe you're using relative imports?

Please see
https://docs.celeryq.dev/en/latest/internals/protocol.html
for more information.

The full contents of the message body was:
'[["vietnam"], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (86b)

The full contents of the message headers:
{'lang': 'py', 'task': 'get_all_universities_task', 'id': '18094bcf-9a99-4330-94b8-96acc928edca', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'group_index': None, 'retries': 0, 'timelimit': [None, None], 'root_id': '18094bcf-9a99-4330-94b8-96acc928edca', 'parent_id': None, 'argsrepr': "('vietnam',)", 'kwargsrepr': '{}', 'origin': '[email protected]', 'ignore_result': False, 'stamped_headers': None, 'stamps': {}}

The delivery info for this task is:
{'consumer_tag': 'None4', 'delivery_tag': 1, 'redelivered': False, 'exchange': '', 'routing_key': 'celery'}
Traceback (most recent call last):
  File "/Users/quankhuc/anaconda3/envs/TradingBot/lib/python3.11/site-packages/celery/worker/consumer/consumer.py", line 642, in on_task_received
    strategy = strategies[type_]
               ~~~~~~~~~~^^^^^^^
KeyError: 'get_all_universities_task'
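The empty [tasks] section in the startup log is the tell: a worker only registers tasks from modules it actually imports, and worker/tasks/test_celery.py was never imported. One common fix is to list the task module in the app's include (a sketch only; the broker/backend URLs and module paths below are assumptions based on the log above, so adjust to the project's actual layout):

```python
# worker/__init__.py -- sketch, not the boilerplate's actual file
from celery import Celery

celery_app = Celery(
    "worker",
    broker="amqp://guest:guest@localhost:5672//",
    backend="redis://localhost:6379/0",
    # Listing the task module here forces the import at worker startup,
    # so the task shows up under [tasks]:
    include=["worker.tasks.test_celery"],
)

# Alternative: let Celery scan listed packages for task modules.
# celery_app.autodiscover_tasks(["worker.tasks"])
```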

poetry install error

Poetry (version 1.4.1)
Python: 3.11.2
Implementation: CPython
Platform: linux
OS: posix
OS Distribution: Ubuntu 22.04.2 LTS

When running poetry install, some errors occurred:

Package operations: 50 installs, 1 update, 0 removals

• Installing parse (1.19.0): Failed

ChefBuildError

   from _ctypes import Union, Structure, Array

ModuleNotFoundError: No module named '_ctypes'

at ~/.pyenv/versions/3.11.2/lib/python3.11/site-packages/poetry/installation/chef.py:152 in _prepare
148│
149│ error = ChefBuildError("\n\n".join(message_parts))
150│
151│ if error is not None:
→ 152│ raise error from None
153│
154│ return path
155│
156│ def _prepare_sdist(self, archive: Path, destination: Path | None = None) -> Path:

Note: This error originates from the build backend, and is likely not a problem with poetry but with parse (1.19.0) not supporting PEP 517 builds. You can verify this by running 'pip wheel --use-pep517 "parse (==1.19.0)"'.

• Installing wrapt (1.14.1): Failed

ChefBuildError

Backend subprocess exited when trying to invoke build_wheel
at ~/.pyenv/versions/3.11.2/lib/python3.11/site-packages/poetry/installation/chef.py:152 in _prepare
148│
149│ error = ChefBuildError("\n\n".join(message_parts))
150│
151│ if error is not None:
→ 152│ raise error from None
153│
154│ return path
155│
156│ def _prepare_sdist(self, archive: Path, destination: Path | None = None) -> Path:

Note: This error originates from the build backend, and is likely not a problem with poetry but with wrapt (1.14.1) not supporting PEP 517 builds. You can verify this by running 'pip wheel --use-pep517 "wrapt (==1.14.1)"'.

Authentication Workflow

I am able to start the server and try all of the endpoints. I saw that /users and /tasks require authentication. How can I get a token to try out these two endpoints in Swagger? Right now it gives me an unauthorized response, because I don't have the token.

make run
poetry run python main.py
INFO:     Will watch for changes in these directories: ['/Users/quankhuc/FastAPI-Production-Boilerplate']
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [61146] using WatchFiles
INFO:     Started server process [61152]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     127.0.0.1:52430 - "GET /docs HTTP/1.1" 200 OK
INFO:     127.0.0.1:52430 - "GET /openapi.json HTTP/1.1" 200 OK
INFO:     127.0.0.1:52430 - "GET /v1/users/me HTTP/1.1" 401 Unauthorized
INFO:     127.0.0.1:52563 - "GET /v1/tasks/ HTTP/1.1" 401 Unauthorized
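For reference, the usual flow with this kind of boilerplate: register a user, log in to receive a JWT, then paste it into the Authorize dialog in Swagger, which simply attaches a Bearer header to each request. A sketch (the route paths and payload fields are assumptions; check the routers under api/v1):

```python
BASE = "http://127.0.0.1:8000/v1"

def bearer_header(token: str) -> dict:
    """Swagger's Authorize button attaches exactly this header."""
    return {"Authorization": f"Bearer {token}"}

def get_token(email: str, password: str) -> str:
    # Route paths and payload fields below are assumptions for illustration.
    import httpx  # already a dependency of the boilerplate
    with httpx.Client(base_url=BASE) as client:
        client.post("/users", json={"email": email, "password": password, "username": "demo"})
        resp = client.post("/users/login", json={"email": email, "password": password})
        resp.raise_for_status()
        return resp.json()["access_token"]

# token = get_token("demo@example.com", "secret")
# requests to /v1/users/me then succeed with headers=bearer_header(token)
```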

Authentication Middleware

Why don't we use the JWTHandler class in the authentication middleware to decode the received token, so we can also handle exceptions properly?
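Agreed that decoding in one place makes error handling cleaner. A rough sketch of the idea; JWTHandler's real API in the boilerplate may differ, so the decode() signature and the exception type here are assumptions:

```python
from typing import Optional, Tuple

class JWTDecodeError(Exception):
    """Stand-in for whatever the real JWTHandler raises."""

class JWTHandler:
    @staticmethod
    def decode(token: str) -> dict:
        # A real implementation would verify signature and expiry (e.g. via pyjwt).
        if token != "valid-token":
            raise JWTDecodeError("invalid or expired token")
        return {"user_id": 1}

def authenticate(authorization: Optional[str]) -> Tuple[Optional[int], Optional[str]]:
    """What the middleware could do: decode via JWTHandler, map failures to a 401."""
    if not authorization or not authorization.startswith("Bearer "):
        return None, None                # no credentials: anonymous, not an error
    try:
        payload = JWTHandler.decode(authorization.removeprefix("Bearer "))
        return payload["user_id"], None
    except JWTDecodeError as exc:
        return None, str(exc)            # surfaced as a clean 401 instead of a 500
```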

Is there some kind of trick in order to regenerate the OpenAPI schema?

Thanks for this repo, it is great.

I created a new model, a new Repository wrapping the model, a controller, all the different pieces, but the OpenAPI schema, and thus the /docs endpoint, only shows the models that already existed in the app.

I was wondering if you have any way to debug why the OpenAPI spec is not regenerating.

Thank you!

Question: What is the difference between the two Propagation types?

In the @Transactional annotation, the code supports two Propagation types: REQUIRED and REQUIRED_NEW. The only difference appears to be that REQUIRED_NEW starts a new session, whereas REQUIRED assumes one is already started. The commit log and documentation did not help me understand which one to use, and under what circumstances. Can you help?
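Not the maintainer, but the semantics mirror Spring's propagation: REQUIRED joins the caller's open session (one shared commit/rollback), while REQUIRED_NEW always opens its own session that commits or rolls back independently of the caller. An abstract stdlib sketch of the difference (not the boilerplate's code):

```python
import contextvars

_session = contextvars.ContextVar("session", default=None)
_opened = []   # records every session that gets opened, for illustration

def transactional(propagation="REQUIRED"):
    def wrap(fn):
        def inner(*args, **kwargs):
            if propagation == "REQUIRED" and _session.get() is not None:
                return fn(*args, **kwargs)       # join the caller's session
            session = f"session-{len(_opened)}"  # REQUIRED with no session yet,
            _opened.append(session)              # or REQUIRED_NEW always:
            token = _session.set(session)        # open (and later close) our own
            try:
                return fn(*args, **kwargs)
            finally:
                _session.reset(token)
        return inner
    return wrap

@transactional()                              # joins the outer session
def inner_required(): return _session.get()

@transactional(propagation="REQUIRED_NEW")    # always gets its own session
def inner_new(): return _session.get()

@transactional()
def outer(): return (_session.get(), inner_required(), inner_new())

mine, joined, fresh = outer()
```

Practically: default to REQUIRED; reach for REQUIRED_NEW when an inner unit of work (an audit log, an outbox write) must persist even if the outer transaction rolls back.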

Thanks!

ImportError: cannot import name 'TypeAliasType' from 'typing_extensions'

When I run make migrate, I get this error:

$ make migrate
poetry run alembic upgrade head
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Scripts\alembic.exe\__main__.py", line 4, in <module>
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\alembic\__init__.py", line 1, in <module>
    from . import context
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\alembic\context.py", line 1, in <module>
    from .runtime.environment import EnvironmentContext
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\alembic\runtime\environment.py", line 19, in <module>
    from sqlalchemy.sql.schema import Column
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\sqlalchemy\__init__.py", line 12, in <module>
    from . import util as _util
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\sqlalchemy\util\__init__.py", line 15, in <module>
    from ._collections import coerce_generator_arg as coerce_generator_arg
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\sqlalchemy\util\_collections.py", line 39, in <module>
    from .typing import is_non_string_iterable
  File "C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\sqlalchemy\util\typing.py", line 56, in <module>
    from typing_extensions import TypeAliasType as TypeAliasType  # 3.12
ImportError: cannot import name 'TypeAliasType' from 'typing_extensions' (C:\Users\DELL\AppData\Local\pypoetry\Cache\virtualenvs\bicxchange-PYV2jhZE-py3.11\Lib\site-packages\typing_extensions.py)
make: *** [Makefile:55: migrate] Error 1

problems with join_, or I can't get it

join_ is not working well. I am trying join_={'tasks'}, but I get this error:

query = query.filter(User.id == user_id)
        ^^^^^^^^^^^^

AttributeError: 'coroutine' object has no attribute 'filter'
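This looks like the flip side of the async Select issue: once the statement builder is declared async, calling it without await yields a coroutine object, and coroutines have no .filter. A stdlib stand-in (dummy Query class, not SQLAlchemy):

```python
import asyncio

class Query:
    """Dummy statement object standing in for a SQLAlchemy query."""
    def filter(self, *criteria):
        return self

async def _query(join_=None):
    # Declared async, so callers receive a coroutine, not a Query.
    return Query()

async def get_by_id(user_id):
    query = await _query(join_={'tasks'})   # await first...
    return query.filter("User.id == ?")     # ...then chain .filter

result = asyncio.run(get_by_id(1))
```

So either await the builder's result before chaining, or keep the builder synchronous so there is nothing to await.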

Not able to run on ARM M1 Mac

Hi,

Thank you for making a production-grade boilerplate available. I ran into trouble when I tried to start this project on my Mac M1 computer. It seems the Poetry environment is set up to install packages built for x86-64 machines. How can I change the configuration to install packages for an M1 (arm64) machine?

Steps to reproduce:

Follow the jump-start instructions in the README. When running make migrate, this error happens:

  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/bin/alembic", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/config.py", line 591, in main
    CommandLine(prog=prog).main(argv=argv)
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/config.py", line 585, in main
    self.run_cmd(cfg, options)
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/config.py", line 562, in run_cmd
    fn(
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/command.py", line 378, in upgrade
    script.run_env()
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/script/base.py", line 569, in run_env
    util.load_python_file(self.dir, "env.py")
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/util/pyfiles.py", line 94, in load_python_file
    module = load_module_py(module_id, path)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/alembic/util/pyfiles.py", line 110, in load_module_py
    spec.loader.exec_module(module)  # type: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/Users/quankhuc/FastAPI-Production-Boilerplate/migrations/env.py", line 29, in <module>
    from app.models import Base
  File "/Users/quankhuc/FastAPI-Production-Boilerplate/app/models/__init__.py", line 1, in <module>
    from core.database import Base
  File "/Users/quankhuc/FastAPI-Production-Boilerplate/core/database/__init__.py", line 1, in <module>
    from .session import (
  File "/Users/quankhuc/FastAPI-Production-Boilerplate/core/database/session.py", line 30, in <module>
    "writer": create_async_engine(config.POSTGRES_URL, pool_recycle=3600),
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/ext/asyncio/engine.py", line 85, in create_async_engine
    sync_engine = _create_engine(url, **kw)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<string>", line 2, in create_engine
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/util/deprecations.py", line 277, in warned
    return fn(*args, **kwargs)  # type: ignore[no-any-return]
           ^^^^^^^^^^^^^^^^^^^
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/create.py", line 720, in create_engine
    event.listen(pool, "connect", on_connect)
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/event/api.py", line 124, in listen
    _event_key(target, identifier, fn).listen(*args, **kw)
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/event/registry.py", line 310, in listen
    self.dispatch_target.dispatch._listen(self, *args, **kw)
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/event/base.py", line 177, in _listen
    return self._events._listen(event_key, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/events.py", line 94, in _listen
    event_key.base_listen(**kw)
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/event/registry.py", line 348, in base_listen
    for_modify._set_asyncio()
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/event/attr.py", line 416, in _set_asyncio
    self._exec_once_mutex = AsyncAdaptedLock()
                            ^^^^^^^^^^^^^^^^^^
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/util/concurrency.py", line 63, in AsyncAdaptedLock
    _not_implemented()
  File "/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/sqlalchemy/util/concurrency.py", line 43, in _not_implemented
    raise ValueError(
ValueError: the greenlet library is required to use this function. dlopen(/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/greenlet/_greenlet.cpython-311-darwin.so, 0x0002): tried: '/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/greenlet/_greenlet.cpython-311-darwin.so' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/greenlet/_greenlet.cpython-311-darwin.so' (no such file), '/Users/quankhuc/Library/Caches/pypoetry/virtualenvs/fastapi-boilerplate-ncg74GX8-py3.11/lib/python3.11/site-packages/greenlet/_greenlet.cpython-311-darwin.so' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))
make: *** [migrate] Error 1
