
drf-api-tracking's People

Contributors

anatolzak, aschn, avelis, brechard, brendanwood, cjh79, dennybiasiolli, dependabot[bot], elikeimig, fichie23, frankie567, gc-jeremy, gpt14, ibest30, itcrab, jameshiew, jemerick, lingster, mayankkapoor, null-none, rib3, rwspielman, skadz, sourcery-ai-bot, tilboerner, tomage, tselepakis, vincentstark, yrchen, yuekui


drf-api-tracking's Issues

Getting deprecated warning about default app config from Django 3.2

I am getting this warning:
django.utils.deprecation.RemovedInDjango41Warning: 'rest_framework_tracking' defines default_app_config = 'rest_framework_tracking.apps.RestFrameworkTrackingConfig'. Django now detects this configuration automatically. You can remove default_app_config.

I am using Django 3.2.8.
It seems Django automatically discovers the AppConfig from version 3.2 onwards, so default_app_config no longer needs to be defined explicitly.
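For reference, a minimal sketch of the post-3.2 layout (the class and app names are taken from the deprecation warning above; the rest is an assumption, not a patch from the maintainers): keep the AppConfig in apps.py and simply delete the default_app_config assignment from the package's __init__.py.

# rest_framework_tracking/apps.py -- sketch only
from django.apps import AppConfig


class RestFrameworkTrackingConfig(AppConfig):
    name = "rest_framework_tracking"

# rest_framework_tracking/__init__.py: the line below can simply be removed
# default_app_config = "rest_framework_tracking.apps.RestFrameworkTrackingConfig"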

Fix for 'NoneType' object has no attribute 'is_anonymous'

With LoggingMixin set on my ViewSet, if an anonymous user hits an endpoint, drf_api_tracking throws an error:

'NoneType' object has no attribute 'is_anonymous'

It comes from BaseLoggingMixin._get_user(), specifically the if user.is_anonymous: line; since user is None in this case, it fails.

A temporary workaround is to apply a monkey patch like so:

# monkey patch a bug in rest_framework_tracking
from rest_framework_tracking.base_mixins import BaseLoggingMixin

def fixed_get_user(self, request):
    user = request.user
    if not user or user.is_anonymous:
        return None
    return user

BaseLoggingMixin._get_user = fixed_get_user

How to add new data to log?

How do I change the models to log new data, like the OS or anything else? I saw that in the mixin you call

super(BaseLoggingMixin, self).__init__(*args, **kwargs) 

in __init__. Doesn't that call itself?
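Not an authoritative answer, but a hedged sketch of one way extra data is commonly recorded (the module paths and the handle_log hook are taken from tracebacks elsewhere on this page; the extra field and how it is filled are illustrative assumptions): define your own concrete log model from the abstract base with the extra column, plus a mixin whose handle_log populates it before saving.

# Hedged sketch -- adjust the model/mixin names and the extra field to taste.
from django.db import models
from rest_framework_tracking.base_mixins import BaseLoggingMixin
from rest_framework_tracking.base_models import BaseAPIRequestLog


class ExtendedAPIRequestLog(BaseAPIRequestLog):
    # extra column to record, e.g. something derived from the user agent
    client_os = models.CharField(max_length=100, null=True, blank=True)


class ExtendedLoggingMixin(BaseLoggingMixin):
    def handle_log(self):
        # self.log is the dict of request/response details built by the mixin
        self.log["client_os"] = self.request.META.get("HTTP_USER_AGENT", "")[:100]
        ExtendedAPIRequestLog(**self.log).save()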

Serialize APIRequestLog

Is it possible to serialize APIRequestLog? When I try to create a serializer, only user.id is returned.
Here is my class:
class ApiRequestTableSerializer(serializers.Serializer):
    requested_at = serializers.DateTimeField(format='%d %b %Y')
    path = serializers.CharField(source="get_path", read_only=True)

    class Meta:
        model = APIRequestLog
        fields = ['id', 'requested_at', 'response_ms', 'status_code', 'method', 'user'
                  'path', 'remote_addr', 'host',
                  'query_params']
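As a hedged sketch of an answer (assuming a ModelSerializer is what was intended): serializers.Serializer ignores Meta.model and Meta.fields, which is why only the explicitly declared fields come back; note also the missing comma after 'user' above, which silently concatenates it with 'path'.

# Hedged sketch, not a tested fix: switch to ModelSerializer and let Meta
# drive the field list. The get_path source is dropped on the assumption
# that the model's own `path` field is what should be exposed.
from rest_framework import serializers
from rest_framework_tracking.models import APIRequestLog


class ApiRequestTableSerializer(serializers.ModelSerializer):
    requested_at = serializers.DateTimeField(format='%d %b %Y')

    class Meta:
        model = APIRequestLog
        fields = ['id', 'requested_at', 'response_ms', 'status_code', 'method',
                  'user', 'path', 'remote_addr', 'host', 'query_params']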

Handle ContentNotRenderedError in finalize_response

Hi there,

This is a tricky one, it took a while to debug :)

I was getting ContentNotRenderedError (The response content must be rendered before it can be accessed.) when accessing an endpoint that does not support GET (i.e. making a GET request from the browser). The endpoint looks like:

class JobViewSet(
    LoggingMixin,
    mixins.CreateModelMixin,
    mixins.UpdateModelMixin,
    viewsets.GenericViewSet
):
    ...  

This should return a 405 response, but drf-api-tracking fails because the DRF Browsable API tries to show a form with initial data and the serializer fails. That is:

  • If you fix the serializer code the error is not triggered
  • If you skip the browsable API using ?format=json the error is not triggered

The failing code is response.getvalue():

def finalize_response(self, request, response, *args, **kwargs):
    ...
    if should_log(request, response):
        ...
        else:
            rendered_content = response.getvalue()

Yet in this corner case and probably others it would be wise to have some protection, something like this:

def finalize_response(self, request, response, *args, **kwargs):
    ...
    if should_log(request, response):
        ...
        else:
            # import here for brevity
            from django.template.response import ContentNotRenderedError
            try:
                rendered_content = response.getvalue()
            except ContentNotRenderedError:
                rendered_content = ""

Then the response changes to:

'JobSerializer' object has no attribute 'initial_data'

And that really gives a clue that the problem is in your code, and at the same time the 405 response is logged.

Would you accept a PR with that solution?

Using this with Django Logger.

First, I want to say thank you for maintaining this package.

I wanted to know about ways I can use this package to integrate with the Django logger, more specifically watchtower. I want to use AWS CloudWatch to track API usage. Since I have a big application at the moment, would I have to go into each view, add the mixin and then override the handle_log function, or is there a way I can set this by default somewhere?

Appreciate the help. Love the package. 😊
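Not an official answer, but a hedged sketch of one common pattern (handle_log is the documented hook; all other names and the logger wiring are assumptions about your project layout): define a single project-level mixin that overrides handle_log to emit to the standard logging module, which watchtower can then ship to CloudWatch, and a base viewset all your views inherit from, so you do not touch every view individually.

# Hedged sketch only.
import logging

from rest_framework import viewsets
from rest_framework_tracking.base_mixins import BaseLoggingMixin

logger = logging.getLogger("api.requests")  # route this logger to watchtower in LOGGING


class CloudWatchLoggingMixin(BaseLoggingMixin):
    def handle_log(self):
        # self.log is the dict of request/response details built by the mixin
        logger.info("api_request", extra={"api_log": {k: str(v) for k, v in self.log.items()}})


class BaseModelViewSet(CloudWatchLoggingMixin, viewsets.ModelViewSet):
    """Inherit from this instead of ModelViewSet to get logging everywhere."""
    pass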

Logs are not being created on the table

Hey,

I am using Postgres with DRF and Django and trying to make this work on an APIView.

However, it does not log any detail.

I have followed the documentation and tried to debug a lot before reaching out here.

What are the things I should ensure to make this work?

Why is it not working in my case?

What could I be doing wrong?

I have tried using a debugger inside the handle_log method of LoggingMixin, but the breakpoint is never hit.

DataError: invalid input syntax for type inet

Request headers from some services, e.g. Google Cloud, include a list of IP addresses in the REMOTE_ADDR field. When the log information is saved to the logging table, an error occurs because the value does not match the inet type of the database column, and the information is not saved.

I've made a pull request with a proposed fix: #55

Traceback below, with IP addresses blanked out:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/django/db/backends/utils.py", line 84, in _execute
    return self.cursor.execute(sql, params)

The above exception (invalid input syntax for type inet: "xx.xx.xx.xx, yyy.yyy.yyy.yy"
LINE 1: ...DataView', 'get', 'xx.xx.xx....
                             ^
) was the direct cause of the following exception:

  File "/usr/local/lib/python3.9/site-packages/rest_framework_tracking/base_mixins.py", line 83, in finalize_response
    self.handle_log()
  File "/usr/local/lib/python3.9/site-packages/rest_framework_tracking/mixins.py", line 12, in handle_log
    APIRequestLog(**self.log).save()
  File "/usr/local/lib/python3.9/site-packages/django/db/models/base.py", line 750, in save
    self.save_base(using=using, force_insert=force_insert,
  ...

BaseLoggingMixin causing RequestDataTooBig exception

Can someone please help me to understand the purpose of:
self.log["data"] = self._clean_data(request.body)

in the initial method of the BaseLoggingMixin?

I'm accepting large files, and self._clean_data(request.body) is raising a RequestDataTooBig exception.

What I don't understand is why we're calling it in the first place, since we're going to overwrite self.log["data"] almost immediately anyway:

self.log["data"] = self._clean_data(request.body)
        super(BaseLoggingMixin, self).initial(request, *args, **kwargs)

        try:
            data = self.request.data.dict()

        except AttributeError:
            data = self.request.data
        self.log["data"] = self._clean_data(data)

If someone could provide me a little insight, I'd appreciate it.

Django 4.2 support

Is anyone already working on Django 4.2 support? I don't want to duplicate effort and was looking at it today.

So far the issue is the reference to django.utils.six, which no longer exists. Ah, that's version 1.8.0 -- 1.8.2 seems to remove that already and explicitly disallows Django 4.x in its requirements.

API logs graph not appearing in Django admin when using custom admin site

Hello,

I'm using a custom admin site to extend the Django admin login form with a Google reCaptcha. Before I added this, the APIRequestLog was automatically registered and I didn't have to provide the code below. However, now I cannot see the datepicker or bar graph. Can anyone help with getting this back whilst using my custom admin site?

admin.py:

from django.contrib import admin
from rest_framework_tracking.models import APIRequestLog
# CustomAdminLoginForm is imported from wherever it is defined in this project


class APIRequestLogAdmin(admin.ModelAdmin):
    list_display = ('requested_at', 'response_ms', 'path', 'remote_addr', 'host', 'method', 'query_params', 'data', 'errors', 'user', 'user_agent')


class CustomAdminSite(admin.AdminSite):
    login_form = CustomAdminLoginForm

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self._registry.update(admin.site._registry)


admin_site = CustomAdminSite(name='admin')
admin_site.register(APIRequestLog, APIRequestLogAdmin)

(Screenshot: the APIRequestLog change list in the custom admin, without the datepicker or bar graph.)

Thanks,
James
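A hedged guess at the cause of the missing datepicker/graph above: it presumably comes from the admin class the package registers on the default site, so re-registering the model on the custom site with a plain ModelAdmin drops it. Reusing the bundled admin class may bring it back; the module path and class name below are assumptions, so check them against the installed package.

# Hedged sketch: register the package's own admin class instead of a plain
# ModelAdmin on the custom site.
from rest_framework_tracking.admin import APIRequestLogAdmin as TrackingAPIRequestLogAdmin

admin_site.register(APIRequestLog, TrackingAPIRequestLogAdmin)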

ModuleNotFoundError: No module named 'django.utils.six'

Hello,
I'm working with Python 3.7.4, Django 3.0.2, djangorestframework 3.11.0.
I have a problem with django.utils.six in base_models.py:

from django.utils.six import python_2_unicode_compatible
ModuleNotFoundError: No module named 'django.utils.six'

But if I remove these lines:

from django.utils.six import python_2_unicode_compatible
@python_2_unicode_compatible

It works fine.

I suggest guarding the import instead, since django.utils.six was removed in Django 3.0 regardless of the Python version, so checking sys.version is not a reliable test. For example:

from django.db import models
from django.conf import settings
from .managers import PrefetchUserManager

try:
    # Only available on Django < 3.0
    from django.utils.six import python_2_unicode_compatible
except ImportError:
    # No-op fallback so the decorator can be applied unconditionally
    def python_2_unicode_compatible(cls):
        return cls


@python_2_unicode_compatible
class BaseAPIRequestLog(models.Model):
    """ Logs Django rest framework API requests """
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.SET_NULL, null=True,
                             blank=True)
    requested_at = models.DateTimeField(db_index=True)
    response_ms = models.PositiveIntegerField(default=0)
    path = models.CharField(max_length=200, db_index=True)
    view = models.CharField(
        max_length=200,
        null=True,
        blank=True,
        db_index=True,
    )

    # NOTE: Choosing the longest verb in the English language - ought to be good
    #       enough for a while
    VIEW_METHOD_MAX_LENGTH = len('Floccinaucinihilipilificate')
    view_method = models.CharField(
        max_length=VIEW_METHOD_MAX_LENGTH,
        null=True,
        blank=True,
        db_index=True,
    )
    remote_addr = models.GenericIPAddressField()
    host = models.URLField()
    method = models.CharField(max_length=10)
    query_params = models.TextField(null=True, blank=True)
    data = models.TextField(null=True, blank=True)
    response = models.TextField(null=True, blank=True)
    errors = models.TextField(null=True, blank=True)
    status_code = models.PositiveIntegerField(null=True, blank=True)
    objects = PrefetchUserManager()

    class Meta:
        abstract = True
        verbose_name = 'API Request Log'

    def __str__(self):
        return '{} {}'.format(self.method, self.path)

Analyzing Logs

What's the best way to generate reports from the logging for things like session time, daily/monthly users, endpoint hits, etc?
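There is no built-in reporting, but since everything lives in one table, plain ORM aggregations go a long way. A hedged sketch (field names taken from the package's base model shown elsewhere on this page; adjust to your schema):

# Hedged sketch: ad-hoc aggregations over APIRequestLog.
from django.db.models import Count
from django.db.models.functions import TruncDate
from rest_framework_tracking.models import APIRequestLog

# Hits per endpoint
hits_per_endpoint = (
    APIRequestLog.objects
    .values("path")
    .annotate(hits=Count("id"))
    .order_by("-hits")
)

# Distinct users per day
daily_users = (
    APIRequestLog.objects
    .annotate(day=TruncDate("requested_at"))
    .values("day")
    .annotate(users=Count("user", distinct=True))
    .order_by("day")
)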

When is the new version released?

I found this library and I think it is amazing, but I really need the just-added decode_request_body feature. The version on PyPI hasn't been updated for a while, and I would like to know when you plan to release a new version.

Convert data and query fields to jsonfield

Currently the query and data fields are stored as text; it would make sense to store them as JSON fields to ease searching.
Add test cases to verify.
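Until that lands, a hedged sketch of a possible stop-gap (the module path and the field overrides are assumptions, and a custom mixin whose handle_log saves to this model would also be needed): since the base model is abstract, its text fields can be overridden with JSONFields in a project-level concrete model.

# Hedged sketch: Django allows overriding fields inherited from an abstract
# base model; models.JSONField requires Django 3.1+.
from django.db import models
from rest_framework_tracking.base_models import BaseAPIRequestLog


class JSONAPIRequestLog(BaseAPIRequestLog):
    query_params = models.JSONField(null=True, blank=True)
    data = models.JSONField(null=True, blank=True)
    response = models.JSONField(null=True, blank=True)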

Split X_FORWARDED_FOR into a list and store the client IP

Per the MDN docs, X_FORWARDED_FOR can be a list, which breaks storage in remote_addr, a GenericIPAddressField. The first value in the X_FORWARDED_FOR list is the client IP; the remaining values are all proxy IPs and can be stripped away. I've opened a PR to fix this: #26
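For illustration only, a hedged sketch of the idea behind the PR (not the PR itself): keep only the first, client-facing address from a comma-separated X-Forwarded-For value.

# Hedged sketch.
def client_ip(x_forwarded_for):
    """Return the first (client) address from a comma-separated header value."""
    return x_forwarded_for.split(",")[0].strip()


assert client_ip("203.0.113.7, 10.0.0.1, 10.0.0.2") == "203.0.113.7"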

Delete logs periodically

Handling views with a very large number of calls is hard, and there is currently no way to delete records periodically. Deleting old log records on a schedule would be a very useful feature.

Solution:
We could have a variable called delete_log_days_period in LoggingMixin that is None by default. If the programmer wants that view's API log records older than delete_log_days_period days to be deleted, they set this variable to an integer, and the records are then deleted periodically.
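In the meantime, a hedged sketch of what the cleanup itself could look like when run from a management command, cron job, or Celery beat task (the function is illustrative, not part of the package):

# Hedged sketch: bulk-delete log rows older than a cutoff.
from datetime import timedelta

from django.utils import timezone
from rest_framework_tracking.models import APIRequestLog


def delete_old_logs(days=30):
    cutoff = timezone.now() - timedelta(days=days)
    deleted, _ = APIRequestLog.objects.filter(requested_at__lt=cutoff).delete()
    return deleted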

Requests from IPv6 IP addresses not logged

The problem is the fix for #12, which breaks logging for any IPv6 address, since IPv6 addresses are colon-separated hex numbers; the result is an exception like the following (on Postgres):

psycopg2.errors.InvalidTextRepresentation: invalid input syntax for type inet: "2604"

I am working on a PR to fix this, which I should have ready shortly. I am filing this issue in case someone googles that exception. Unfortunately, this also means that anyone using this package since that fix has been missing logs from IPv6 addresses...

Is DRF version 3.11 supported?

I do not get any logging output on my DigitalOcean server hosting the DRF API. This is a blocker in my development, since without the logging from drf-api-tracking my requests become a black box.

The following setup is used on the server:
Python 3.6
Django 3.0.6
DRF 3.11

Please help me, I'm stuck

Access the origin header field?

Hi,
Is there a way to log the Origin header value from the logged request?
We have an API that is used by external users but also by our website, and we would like to be able to tell the difference easily. The Origin value would be an easy way to do this.
Thanks!
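There does not appear to be a built-in option, but a hedged sketch of one possible approach (module paths, the handle_log hook usage, and the field are assumptions): a custom concrete log model with an origin column, plus a mixin that injects the header value before saving.

# Hedged sketch, not the package's API.
from django.db import models
from rest_framework_tracking.base_mixins import BaseLoggingMixin
from rest_framework_tracking.base_models import BaseAPIRequestLog


class OriginAPIRequestLog(BaseAPIRequestLog):
    origin = models.CharField(max_length=255, blank=True, default="")


class OriginLoggingMixin(BaseLoggingMixin):
    def handle_log(self):
        # request.headers is available on Django 2.2+
        self.log["origin"] = self.request.headers.get("Origin", "")
        OriginAPIRequestLog(**self.log).save()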

requested_at and remote_addr fields in base_model are not null

copied over from: aschn/drf-tracking#141

In my case, when an IP whitelist is in place, the program can't get the remote_addr, but in the base model the field is set to NOT NULL:
remote_addr = models.GenericIPAddressField()
The same applies to requested_at:
requested_at = models.DateTimeField(db_index=True)

So I changed the two fields as below:
from django.utils.timezone import now

requested_at = models.DateTimeField(default=now, db_index=True)

remote_addr = models.GenericIPAddressField(null=True, blank=True)

How to use with function-based views?

The documentation mentions the use of mixins, which can only be applied to class-based views.
Is there a way of using this with function-based views?
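I am not aware of a supported way to attach the mixin to an @api_view function; a hedged workaround sketch (view name and body are placeholders, not package API) is to move the function's logic into a small class-based view so the mixin applies:

# Hedged sketch: convert the function-based view into a minimal APIView.
from rest_framework.response import Response
from rest_framework.views import APIView
from rest_framework_tracking.mixins import LoggingMixin


class PingView(LoggingMixin, APIView):
    def get(self, request):
        # previously the body of the @api_view function
        return Response({"ping": "pong"})


# urls.py would then use PingView.as_view() in place of the function.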

How to convert the logged response back to a dictionary?

I'm retrieving all the logged responses and tried json.loads, but it does not work since the value is not literally JSON. I also tried ast.literal_eval, which works when everything is a string, but it errors when it encounters <InMemory...>. Any idea how to convert it back? Thank you.

Storing Correlation ID

It would be a good enhancement if we could link/add a correlation ID, either generated or extracted from a header (e.g. set by nginx), so that it can be used to tie entries to other Django log apps, Django logging, etc.

DRF Tracking fails on Azure

I followed these instructions to create a Django project on Azure:
https://docs.microsoft.com/en-us/azure/app-service/containers/tutorial-python-postgresql-app

My requirements.txt:

django==2.2.7
djangorestframework==3.9.4
djangorestframework-jwt==1.11.0
drf-extensions==0.5.0
drf-tracking==1.5.0
pytz==2019.1 # for drf-tracking
sqlparse==0.3.0 # for drf-tracking

Snippet from views.py showing how I import and set up the logging:

...
from rest_framework_tracking.mixins import LoggingMixin, BaseLoggingMixin
from rest_framework_tracking.models import APIRequestLog
...

class TagViewSet(LoggingMixin, viewsets.ModelViewSet):
    logging_methods = ['POST', 'PUT', 'PATCH', 'DELETE']
    queryset = TechnologyTag.objects.all()
    serializer_class = TagSerializer
    permission_classes = [permissions.IsAuthenticated,IsAdminOrReadOnly]
    filter_class = TagFilter

Everything works great locally, but on Azure, nothing is ever written to the log. The reason is hiding in this error:

Logging API call raise exception!
Traceback (most recent call last):
  File "/antenv/lib/python3.7/site-packages/django/db/backends/utils.py", line 84, in _execute
    return self.cursor.execute(sql, params)
psycopg2.errors.InvalidTextRepresentation: invalid input syntax for type inet: "24.23.134.119:43398"
LINE 1: ...e/', 'portola.views.DocumentViewSet', 'disclose', '24.23.134...
                                                             ^

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/antenv/lib/python3.7/site-packages/rest_framework_tracking/base_mixins.py", line 78, in finalize_response
    self.handle_log()
  File "/antenv/lib/python3.7/site-packages/rest_framework_tracking/mixins.py", line 12, in handle_log
    APIRequestLog(**self.log).save()
  File "/antenv/lib/python3.7/site-packages/django/db/models/base.py", line 741, in save
    force_update=force_update, update_fields=update_fields)
  File "/antenv/lib/python3.7/site-packages/django/db/models/base.py", line 779, in save_base
    force_update, using, update_fields,
  File "/antenv/lib/python3.7/site-packages/django/db/models/base.py", line 870, in _save_table
    result = self._do_insert(cls._base_manager, using, fields, update_pk, raw)
  File "/antenv/lib/python3.7/site-packages/django/db/models/base.py", line 908, in _do_insert
    using=using, raw=raw)
  File "/antenv/lib/python3.7/site-packages/django/db/models/manager.py", line 82, in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
  File "/antenv/lib/python3.7/site-packages/django/db/models/query.py", line 1186, in _insert
    return query.get_compiler(using=using).execute_sql(return_id)
  File "/antenv/lib/python3.7/site-packages/django/db/models/sql/compiler.py", line 1335, in execute_sql
    cursor.execute(sql, params)
  File "/antenv/lib/python3.7/site-packages/django/db/backends/utils.py", line 67, in execute
    return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
  File "/antenv/lib/python3.7/site-packages/django/db/backends/utils.py", line 76, in _execute_with_wrappers
    return executor(sql, params, many, context)
  File "/antenv/lib/python3.7/site-packages/django/db/backends/utils.py", line 84, in _execute
    return self.cursor.execute(sql, params)
  File "/antenv/lib/python3.7/site-packages/django/db/utils.py", line 89, in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
  File "/antenv/lib/python3.7/site-packages/django/db/backends/utils.py", line 84, in _execute
    return self.cursor.execute(sql, params)
django.db.utils.DataError: invalid input syntax for type inet: "24.23.134.119:43398"
LINE 1: ...e/', 'portola.views.DocumentViewSet', 'disclose', '24.23.134...
                                                             ^

The core of the problem is this psycopg2 exception:
psycopg2.errors.InvalidTextRepresentation: invalid input syntax for type inet: "24.23.134.119:43398"
which seems to be complaining about the port number attached to the IP address by the middleware used to serve static pages.

Possible similar errors I have found googling this problem refer to nginx configuration.

Issue moved from aschn/drf-tracking#150 with additional details added.
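A hedged workaround sketch until the proxy configuration or the package handles this (the _get_ip_address hook appears in the package code quoted elsewhere on this page; the override itself is an assumption, not an official fix): strip a trailing ":port" suffix from IPv4 addresses before they reach the inet column.

# Hedged sketch: only strips "a.b.c.d:port"; values with more than one colon
# (IPv6 addresses) are left untouched.
from rest_framework_tracking.mixins import LoggingMixin


class PortSafeLoggingMixin(LoggingMixin):
    def _get_ip_address(self, request):
        addr = super()._get_ip_address(request)
        if addr and addr.count(":") == 1:
            addr = addr.split(":", 1)[0]
        return addr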

Django 3.2 + DRF 3.14

Excuse me, when will this be adapted to Django 3.2 + DRF 3.14? Is there a plan under way?

Sensitive fields not cleaned from response body

Description:

Summary: When returning sensitive fields in the response to any APIView, the sensitive fields do not get cleaned.

Steps to reproduce: Create a TestView as below and note the output of the print statement:

from django.shortcuts import render
from rest_framework.response import Response
from rest_framework.views import APIView
from rest_framework_tracking.mixins import LoggingMixin

# Create your views here.
class TestView(LoggingMixin, APIView):
    def get(self, request):
        return Response({"key": "password"}, status=200)

    def handle_log(self):
        super().handle_log()
        print(self.log)

Output:
{'requested_at': datetime.datetime(2023, 6, 9, 17, 20, 11, 319782, tzinfo=<UTC>), 'data': {}, 'remote_addr': <removed by myself>, 'view': 'api.views.TestView', 'view_method': 'get', 'path': '/test/articles/', 'host': '0.0.0.0:8000', 'user_agent': 'HTTPie/3.2.1', 'method': 'GET', 'query_params': {}, 'user': None, 'username_persistent': 'Anonymous', 'response_ms': 0, 'response': '{"key":"password"}', 'status_code': 200}

Expected behavior: Password should be hidden e.g.

'response': '{"key":"*******"}'

Actual behavior: Password is not hidden

'response': '{"key":"password"}'

Environment: Happens in all environments (even locally)

Additional context:

It seems the issue stems from the updating of the log and the cleaning of the data. The "response" key gets rendered_content, which is of type bytes. When the data is decoded from bytes, it becomes a string of the form '{"key":"password"}'. Since it is a string, it is not covered by any of the cleaning logic.


            self.log.update(
                {
                    "remote_addr": self._get_ip_address(request),
                    "view": self._get_view_name(request),
                    "view_method": self._get_view_method(request),
                    "path": self._get_path(request),
                    "host": request.get_host(),
                    "user_agent": request.META.get("HTTP_USER_AGENT", ""),
                    "method": request.method,
                    "query_params": self._clean_data(request.query_params.dict()),
                    "user": user,
                    "username_persistent": user.get_username() if user else "Anonymous",
                    "response_ms": self._get_response_ms(),
                    "response": self._clean_data(rendered_content),
                    "status_code": response.status_code,
                }
            )
    def _clean_data(self, data):
        """
        Clean a dictionary of data of potentially sensitive info before
        sending to the database.
        Function based on the "_clean_credentials" function of django
        (https://github.com/django/django/blob/stable/1.11.x/django/contrib/auth/__init__.py#L50)

        Fields defined by django are by default cleaned with this function

        You can define your own sensitive fields in your view by defining a set
        eg: sensitive_fields = {'field1', 'field2'}
        """
        if isinstance(data, bytes):
            data = data.decode(errors="replace")

        if isinstance(data, list):
            return [self._clean_data(d) for d in data]

        if isinstance(data, dict):
            SENSITIVE_FIELDS = {
                "api",
                "token",
                "key",
                "secret",
                "password",
                "signature",
            }

            data = dict(data)
            if self.sensitive_fields:
                SENSITIVE_FIELDS = SENSITIVE_FIELDS | {
                    field.lower() for field in self.sensitive_fields
                }

            for key, value in data.items():
                try:
                    value = ast.literal_eval(value)
                except (ValueError, SyntaxError):
                    pass
                if isinstance(value, (list, dict)):
                    data[key] = self._clean_data(value)
                if key.lower() in SENSITIVE_FIELDS:
                    data[key] = self.CLEANED_SUBSTITUTE
        return data

Please let me know if I'm not understanding something properly/using this wrong.
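A hedged workaround sketch, not the package's fix (the _clean_data signature is taken from the excerpt above; the override is an assumption): parse the rendered JSON response back into Python structures so the existing key-based cleaning can apply, then re-serialize it for storage.

# Hedged sketch only.
import json

from rest_framework_tracking.mixins import LoggingMixin


class ResponseCleaningLoggingMixin(LoggingMixin):
    def _clean_data(self, data):
        if isinstance(data, (bytes, str)):
            raw = data.decode(errors="replace") if isinstance(data, bytes) else data
            try:
                parsed = json.loads(raw)
            except ValueError:
                return super()._clean_data(data)
            if isinstance(parsed, (dict, list)):
                # run the normal key-based cleaning on the parsed structure,
                # then store it back as a JSON string
                cleaned = super()._clean_data(parsed)
                return json.dumps(cleaned, default=str)
        return super()._clean_data(data)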

1.8.2 installation is broken (No file/folder found for package drf-api-tracking)

Our pipelines have started failing since 1.8.2 was released (we do not pin bugfix versions). The installation also fails locally.

The current fix is to pin version 1.8.0.

Collecting drf-api-tracking<1.9 (from -r requirements.txt (line 42))
  Downloading drf-api-tracking-1.8.2.tar.gz (15 kB)
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'error'
  error: subprocess-exited-with-error
  
  × Preparing metadata (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      Traceback (most recent call last):
        File "/usr/local/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/usr/local/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/usr/local/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 149, in prepare_metadata_for_build_wheel
          return hook(metadata_directory, config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-86aoi2pd/overlay/lib/python3.11/site-packages/poetry/core/masonry/api.py", line 41, in prepare_metadata_for_build_wheel
          builder = WheelBuilder(poetry)
                    ^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-86aoi2pd/overlay/lib/python3.11/site-packages/poetry/core/masonry/builders/wheel.py", line 59, in __init__
          super().__init__(poetry, executable=executable)
        File "/tmp/pip-build-env-86aoi2pd/overlay/lib/python3.11/site-packages/poetry/core/masonry/builders/builder.py", line 83, in __init__
          self._module = Module(
                         ^^^^^^^
        File "/tmp/pip-build-env-86aoi2pd/overlay/lib/python3.11/site-packages/poetry/core/masonry/utils/module.py", line 69, in __init__
          raise ModuleOrPackageNotFound(
      poetry.core.masonry.utils.module.ModuleOrPackageNotFound: No file/folder found for package drf-api-tracking
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

Version 1.8.3 is still broken

When I try to install it, it says:

Requested drf-api-tracking==1.8.3 from https://files.pythonhosted.org/packages/29/0e/5d5f34a3eb2ee1c37c9948200bef0f41237affd1d71a2cd76ac9a916d9f7/drf-api-tracking-1.8.3.tar.gz 
has inconsistent version: expected '1.8.3', but metadata has '1.8.2'

@lingster I checked - there is still a 1.8.2 version string in __init__.py.

Logged Request Response saved as HTML

Accessing a view that uses the LoggingMixin through the Browsable API results in the response being saved as HTML instead of JSON.

Is there a way to save the JSON response when requests are made through Browsable API?

Log custom headers

Some requests might include additional, useful information about the client in custom HTTP headers, say X-Client-Name and X-Client-Version. There is currently no way to capture that information.

I see two main approaches, both based on a JSON field to store headers (that would restrict this feature to the DB backends that support JSON fields, but it's already been suggested to use them in #2):

  • Should we log all headers? This could significantly increase the size of a log entry.
  • Should we allow the user to specify which headers they are interested in, e.g. in settings? This requires some setup but targets only the useful headers, so that's my favourite approach so far.
