
Comments (12)

ShibaBandit commented on July 18, 2024

From an EC2/container-style queue-reading backend process using:

from aws_xray_sdk.core import xray_recorder
from aws_xray_sdk.core import patch_all

xray_recorder.configure(context_missing='LOG_ERROR', plugins=('EC2Plugin', 'ECSPlugin', 'ElasticBeanstalkPlugin'))
patch_all()
...
xray_recorder.begin_segment(name="Test")
...
s3 = boto3.resource('s3', region_name='us-east-1')
bucket = s3.Bucket('bucket_name_here')
...
bucket.download_file(obj.key, filename)
...
xray_recorder.end_segment()

fails with:

ERROR:aws_xray_sdk.core.context:cannot find the current segment/subsegment, please make sure you have a segment open
Traceback (most recent call last):
  File "/home/ubuntu/gss-gfe-ingest/app/cli.py", line 109, in main
    download_geotiffs(key_prefix, config.WORKING_DIR, variable)
  File "/home/ubuntu/.venv/local/lib/python2.7/site-packages/aws_xray_sdk/core/recorder.py", line 307, in wrapper
    meta_processor=None,
  File "/home/ubuntu/.venv/local/lib/python2.7/site-packages/aws_xray_sdk/core/recorder.py", line 322, in record_subsegment
    return_value = wrapped(*args, **kwargs)
  File "/home/ubuntu/.venv/local/lib/python2.7/site-packages/datadog/dogstatsd/context.py", line 53, in wrapped
    return func(*args, **kwargs)
  File "/home/ubuntu/gss-gfe-ingest/app/cli.py", line 205, in download_geotiffs
    bucket.download_file(obj.key, filename)
  File "/home/ubuntu/.venv/local/lib/python2.7/site-packages/boto3/s3/inject.py", line 168, in bucket_download_file
    ExtraArgs=ExtraArgs, Callback=Callback, Config=Config)
  File "/home/ubuntu/.venv/local/lib/python2.7/site-packages/boto3/s3/inject.py", line 130, in download_file
    extra_args=ExtraArgs, callback=Callback)
  File "/home/ubuntu/.venv/local/lib/python2.7/site-packages/boto3/s3/transfer.py", line 299, in download_file
    future.result()
  File "/home/ubuntu/.venv/local/lib/python2.7/site-packages/s3transfer/futures.py", line 73, in result
    return self._coordinator.result()
  File "/home/ubuntu/.venv/local/lib/python2.7/site-packages/s3transfer/futures.py", line 233, in result
    raise self._exception
AttributeError: 'NoneType' object has no attribute 'put_http_meta'

However, using

...
from boto3.s3.transfer import TransferConfig

# Ensure that no threads are used.
xfer_config = TransferConfig(use_threads=False)
bucket.download_file(obj.key, filename, Config=xfer_config)
...

avoids the exception, but it is not an acceptable workaround for performance reasons. Any insight on how to work around this issue while downloading files with boto3's default threaded transfer manager? I see @haotianw465 mentions "this manager also has to be patched so that whenever it creates a new thread it injects thread local storage to the new thread". How do we go about this?


kstromeiraos commented on July 18, 2024

I'm experiencing this problem at the moment... Has anyone found a workaround for it?


haotianw465 commented on July 18, 2024

From reading the S3 resource API, bucket.download_file(obj.key, filename) exposes no proper interface for passing a callback or a custom thread executor. Since this API is very high level, the logic that actually invokes the thread pool is hidden quite deep.

We will find some time to look deeper into this issue, but we don't have the capacity right now to provide a fully working solution.

I suggest starting by monkey patching create_transfer_manager: https://github.com/boto/boto3/blob/develop/boto3/s3/transfer.py#L142. In that function you can see that when use_threads = False it uses NonThreadedExecutor, which runs only on the main thread. This is why the workaround from @ShibaBandit works.

By default, this library (boto3 -> s3transfer) uses Python's built-in futures.ThreadPoolExecutor, as you can see here: https://github.com/boto/s3transfer/blob/develop/s3transfer/futures.py#L370.

If you monkey patch create_transfer_manager so that it passes your custom ThreadPoolExecutor, and that executor injects the segment into its child threads, the problem should go away. Another caveat is that when doing thread injection, the thread-local storage needs to be cleaned up when the work is done to avoid thread pollution, since threads in a thread pool are re-used.
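
For illustration, here is a minimal sketch of that kind of patch. It assumes the create_transfer_manager(client, config, osutil) signature and the TransferManager executor_cls parameter referenced in the links above, and it uses the recorder's get_trace_entity/set_trace_entity/clear_trace_entities helpers to copy the context between threads; treat it as a starting point, not a supported solution.

import concurrent.futures as futures

import boto3.s3.transfer as boto3_transfer
from aws_xray_sdk.core import xray_recorder


class XRayThreadPoolExecutor(futures.ThreadPoolExecutor):
    """Copies the submitting thread's X-Ray entity into each worker thread."""

    def submit(self, fn, *args, **kwargs):
        entity = xray_recorder.get_trace_entity()

        def wrapped(*wargs, **wkwargs):
            # Inject the captured segment/subsegment, then clean up the
            # thread-local storage because pool threads are re-used.
            xray_recorder.set_trace_entity(entity)
            try:
                return fn(*wargs, **wkwargs)
            finally:
                xray_recorder.clear_trace_entities()

        return super(XRayThreadPoolExecutor, self).submit(wrapped, *args, **kwargs)


_original_create_transfer_manager = boto3_transfer.create_transfer_manager


def create_transfer_manager_with_xray(client, config, osutil=None):
    # Keep boto3's non-threaded path unchanged; otherwise hand the
    # context-propagating executor class to the TransferManager.
    if not config.use_threads:
        return _original_create_transfer_manager(client, config, osutil)
    return boto3_transfer.TransferManager(
        client, config, osutil, executor_cls=XRayThreadPoolExecutor)


boto3_transfer.create_transfer_manager = create_transfer_manager_with_xray

Apply the patch before any transfer is started so that the high-level download/upload helpers pick up the patched factory.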

Ideally we could provide a single-line opt-in patch so that everything works automatically and you don't have to care about the underlying details. But the challenge of context binding for the SDK is that every library that uses a thread pool needs to be configured separately (or, in a worse case like this one, monkey patched).


haotianw465 commented on July 18, 2024

Another bug revealed by @ShibaBandit's stack trace is here: https://github.com/aws/aws-xray-sdk-python/blob/master/aws_xray_sdk/core/recorder.py#L312. record_subsegment should never throw an error because of a None subsegment resulting from missing context. We will fix this in the next release so that your application will not break when context_missing is set to LOG_ERROR. But the subsegments on S3 calls will still be dropped without a custom ThreadPoolExecutor.


haotianw465 commented on July 18, 2024

For boto, that actually refers to boto2, which is relatively old considering boto3 has been stable for two years (http://boto.readthedocs.io/en/latest/). So the SDK supports only boto3 in its day-1 release.

The X-Ray SDK has stronger semantic requirements on data format. It assumes an AWS call is a subtask of some node in the call chain and therefore always requires a parent. By X-Ray's definition, subsegments recording outbound HTTP requests, SQL queries, and AWS calls cannot exist independently; they must be part of a logical task (processing an event, serving an incoming request, etc.).
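
As a minimal illustration of that parent requirement, using the same begin_segment/end_segment calls as the first comment (the segment name and the list_buckets call are just placeholders):

import boto3
from aws_xray_sdk.core import patch_all, xray_recorder

patch_all()  # patched AWS SDK calls are recorded as subsegments

# Every patched call needs an open segment to act as its parent;
# without one, the SDK falls back to the context_missing behaviour.
xray_recorder.begin_segment(name='process-queue-message')
try:
    boto3.client('s3', region_name='us-east-1').list_buckets()
finally:
    xray_recorder.end_segment()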

This topic overlaps with the OpenTracing discussion regarding backend data format. Please feel free to share your thoughts here: https://forums.aws.amazon.com/thread.jspa?threadID=272498&tstart=0. That is a more general discussion which will have an impact on all of our SDKs. You are also welcome to open a new issue here about OpenTracing to discuss more specific questions.


haotianw465 commented on July 18, 2024

The bug is fixed as part of the 0.96 release, so the exception no longer pops up when context_missing is set to LOG_ERROR.


haotianw465 commented on July 18, 2024

A segment represents a full request/response cycle, and all the local work done is recorded as subsegments attached to that segment. SegmentNotFound means the SDK cannot correlate the S3 upload to a request for your Django app.

The library you are using, django-storages, uses boto3's S3 multi-part upload to increase throughput, and this multi-part upload functionality uses a thread pool. You can see the code here: https://github.com/boto/s3transfer/blob/develop/s3transfer/manager.py

This is a high-level API that internally uses multiple threads to upload to S3. When it creates a new thread and uses that thread to make a call to S3, the SDK still tries to create a subsegment recording useful information about this downstream call, but it loses the context because the thread-local storage in the new thread is empty.

In order to work with the S3 transfer manager, this manager also has to be patched so that whenever it creates a new thread it injects the thread-local storage into the new thread. More generally, the SDK today is incompatible with any library that manages its own thread pool.
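
For threads you create yourself, that injection can be done by hand. A sketch only, assuming the recorder's get_trace_entity/set_trace_entity/clear_trace_entities helpers; upload_part is a hypothetical placeholder for the real work:

import threading

from aws_xray_sdk.core import xray_recorder


def upload_part(part):
    # Placeholder for the real work done in the worker thread.
    pass


def run_with_parent_context(entity, work, *args):
    # Re-bind the segment/subsegment captured in the spawning thread,
    # run the work, then clear the thread-local storage so a re-used
    # thread does not keep stale context.
    xray_recorder.set_trace_entity(entity)
    try:
        work(*args)
    finally:
        xray_recorder.clear_trace_entities()


parent = xray_recorder.get_trace_entity()  # captured in the calling thread
worker = threading.Thread(target=run_with_parent_context,
                          args=(parent, upload_part, 'part-1'))
worker.start()
worker.join()

Libraries such as s3transfer that own their thread pool need the equivalent hook applied inside the pool itself, which is what the monkey-patching discussion above is about.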

Sorry for the confusion; we will update the docs for these special cases and look into supporting them later.


haotianw465 commented on July 18, 2024

Feel free to re-open if further assistance is needed.


thehesiod commented on July 18, 2024

You may want to follow what's done in ddtrace; I haven't heard of issues with it yet: https://github.com/DataDog/dd-trace-py/blob/master/ddtrace/contrib/botocore/patch.py and https://github.com/DataDog/dd-trace-py/blob/master/ddtrace/contrib/boto/patch.py. There's also the OpenTracing movement, which I haven't looked too deeply into yet.


ShibaBandit commented on July 18, 2024

Thank you, @haotianw465.


haotianw465 commented on July 18, 2024

Closing in favor of #52.


lupcotrpeski commented on July 18, 2024

I don't believe this issue is resolved. I'm having the same issue.

