Comments (6)
If I downgrade prefect-aws to 0.2.4, everything works fine.
from prefect-aws.
Update: I found another interesting bug, maybe related to this one.
If I install Prefect together with prefect-aws and then start Orion:
pipenv install prefect prefect-aws
prefect orion start
the AWS Credentials creation form in the new Orion instance does not show the AWS Client Parameters field. But if I install prefect, start Orion, stop it, and install prefect-aws after that:
pipenv install prefect
prefect orion start
Ctrl-C
pipenv install prefect-aws
prefect orion start
the form does show these fields.
I checked the SQLite database: there are duplicated AwsCredentials rows in the block_schema table, and they share the same block_type_id. So when the UI queries the filter API to get the form fields, the first schema returned in the JSON array is used.
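As a quick check, the duplication can be confirmed with a short query. This is only a sketch against the schema described above: the table name `block_schema` and column `block_type_id` come from the comment; the database path and everything else are assumptions.

```python
# Sketch: find block schema rows that share one block_type_id in an
# Orion SQLite database (the duplication symptom described above).
import sqlite3


def find_duplicate_schemas(con: sqlite3.Connection):
    """Return (block_type_id, count) for types with more than one schema."""
    return con.execute(
        "SELECT block_type_id, COUNT(*) AS n "
        "FROM block_schema "
        "GROUP BY block_type_id "
        "HAVING n > 1"
    ).fetchall()


# Usage (assumption: a local Orion SQLite database file):
# con = sqlite3.connect("/path/to/orion.db")
# print(find_duplicate_schemas(con))
```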
From the pydantic docs:
"By default, as explained here, pydantic tries to validate (and coerce if it can) in the order of the Union. So sometimes you may have unexpected coerced data."
If we change the member order in the S3Bucket.credentials type, everything works:
credentials: Optional[Union[MinIOCredentials, AwsCredentials]] = Field(
    default=None, description="A block containing your credentials to AWS or MinIO."
)
Alternatively, setting smart_union = True in the Config class of S3Bucket also fixes it.
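The coercion-order problem quoted above can be illustrated without pydantic. This is a minimal sketch, assuming pydantic v1's default behavior: Union members are tried left to right, unknown fields are ignored, and the first member that validates wins. The class and field names mirror the blocks in this issue; the helper `coerce_union` is hypothetical.

```python
# Plain-Python sketch of pydantic v1's default Union coercion.


class AwsCredentials:
    def __init__(self, **data):
        # Every field is optional and unknown keys are ignored (as in
        # pydantic v1), so this class "accepts" a MinIO-shaped dict too.
        self.aws_access_key_id = data.get("aws_access_key_id")


class MinIOCredentials:
    def __init__(self, **data):
        # Required fields: a non-MinIO dict raises KeyError.
        self.minio_root_user = data["minio_root_user"]
        self.minio_root_password = data["minio_root_password"]


def coerce_union(data, members):
    """Try each Union member left to right, pydantic-v1 style."""
    for member in members:
        try:
            return member(**data)
        except KeyError:
            continue
    raise ValueError("no Union member matched")


minio_data = {"minio_root_user": "root", "minio_root_password": "secret"}

# Union[AwsCredentials, AwsCredentials-first]: wrong class wins -> the bug.
bad = coerce_union(minio_data, (AwsCredentials, MinIOCredentials))
# Union[MinIOCredentials, AwsCredentials]: correct class wins -> the fix.
good = coerce_union(minio_data, (MinIOCredentials, AwsCredentials))
print(type(bad).__name__, type(good).__name__)  # AwsCredentials MinIOCredentials
```

This is also why `smart_union = True` helps: it makes pydantic pick the best-matching member instead of the first one that coerces.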
see #251
Error in prefect-aws==0.3.2.
Code that registers the blocks:
from os import environ

from prefect.utilities.asyncutils import sync_compatible
from prefect_aws import MinIOCredentials, S3Bucket


@sync_compatible
async def save_conf_blocks():
    endpoint_url = ENDPOINT_URL
    minio = MinIOCredentials.parse_obj(
        dict(
            name='epool-tk-upd-minio',
            minio_root_user=environ['MINIO_LOGIN'],
            minio_root_password=environ['MINIO_PASSW'],
            aws_client_parameters=dict(endpoint_url=endpoint_url),
        )
    )
    await minio.save('epool-tk-upd-minio', overwrite=True)
    s3_bucket_block = S3Bucket(
        name='epool-tk-upd',
        bucket_name='tk-upd-cache',
        credentials=minio,  # minio_credentials=minio,
        endpoint_url=endpoint_url,
    )
    await s3_bucket_block.save('epool-tk-upd', overwrite=True)
Flow code:
from prefect import flow, task
from prefect.filesystems import LocalFileSystem, WritableFileSystem
from prefect.utilities.asyncutils import sync_compatible
from prefect_aws import MinIOCredentials, S3Bucket


@task(cache_result_in_memory=False, persist_result=False, log_prints=True)
async def save_result(cache: WritableFileSystem, path, data):
    assert (
        isinstance(cache, LocalFileSystem) or
        isinstance(cache.credentials, MinIOCredentials)
    ), f'{cache}: {cache.credentials}'
    await cache.write_path(path, data)


@sync_compatible
async def s3_bucket_block() -> S3Bucket:
    _s3_bucket_block = await S3Bucket.load('epool-tk-upd')
    return _s3_bucket_block


@flow
def tk_upd_cache():
    cache = s3_bucket_block()
    save_result.submit(cache, 'tk_name', b'opa')
Error output:
Encountered exception during execution:
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/prefect/engine.py", line 1623, in orchestrate_task_run
result = await call.aresult()
^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 181, in aresult
return await asyncio.wrap_future(self.future)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/prefect/_internal/concurrency/calls.py", line 218, in _run_async
result = await coro
^^^^^^^^^^
File "/prj/flows/tools.py", line 179, in save_result
assert (
AssertionError: S3Bucket(bucket_name='tk-upd-cache', credentials=AwsCredentials(aws_access_key_id=None, aws_secret_access_key=None, aws_session_token=None, profile_name=None, region_name='ru-central1', aws_client_parameters=AwsClientParameters(api_version=None, use_ssl=True, verify=True, verify_cert_path=None, endpoint_url='https://s3.epool.ru', config=None)), bucket_folder=''): AwsCredentials(aws_access_key_id=None, aws_secret_access_key=None, aws_session_token=None, profile_name=None, region_name='ru-central1', aws_client_parameters=AwsClientParameters(api_version=None, use_ssl=True, verify=True, verify_cert_path=None, endpoint_url='https://s3.epool.ru', config=None))
01:43:44 PM | save_result-0 | prefect.task_runs
see #251
I'm still getting this error on prefect-aws==0.3.2.
It's the same issue that @KernelErr reported: when I make the change he proposed it works fine, but without it S3Bucket loads the wrong credentials type.
I also noticed that if I define the S3 block inside a flow using the MinIO credentials block, it works fine, but when loading the saved S3Bucket block directly it throws an error.
Related Issues (20)
- push_to_s3 and pull_from_s3 do not support additional AwsCredentials block parameters
- AWS S3 copy and move tasks and `S3Bucket` methods
- Prefect Network Configuration defined in work pool fails to get applied in some scenarios when using with ECS Fargate Task
- AwsCredentials import raising PydanticUserError
- boto3 client use_ssl wrong default
- put_directory creates a new s3_client for each file uploaded
- v0.4.4 results in the prefect task to be given the masked PREFECT_API_KEY
- allow define only aws region name through AwsCredentials
- Add a Lambda function block
- Accept `None` as an argument to Launch Type for ECSTask with `publish_as_work_pool`
- Task definition caching does not work if the task definitions come from separate deployments.
- Error in version 0.4.8: TypeError: unhashable type: 'dict'
- Credential use examples are in correct in documentation
- ECS Works Pools Should Support Specifying Volumes for Flow Runs
- `S3Bucket.copy_object`: Only resolve target path with self if `to_bucket` is not defined
- ECSTask block `publish_as_work_pool` does not set `network_configuration`
- Improvement to ECS worker setup guide
- ECS worker: Updating family setting to template flow name/deployment
- Change logging prefix to avoid unnecessary task definition registrations
- Add Python 3.12 to test matrix