Package | Assignee | Example DAG tested after fixing mypy errors? | Logs attached? | Reviewer |
---|---|---|---|---|
amazon | @sunank200 | Yes | Yes | @bharanidharan14 |
cncf | @pankajastro | Yes | Yes | @phanikumv |
core | @bharanidharan14 | Yes | Yes | |
databricks | @phanikumv | Yes | Yes | @sunank200 |
| @pankajastro, @rajaths010494 | | | @bharanidharan14 |
http | @sunank200 | Yes | Yes | @phanikumv |
snowflake | @bharanidharan14 | Yes | Yes | @rajaths010494 |
mypy output from astronomer-providers:
astronomer_providers/astronomer/providers/http/__init__.py: note: In function "__getattr__":
astronomer_providers/astronomer/providers/http/__init__.py:12: error: Function is missing a type annotation [no-untyped-def]
def __getattr__(name):
^
astronomer_providers/astronomer/providers/amazon/aws/hooks/s3.py: note: In class "S3HookAsync":
astronomer_providers/astronomer/providers/amazon/aws/hooks/s3.py:8: error: Class cannot subclass "AwsBaseHookAsync" (has type "Any") [misc]
class S3HookAsync(AwsBaseHookAsync):
^
astronomer_providers/astronomer/providers/amazon/aws/hooks/s3.py: note: In member "__init__" of class "S3HookAsync":
astronomer_providers/astronomer/providers/amazon/aws/hooks/s3.py:16: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(self, *args, **kwargs) -> None:
^
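Sketch covering both errors above. The `[misc]` "has type Any" error comes from the base class being imported from an untyped module and goes away once that module is typed (or stubbed, or ignored with `# type: ignore[misc]`); annotating `*args`/`**kwargs` as `Any` clears the `[no-untyped-def]`. All names below are stand-ins, not the real hooks.

```python
from typing import Any

class AwsBaseHookAsync:  # stand-in; the real one resolves to Any for mypy
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        self.client_type = kwargs.get("client_type")

class S3HookAsync(AwsBaseHookAsync):
    # *args: Any / **kwargs: Any clears [no-untyped-def] on __init__.
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        kwargs["client_type"] = "s3"  # hypothetical default
        super().__init__(*args, **kwargs)
```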
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py: note: In member "__init__" of class "S3KeyTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:15: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:23: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py: note: In member "_check_exact_key" of class "S3KeyTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:31: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
async def _check_exact_key(client, bucket, key) -> bool:
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py: note: In member "_check_wildcard_key" of class "S3KeyTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:49: error: Function is missing a type annotation [no-untyped-def]
async def _check_wildcard_key(client, bucket, wildcard_key):
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py: note: In member "_check_key" of class "S3KeyTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:68: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
async def _check_key(self, client, bucket, key, wildcard) -> bool:
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:78: error: Returning Any from function declared to return "bool" [no-any-return]
return await self._check_wildcard_key(client, bucket, key)
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:78: error: Call to untyped function "_check_wildcard_key" of "S3KeyTrigger" in typed context [no-untyped-call]
return await self._check_wildcard_key(client, bucket, key)
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py: note: In member "run" of class "S3KeyTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:97: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
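The `run()` methods flagged throughout this log are async generators, so the missing annotation is `AsyncIterator[TriggerEvent]`. Minimal sketch with a stand-in `TriggerEvent`:

```python
import asyncio
from typing import Any, AsyncIterator, Dict

class TriggerEvent:  # stand-in for airflow.triggers.base.TriggerEvent
    def __init__(self, payload: Dict[str, Any]) -> None:
        self.payload = payload

class S3KeyTriggerSketch:
    # "AsyncIterator[TriggerEvent]" is the return type that clears
    # [no-untyped-def] on an async-generator run() method.
    async def run(self) -> AsyncIterator[TriggerEvent]:
        yield TriggerEvent({"status": "success"})
```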
astronomer_providers/astronomer/providers/google/cloud/triggers/bigquery.py: note: In member "__init__" of class "BigQueryInsertJobTrigger":
astronomer_providers/astronomer/providers/google/cloud/triggers/bigquery.py:21: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/google/cloud/triggers/bigquery.py: note: In member "run" of class "BigQueryInsertJobTrigger":
astronomer_providers/astronomer/providers/google/cloud/triggers/bigquery.py:46: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/google/cloud/triggers/bigquery.py: note: In member "run" of class "BigQueryCheckTrigger":
astronomer_providers/astronomer/providers/google/cloud/triggers/bigquery.py:99: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/core/triggers/filesystem.py: note: In member "__init__" of class "FileTrigger":
astronomer_providers/astronomer/providers/core/triggers/filesystem.py:31: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/core/triggers/filesystem.py: note: In member "run" of class "FileTrigger":
astronomer_providers/astronomer/providers/core/triggers/filesystem.py:49: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/core/triggers/filesystem.py:57: error: Incompatible types in assignment (expression has type "str", variable has type "float") [assignment]
mod_time = datetime.datetime.fromtimestamp(mod_time).strftime("%Y%m%d%H%M%S")
^
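The `[assignment]` error comes from rebinding `mod_time` (a float timestamp) to the formatted string; binding the result to a new name keeps each variable a single type:

```python
import datetime

mod_time = 1_640_995_200.0  # float, e.g. from os.path.getmtime()
# New name instead of reusing mod_time clears the [assignment] error.
mod_time_str = datetime.datetime.fromtimestamp(mod_time).strftime("%Y%m%d%H%M%S")
```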
astronomer_providers/astronomer/providers/amazon/aws/hooks/redshift_cluster.py: note: In class "RedshiftHookAsync":
astronomer_providers/astronomer/providers/amazon/aws/hooks/redshift_cluster.py:12: error: Class cannot subclass "AwsBaseHookAsync" (has type "Any") [misc]
class RedshiftHookAsync(AwsBaseHookAsync):
^
astronomer_providers/astronomer/providers/amazon/aws/hooks/redshift_cluster.py: note: In member "__init__" of class "RedshiftHookAsync":
astronomer_providers/astronomer/providers/amazon/aws/hooks/redshift_cluster.py:17: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(self, *args, **kwargs) -> None:
^
astronomer_providers/astronomer/providers/amazon/aws/hooks/redshift_cluster.py: note: In member "get_cluster_status" of class "RedshiftHookAsync":
astronomer_providers/astronomer/providers/amazon/aws/hooks/redshift_cluster.py:87: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
async def get_cluster_status(self, cluster_identifier, expected_state, flag) -> Dict[str, Any]:
^
astronomer_providers/astronomer/providers/google/cloud/triggers/gcs.py: note: In member "__init__" of class "GCSBlobTrigger":
astronomer_providers/astronomer/providers/google/cloud/triggers/gcs.py:29: error: Missing type parameters for generic type "dict" [type-arg]
hook_params: dict,
^
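For the `[type-arg]` errors on bare `dict`, supplying type parameters (e.g. `Dict[str, Any]`) is enough. Sketch with stand-in names:

```python
from typing import Any, Dict

class GCSBlobTriggerSketch:
    # "Dict[str, Any]" supplies the missing type parameters flagged
    # by [type-arg] on the bare "dict" annotation.
    def __init__(self, bucket: str, object_name: str,
                 hook_params: Dict[str, Any]) -> None:
        self.bucket = bucket
        self.object_name = object_name
        self.hook_params = hook_params
```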
astronomer_providers/astronomer/providers/google/cloud/triggers/gcs.py:31: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/google/cloud/triggers/gcs.py: note: In member "run" of class "GCSBlobTrigger":
astronomer_providers/astronomer/providers/google/cloud/triggers/gcs.py:53: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/snowflake/triggers/snowflake_trigger.py: note: In function "get_db_hook":
astronomer_providers/astronomer/providers/snowflake/triggers/snowflake_trigger.py:10: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def get_db_hook(self) -> SnowflakeHookAsync:
^
astronomer_providers/astronomer/providers/snowflake/triggers/snowflake_trigger.py: note: In member "__init__" of class "SnowflakeTrigger":
astronomer_providers/astronomer/providers/snowflake/triggers/snowflake_trigger.py:29: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/snowflake/triggers/snowflake_trigger.py: note: In member "run" of class "SnowflakeTrigger":
astronomer_providers/astronomer/providers/snowflake/triggers/snowflake_trigger.py:49: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/snowflake/example_dags/example_snowflake.py:45: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
start_date=days_ago(0),
^
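All of the `days_ago` errors in this log are the same class: `days_ago` is untyped (and deprecated in Airflow 2.x). A fixed, timezone-aware `start_date` avoids the `[no-untyped-call]` entirely:

```python
import datetime

# Static, timezone-aware start_date instead of days_ago(...).
start_date = datetime.datetime(2022, 1, 1, tzinfo=datetime.timezone.utc)
```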
astronomer_providers/astronomer/providers/http/triggers/http.py:4: error: Module "airflow" does not explicitly export attribute "AirflowException"; implicit reexport disabled [attr-defined]
from airflow import AirflowException
^
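With `--no-implicit-reexport`, mypy only accepts names a module defines or explicitly re-exports, so the fix for these `[attr-defined]` errors is `from airflow.exceptions import AirflowException` instead of `from airflow import AirflowException`. The same "import from the defining module" pattern, shown with the stdlib so the sketch is runnable:

```python
# Prefer the module that defines the name over a package that merely
# happens to expose it.
from collections.abc import Iterable  # defining module

assert isinstance([1, 2, 3], Iterable)
```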
astronomer_providers/astronomer/providers/http/triggers/http.py: note: In member "__init__" of class "HttpTrigger":
astronomer_providers/astronomer/providers/http/triggers/http.py:46: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/http/triggers/http.py: note: In member "run" of class "HttpTrigger":
astronomer_providers/astronomer/providers/http/triggers/http.py:71: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/http/hooks/http.py: note: In member "__init__" of class "HttpHookAsync":
astronomer_providers/astronomer/providers/http/hooks/http.py:38: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/http/hooks/http.py: note: In member "_retryable_error_async" of class "HttpHookAsync":
astronomer_providers/astronomer/providers/http/hooks/http.py:143: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def _retryable_error_async(self, exception) -> bool:
^
astronomer_providers/astronomer/providers/http/hooks/http.py:155: error: Returning Any from function declared to return "bool" [no-any-return]
return exception.status >= 500
^
astronomer_providers/astronomer/providers/http/example_dags/example_http.py:6: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
with DAG("example_async_http_sensor", tags=["example", "async"], start_date=days_ago(2)) as dag:
^
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py: note: In member "__init__" of class "GoogleBaseHookAsync":
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py:10: error: Function is missing a type annotation [no-untyped-def]
def __init__(self, **kwargs):
^
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py: note: In member "get_sync_hook" of class "GoogleBaseHookAsync":
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py:14: error: Function is missing a return type annotation [no-untyped-def]
async def get_sync_hook(self):
^
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py: note: In member "service_file_as_context" of class "GoogleBaseHookAsync":
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py:23: error: Function is missing a return type annotation [no-untyped-def]
async def service_file_as_context(self):
^
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py:24: error: Call to untyped function "get_sync_hook" in typed context [no-untyped-call]
sync_hook = await self.get_sync_hook()
^
astronomer_providers/astronomer/providers/google/cloud/sensors/gcs.py: note: In member "__init__" of class "GCSObjectExistenceSensorAsync":
astronomer_providers/astronomer/providers/google/cloud/sensors/gcs.py:55: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(
^
astronomer_providers/astronomer/providers/google/cloud/sensors/gcs.py: note: In member "execute" of class "GCSObjectExistenceSensorAsync":
astronomer_providers/astronomer/providers/google/cloud/sensors/gcs.py:75: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/google/cloud/sensors/gcs.py: note: In member "execute_complete" of class "GCSObjectExistenceSensorAsync":
astronomer_providers/astronomer/providers/google/cloud/sensors/gcs.py:88: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
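The `execute`/`execute_complete` errors across the sensors and operators are the same shape; a hedged sketch of annotated signatures (the body is a hypothetical stand-in for the real sensor's behaviour):

```python
from typing import Any, Dict, Optional

class GCSObjectExistenceSensorAsyncSketch:
    # Annotating context, event, and the return type clears [no-untyped-def].
    def execute_complete(
        self, context: Dict[str, Any], event: Optional[Dict[str, Any]] = None
    ) -> Optional[str]:
        if event and event.get("status") == "error":
            raise ValueError(event.get("message"))
        return event.get("message") if event else None
```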
astronomer_providers/astronomer/providers/databricks/triggers/databricks.py:4: error: Module "airflow" does not explicitly export attribute "AirflowException"; implicit reexport disabled [attr-defined]
from airflow import AirflowException
^
astronomer_providers/astronomer/providers/databricks/triggers/databricks.py: note: In member "__init__" of class "DatabricksTrigger":
astronomer_providers/astronomer/providers/databricks/triggers/databricks.py:20: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/databricks/triggers/databricks.py: note: In member "run" of class "DatabricksTrigger":
astronomer_providers/astronomer/providers/databricks/triggers/databricks.py:44: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/databricks/example_dags/example_databricks.py:24: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
start_date=days_ago(0),
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py:5: error: Module "airflow.models" does not explicitly export attribute "DagRun"; implicit reexport disabled [attr-defined]
from airflow.models import DagRun, TaskInstance
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py:5: error: Module "airflow.models" does not explicitly export attribute "TaskInstance"; implicit reexport disabled [attr-defined]
from airflow.models import DagRun, TaskInstance
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py: note: In member "__init__" of class "TaskStateTrigger":
astronomer_providers/astronomer/providers/core/triggers/external_task.py:21: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py: note: In member "run" of class "TaskStateTrigger":
astronomer_providers/astronomer/providers/core/triggers/external_task.py:43: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py: note: In member "count_tasks" of class "TaskStateTrigger":
astronomer_providers/astronomer/providers/core/triggers/external_task.py:57: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def count_tasks(self, session) -> int:
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py:71: error: Returning Any from function declared to return "int" [no-any-return]
return count
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py: note: In member "__init__" of class "DagStateTrigger":
astronomer_providers/astronomer/providers/core/triggers/external_task.py:82: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py: note: In member "run" of class "DagStateTrigger":
astronomer_providers/astronomer/providers/core/triggers/external_task.py:102: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py: note: In member "count_dags" of class "DagStateTrigger":
astronomer_providers/astronomer/providers/core/triggers/external_task.py:117: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def count_dags(self, session) -> int:
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py:130: error: Returning Any from function declared to return "int" [no-any-return]
return count
^
astronomer_providers/astronomer/providers/core/sensors/external_task.py: note: In member "execute" of class "ExternalTaskSensorAsync":
astronomer_providers/astronomer/providers/core/sensors/external_task.py:12: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/core/sensors/external_task.py:17: error: Call to untyped function "get_execution_dates" in typed context [no-untyped-call]
execution_dates = self.get_execution_dates(context)
^
astronomer_providers/astronomer/providers/core/sensors/external_task.py: note: In member "execute_complete" of class "ExternalTaskSensorAsync":
astronomer_providers/astronomer/providers/core/sensors/external_task.py:47: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, session, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/core/sensors/external_task.py:52: error: Call to untyped function "get_execution_dates" in typed context [no-untyped-call]
execution_dates = self.get_execution_dates(context)
^
astronomer_providers/astronomer/providers/core/sensors/external_task.py: note: In member "get_execution_dates" of class "ExternalTaskSensorAsync":
astronomer_providers/astronomer/providers/core/sensors/external_task.py:63: error: Function is missing a type annotation [no-untyped-def]
def get_execution_dates(self, context):
^
astronomer_providers/astronomer/providers/core/example_dags/example_file_sensor.py:6: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
with airflow.DAG("example_async_file_sensor", start_date=days_ago(1), tags=["async"]) as dag:
^
astronomer_providers/astronomer/providers/core/example_dags/example_external_task.py:18: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
default_args = {"start_date": days_ago(0)}
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py: note: In member "__init__" of class "RedshiftClusterTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py:19: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py: note: In member "run" of class "RedshiftClusterTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py:41: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py: note: In member "__init__" of class "RedshiftClusterSensorTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py:80: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py: note: In member "run" of class "RedshiftClusterSensorTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py:102: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/amazon/aws/example_dags/example_s3.py:16: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
start_date=days_ago(3),
^
astronomer_providers/astronomer/providers/amazon/aws/example_dags/example_redshift_cluster_management.py:19: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
start_date=days_ago(1),
^
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py: note: In member "get_run_state_async" of class "DatabricksHookAsync":
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py:24: error: Call to untyped function "_do_api_call_async" in typed context [no-untyped-call]
response = await self._do_api_call_async(GET_RUN_ENDPOINT, json)
^
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py: note: In member "_do_api_call_async" of class "DatabricksHookAsync":
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py:34: error: Function is missing a type annotation [no-untyped-def]
async def _do_api_call_async(self, endpoint_info, json):
^
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py:100: error: Argument 2 to "_log_request_error" of "DatabricksHook" has incompatible type "ClientResponseError"; expected "str" [arg-type]
self._log_request_error(attempt_num, e)
^
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py: note: In member "_retryable_error_async" of class "DatabricksHookAsync":
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py:112: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def _retryable_error_async(self, exception) -> bool:
^
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py:126: error: Returning Any from function declared to return "bool" [no-any-return]
return exception.status >= 500
^
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py: note: In member "__init__" of class "WaitContainerTrigger":
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py:49: error: Missing type parameters for generic type "dict" [type-arg]
hook_params: Optional[dict] = None,
^
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py: note: In member "wait_for_pod_start" of class "WaitContainerTrigger":
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py:81: error: Function is missing a return type annotation [no-untyped-def]
async def wait_for_pod_start(self, v1_api: CoreV1Api):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py: note: In member "wait_for_container_completion" of class "WaitContainerTrigger":
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py:95: error: Function is missing a return type annotation [no-untyped-def]
async def wait_for_container_completion(self, v1_api: CoreV1Api):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py:99: error: Argument "container_name" to "container_is_running" has incompatible type "Optional[str]"; expected "str" [arg-type]
if not container_is_running(pod=pod, container_name=self.container_name):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py: note: In member "run" of class "WaitContainerTrigger":
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py:103: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py: note: In member "_format_exception_description" of class "WaitContainerTrigger":
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py:123: error: Function is missing a return type annotation [no-untyped-def]
def _format_exception_description(self, exc: Exception):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:2: error: Module "airflow" does not explicitly export attribute "AirflowException"; implicit reexport disabled [attr-defined]
from airflow import AirflowException
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py: note: In member "_load_config" of class "KubernetesHookAsync":
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:8: error: Function is missing a return type annotation [no-untyped-def]
async def _load_config(self):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:20: error: Call to untyped function "_coalesce_param" of "KubernetesHook" in typed context [no-untyped-call]
in_cluster = self._coalesce_param(
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:23: error: Call to untyped function "_coalesce_param" of "KubernetesHook" in typed context [no-untyped-call]
cluster_context = self._coalesce_param(
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:26: error: Call to untyped function "_coalesce_param" of "KubernetesHook" in typed context [no-untyped-call]
kubeconfig_path = self._coalesce_param(
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:53: error: Module has no attribute "tempfile" [attr-defined]
async with aiofiles.tempfile.NamedTemporaryFile() as temp_config:
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py: note: In member "get_api_client_async" of class "KubernetesHookAsync":
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:70: error: Function is missing a return type annotation [no-untyped-def]
async def get_api_client_async(self):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:71: error: Call to untyped function "_load_config" in typed context [no-untyped-call]
await self._load_config()
^
astronomer_providers/astronomer/providers/amazon/aws/hooks/base_aws_async.py: note: In member "get_client_async" of class "AwsBaseHookAsync":
astronomer_providers/astronomer/providers/amazon/aws/hooks/base_aws_async.py:15: error: Function is missing a return type annotation [no-untyped-def]
async def get_client_async(self):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py: note: In member "run" of class "SnowflakeHookAsync":
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: error: Function is missing a return type annotation [no-untyped-def]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: error: Signature of "run" incompatible with supertype "SnowflakeHook" [override]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: note: Superclass:
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Union[str, List[Any]], autocommit: bool = ..., parameters: Union[Dict[Any, Any], Iterable[Any], None] = ..., handler: Optional[Callable[..., Any]] = ...) -> Any
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: note: Subclass:
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Union[str, List[Any]], autocommit: bool = ..., parameters: Optional[Dict[Any, Any]] = ...) -> Any
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: error: Signature of "run" incompatible with supertype "DbApiHook" [override]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Any, autocommit: Any = ..., parameters: Any = ..., handler: Any = ...) -> Any
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Union[str, List[Any]], autocommit: bool = ..., parameters: Optional[Dict[Any, Any]] = ...) -> Any
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: error: Missing type parameters for generic type "list" [type-arg]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: error: Missing type parameters for generic type "dict" [type-arg]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
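The two `[override]` errors above are cleared by matching the supertype's parameter list (including `handler`) along with parameterized generics. A sketch with stand-in classes, not the real hooks:

```python
from typing import Any, Callable, Dict, Iterable, List, Optional, Union

class DbApiHookSketch:  # stand-in for the real base hook
    def run(
        self,
        sql: Union[str, List[Any]],
        autocommit: bool = False,
        parameters: Union[Dict[Any, Any], Iterable[Any], None] = None,
        handler: Optional[Callable[..., Any]] = None,
    ) -> Any:
        return None

class SnowflakeHookAsyncSketch(DbApiHookSketch):
    # Same parameter list as the supertype clears [override];
    # List[Any]/Dict[Any, Any] clears [type-arg].
    def run(
        self,
        sql: Union[str, List[Any]],
        autocommit: bool = False,
        parameters: Union[Dict[Any, Any], Iterable[Any], None] = None,
        handler: Optional[Callable[..., Any]] = None,
    ) -> List[str]:
        return ["query-id-1"]  # hypothetical query ids
```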
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:56: error: Call to untyped function "get_autocommit" in typed context [no-untyped-call]
if not self.get_autocommit(conn):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py: note: In member "check_query_output" of class "SnowflakeHookAsync":
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:60: error: Function is missing a return type annotation [no-untyped-def]
def check_query_output(self, query_ids: List[str]):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py: note: In member "get_query_status" of class "SnowflakeHookAsync":
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:72: error: Function is missing a return type annotation [no-untyped-def]
async def get_query_status(self, query_ids: List[str]):
^
astronomer_providers/astronomer/providers/http/operators/http.py: note: In member "execute" of class "HttpSensorAsync":
astronomer_providers/astronomer/providers/http/operators/http.py:7: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/http/operators/http.py: note: In member "execute_complete" of class "HttpSensorAsync":
astronomer_providers/astronomer/providers/http/operators/http.py:30: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/databricks/operators/databricks.py: note: In member "execute" of class "DatabricksSubmitRunOperatorAsync":
astronomer_providers/astronomer/providers/databricks/operators/databricks.py:12: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/databricks/operators/databricks.py: note: In member "execute_complete" of class "DatabricksSubmitRunOperatorAsync":
astronomer_providers/astronomer/providers/databricks/operators/databricks.py:47: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/databricks/operators/databricks.py: note: In member "execute" of class "DatabricksRunNowOperatorAsync":
astronomer_providers/astronomer/providers/databricks/operators/databricks.py:58: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/databricks/operators/databricks.py: note: In member "execute_complete" of class "DatabricksRunNowOperatorAsync":
astronomer_providers/astronomer/providers/databricks/operators/databricks.py:92: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/core/sensors/filesystem.py: note: In member "execute" of class "FileSensorAsync":
astronomer_providers/astronomer/providers/core/sensors/filesystem.py:29: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/core/sensors/filesystem.py:30: error: Call to untyped function "poke" in typed context [no-untyped-call]
if not self.poke(context=context):
^
astronomer_providers/astronomer/providers/core/sensors/filesystem.py:31: error: Call to untyped function "FSHook" in typed context [no-untyped-call]
hook = FSHook(self.fs_conn_id)
^
astronomer_providers/astronomer/providers/core/sensors/filesystem.py: note: In member "execute_complete" of class "FileSensorAsync":
astronomer_providers/astronomer/providers/core/sensors/filesystem.py:46: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/core/example_dags/example_external_task_wait_for_me.py:9: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
default_args = {"start_date": days_ago(0)}
^
astronomer_providers/astronomer/providers/core/example_dags/example_external_task_wait_for_me.py:15: error: Call to untyped function "TimeSensorAsync" in typed context [no-untyped-call]
wait_for_me = TimeSensorAsync(
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:1: error: Module "airflow" does not explicitly export attribute "AirflowException"; implicit reexport disabled
[attr-defined]
from airflow import AirflowException
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py: note: In class "PodNotFoundException":
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:12: error: Class cannot subclass "AirflowException" (has type "Any") [misc]
class PodNotFoundException(AirflowException):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py: note: In member "__init__" of class "KubernetesPodOperatorAsync":
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:23: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(self, *, poll_interval: int = 5, **kwargs):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py: note: In member "raise_for_trigger_status" of class "KubernetesPodOperatorAsync":
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:28: error: Function is missing a type annotation [no-untyped-def]
def raise_for_trigger_status(event):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py: note: In member "execute" of class "KubernetesPodOperatorAsync":
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:37: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:38: error: Call to untyped function "build_pod_request_obj" in typed context [no-untyped-call]
self.pod_request_obj = self.build_pod_request_obj(context)
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py: note: In member "execute_complete" of class "KubernetesPodOperatorAsync":
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:57: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:60: error: Call to untyped function "build_pod_request_obj" in typed context [no-untyped-call]
self.pod_request_obj = self.build_pod_request_obj(context)
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:67: error: Call to untyped function "raise_for_trigger_status" of "KubernetesPodOperatorAsync" in typed context
[no-untyped-call]
self.raise_for_trigger_status(event)
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:78: error: Call to untyped function "extract_xcom" in typed context [no-untyped-call]
result = self.extract_xcom(pod=self.pod)
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:6: error: Module "airflow.models" does not explicitly export attribute "BaseOperator"; implicit reexport disabled [attr-defined]
from airflow.models import BaseOperator
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py: note: In class "S3KeySensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:14: error: Class cannot subclass "BaseOperator" (has type "Any") [misc]
class S3KeySensorAsync(BaseOperator):
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py: note: In member "__init__" of class "S3KeySensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:45: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py: note: In member "_resolve_bucket_and_key" of class "S3KeySensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:64: error: Function is missing a return type annotation [no-untyped-def]
def _resolve_bucket_and_key(self):
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:64: note: Use "-> None" if function does not return a value
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py: note: In member "execute" of class "S3KeySensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:73: error: Missing type parameters for generic type "Dict" [type-arg]
def execute(self, context: Dict) -> Any:
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:74: error: Call to untyped function "_resolve_bucket_and_key" in typed context [no-untyped-call]
self._resolve_bucket_and_key()
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py: note: At top level:
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:78: error: Unused "type: ignore" comment
bucket_name=self.bucket_name, # type: ignore
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py: note: In member "execute_complete" of class "S3KeySensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:87: error: Function is missing a return type annotation [no-untyped-def]
def execute_complete(self, context: Dict, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:87: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def execute_complete(self, context: Dict, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:87: error: Missing type parameters for generic type "Dict" [type-arg]
def execute_complete(self, context: Dict, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/snowflake/operators/snowflake.py: note: In member "__init__" of class "SnowflakeOperatorAsync":
astronomer_providers/astronomer/providers/snowflake/operators/snowflake.py:12: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(self, *, poll_interval: int = 5, **kwargs):
^
astronomer_providers/astronomer/providers/snowflake/operators/snowflake.py: note: In member "execute" of class "SnowflakeOperatorAsync":
astronomer_providers/astronomer/providers/snowflake/operators/snowflake.py:20: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/snowflake/operators/snowflake.py: note: In member "execute_complete" of class "SnowflakeOperatorAsync":
astronomer_providers/astronomer/providers/snowflake/operators/snowflake.py:44: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py: note: In member "__init__" of class "RedshiftClusterSensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:25: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py: note: In member "execute" of class "RedshiftClusterSensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:34: error: Function is missing a return type annotation [no-untyped-def]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:34: error: Argument 1 of "execute" is incompatible with supertype "BaseSensorOperator"; supertype defines the argument
type as "Dict[Any, Any]" [override]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:34: note: This violates the Liskov substitution principle
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:34: note: See https://mypy.readthedocs.io/en/stable/common_issues.html#incompatible-overrides
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py: note: In member "execute_complete" of class "RedshiftClusterSensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:47: error: Function is missing a return type annotation [no-untyped-def]
def execute_complete(self, context: "Context", event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:47: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def execute_complete(self, context: "Context", event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py: note: In member "__init__" of class "RedshiftResumeClusterOperatorAsync":
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py:26: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(
^
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py: note: In member "execute" of class "RedshiftResumeClusterOperatorAsync":
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py:35: error: Function is missing a return type annotation [no-untyped-def]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py: note: In member "execute_complete" of class "RedshiftResumeClusterOperatorAsync":
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py:55: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None):
^
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py: note: In member "__init__" of class "RedshiftPauseClusterOperatorAsync":
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py:82: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(
^
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py: note: In member "execute" of class "RedshiftPauseClusterOperatorAsync":
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py:91: error: Function is missing a return type annotation [no-untyped-def]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py: note: In member "execute_complete" of class "RedshiftPauseClusterOperatorAsync":
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py:111: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None):
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:24: error: Module "airflow.models" does not explicitly export attribute "BaseOperator"; implicit reexport disabled [attr-defined]
from airflow.models import BaseOperator
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py: note: In class "BigQueryInsertJobOperatorAsync":
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:48: error: Class cannot subclass "BaseOperator" (has type "Any") [misc]
class BigQueryInsertJobOperatorAsync(BigQueryInsertJobOperator, BaseOperator):
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py: note: In member "execute" of class "BigQueryInsertJobOperatorAsync":
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:113: error: Function is missing a return type annotation [no-untyped-def]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:122: error: Call to untyped function "_job_id" in typed context [no-untyped-call]
job_id = self._job_id(context)
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py: note: In member "execute_complete" of class "BigQueryInsertJobOperatorAsync":
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:158: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None):
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py: note: In member "execute" of class "BigQueryCheckOperatorAsync":
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:186: error: Function is missing a return type annotation [no-untyped-def]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:186: error: Signature of "execute" incompatible with supertype "SQLCheckOperator" [override]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:186: note: Superclass:
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:186: note: def execute(self, context: Any = ...) -> Any
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:186: note: Subclass:
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:186: note: def execute(self, context: Context) -> Any
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py: note: In member "execute_complete" of class "BigQueryCheckOperatorAsync":
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:202: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None):
^
astronomer_providers/astronomer/providers/google/cloud/example_dags/example_gcs.py:44: error: Module has no attribute "DAG" [attr-defined]
with models.DAG(
^
astronomer_providers/astronomer/providers/google/cloud/example_dags/example_bigquery_queries.py:62: error: Module has no attribute "DAG" [attr-defined]
with models.DAG(
^
astronomer_providers/astronomer/providers/google/cloud/hooks/gcs.py: note: In class "GCSHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/gcs.py:29: error: Class cannot subclass "GoogleBaseHookAsync" (has type "Any") [misc]
class GCSHookAsync(GoogleBaseHookAsync):
^
astronomer_providers/astronomer/providers/google/cloud/hooks/gcs.py: note: In member "get_storage_client" of class "GCSHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/gcs.py:32: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
async def get_storage_client(self, session) -> Storage:
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "insert_job" of class "_BigQueryHook":
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:40: error: Missing type parameters for generic type "Dict" [type-arg]
configuration: Dict,
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py: note: In class "BigQueryHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:98: error: Class cannot subclass "GoogleBaseHookAsync" (has type "Any") [misc]
class BigQueryHookAsync(GoogleBaseHookAsync):
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "get_job_instance" of class "BigQueryHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:101: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
async def get_job_instance(self, project_id, job_id, session) -> Job:
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "get_job_status" of class "BigQueryHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:106: error: Function is missing a return type annotation [no-untyped-def]
async def get_job_status(
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:118: error: Argument 1 to "result" of "Job" has incompatible type "ClientSession"; expected "Optional[Session]" [arg-type]
job_status_response = await job_client.result(s)
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "get_job_output" of class "BigQueryHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:132: error: Missing type parameters for generic type "Dict" [type-arg]
) -> Dict:
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:140: error: Argument 1 to "get_query_results" of "Job" has incompatible type "ClientSession"; expected "Optional[Session]" [arg-type]
job_query_response = await job_client.get_query_results(session)
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "get_records" of class "BigQueryHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:143: error: Missing type parameters for generic type "Dict" [type-arg]
def get_records(self, query_results: Dict, nocast: bool = True) -> list:
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:143: error: Missing type parameters for generic type "list" [type-arg]
def get_records(self, query_results: Dict, nocast: bool = True) -> list:
^
Found 163 errors in 42 files (checked 83 source files)
make: *** [run-mypy] Error 1
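The bulk of the 163 errors above are `[no-untyped-def]` on operator/sensor `execute` and `execute_complete` methods. A minimal sketch of the fix (class and attribute names here are hypothetical, purely to illustrate the annotations; the real operators would keep their existing bodies — note some files above already type `context` as `"Context"` from `airflow.utils.context` instead of `Dict[str, Any]`):

```python
from typing import Any, Dict, Optional


class DemoOperatorAsync:
    """Hypothetical operator showing annotations that clear [no-untyped-def]."""

    def execute(self, context: Dict[str, Any]) -> None:
        # In the real operators this is where the task defers to its trigger.
        self.last_context = context

    def execute_complete(
        self, context: Dict[str, Any], event: Optional[Dict[str, Any]] = None
    ) -> None:
        # Triggers report back via an event dict; fail loudly on error status.
        if event and event.get("status") == "error":
            raise RuntimeError(str(event.get("message")))
```

Typing `event` as `Optional[Dict[str, Any]] = None` also clears the "missing a type annotation for one or more arguments" variant flagged on `execute_complete(self, context, event=None)` throughout the log.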
(venv) phanikv@Phanis-MBP astronomer-providers % make run-mypy > mypy-errors.txt
Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
WARNING: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
make: *** [run-mypy] Error 1
(venv) phanikv@Phanis-MBP astronomer-providers % ls -lrt
total 368
-rw-r--r-- 1 phanikv staff 25 Feb 24 13:03 CHANGELOG.rst
-rw-r--r-- 1 phanikv staff 5222 Feb 24 13:03 CODE_OF_CONDUCT.md
-rw-r--r-- 1 phanikv staff 8114 Feb 24 13:03 CONTRIBUTING.rst
-rw-r--r-- 1 phanikv staff 11358 Feb 24 13:03 LICENSE
-rw-r--r-- 1 phanikv staff 2285 Feb 24 13:03 README.rst
-rw-r--r-- 1 phanikv staff 587 Feb 24 13:03 SECURITY.md
drwxr-xr-x 2 phanikv staff 64 Feb 24 13:03 __pycache__
drwxr-xr-x 4 phanikv staff 128 Feb 24 13:03 astronomer
drwxr-xr-x 8 phanikv staff 256 Feb 24 13:03 astronomer_operators.egg-info
drwxr-xr-x 9 phanikv staff 288 Feb 24 13:03 dev
drwxr-xr-x@ 73 phanikv staff 2336 Feb 24 13:03 htmlcov
-rw-r--r-- 1 phanikv staff 181 Feb 24 13:03 pyproject.toml
drwxr-xr-x 14 phanikv staff 448 Feb 24 13:03 tests
drwxr-xr-x 6 phanikv staff 192 Feb 24 13:03 venv
-rw-r--r-- 1 phanikv staff 102 Feb 24 13:03 yamllint-config.yml
-rw-r--r-- 1 phanikv staff 1548 Feb 24 13:07 Makefile
-rw-r--r-- 1 phanikv staff 3066 Feb 24 13:07 setup.cfg
-rw-r--r-- 1 phanikv staff 72110 Feb 24 13:44 mypy-errors.txt
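Separately from the annotation errors, the `[attr-defined]` errors ("implicit reexport disabled") come from importing names through `airflow`'s top-level package. Under mypy's `no_implicit_reexport`, importing from the defining modules clears them. A sketch (assuming the Airflow 2.x module layout; guarded so it also runs where Airflow is not installed):

```python
# mypy's no_implicit_reexport rejects `from airflow import AirflowException`
# and `from airflow.models import BaseOperator`; import from the modules
# that actually define those names instead.
try:
    from airflow.exceptions import AirflowException
    from airflow.models.baseoperator import BaseOperator
except ImportError:
    # Stand-ins so this sketch stays self-contained without Airflow.
    AirflowException = Exception
    BaseOperator = object
```

The related `Class cannot subclass "X" (has type "Any")` errors are a knock-on effect: once the base class resolves to `Any` (untyped hook/operator bases), every subclass is flagged, so typing the base classes removes whole groups of errors at once.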
(venv) phanikv@Phanis-MBP astronomer-providers % make run-mypy > mypy-errors.txt
^Cmake: *** [run-mypy] Error 255
(venv) phanikv@Phanis-MBP astronomer-providers % clear
(venv) phanikv@Phanis-MBP astronomer-providers % make run-mypy
docker build -f dev/Dockerfile . -t astronomer-operators-dev
Sending build context to Docker daemon 617.7MB
Step 1/8 : ARG IMAGE_NAME="quay.io/astronomer/astro-runtime:4.1.0-base"
Step 2/8 : FROM ${IMAGE_NAME}
---> b49b240789d9
Step 3/8 : USER root
---> Using cache
---> 3a8e7e2245ca
Step 4/8 : COPY astronomer/ ${AIRFLOW_HOME}/astronomer_providers/astronomer/
---> Using cache
---> 3ac3c0399644
Step 5/8 : COPY setup.cfg ${AIRFLOW_HOME}/astronomer_providers/setup.cfg
---> Using cache
---> 817aaafdd417
Step 6/8 : COPY pyproject.toml ${AIRFLOW_HOME}/astronomer_providers/pyproject.toml
---> Using cache
---> 178e19d6ac00
Step 7/8 : RUN pip install -e ${AIRFLOW_HOME}/astronomer_providers[tests]
---> Using cache
---> 0479829a1d57
Step 8/8 : USER astro
---> Using cache
---> bb6316f5c9f8
Successfully built bb6316f5c9f8
Successfully tagged astronomer-operators-dev:latest
Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
docker run -t --rm astronomer-operators-dev\
/bin/bash -c "mypy --install-types --config-file astronomer_providers/setup.cfg ./astronomer_providers" \
-v ../../astronomer-operators:/usr/local/airflow/astronomer_providers
WARNING: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
astronomer_providers/astronomer/providers/http/__init__.py: note: In function "__getattr__":
astronomer_providers/astronomer/providers/http/__init__.py:12: error: Function is missing a type annotation [no-untyped-def]
def __getattr__(name):
^
astronomer_providers/astronomer/providers/amazon/aws/hooks/s3.py: note: In class "S3HookAsync":
astronomer_providers/astronomer/providers/amazon/aws/hooks/s3.py:8: error: Class cannot subclass "AwsBaseHookAsync" (has type "Any") [misc]
class S3HookAsync(AwsBaseHookAsync):
^
astronomer_providers/astronomer/providers/amazon/aws/hooks/s3.py: note: In member "__init__" of class "S3HookAsync":
astronomer_providers/astronomer/providers/amazon/aws/hooks/s3.py:16: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(self, *args, **kwargs) -> None:
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py: note: In member "__init__" of class "S3KeyTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:15: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:23: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py: note: In member "_check_exact_key" of class "S3KeyTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:31: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
async def _check_exact_key(client, bucket, key) -> bool:
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py: note: In member "_check_wildcard_key" of class "S3KeyTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:49: error: Function is missing a type annotation [no-untyped-def]
async def _check_wildcard_key(client, bucket, wildcard_key):
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py: note: In member "_check_key" of class "S3KeyTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:68: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
async def _check_key(self, client, bucket, key, wildcard) -> bool:
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:78: error: Returning Any from function declared to return "bool" [no-any-return]
return await self._check_wildcard_key(client, bucket, key)
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:78: error: Call to untyped function "_check_wildcard_key" of "S3KeyTrigger" in typed context [no-untyped-call]
return await self._check_wildcard_key(client, bucket, key)
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py: note: In member "run" of class "S3KeyTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/s3.py:97: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/google/cloud/triggers/bigquery.py: note: In member "__init__" of class "BigQueryInsertJobTrigger":
astronomer_providers/astronomer/providers/google/cloud/triggers/bigquery.py:21: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/google/cloud/triggers/bigquery.py: note: In member "run" of class "BigQueryInsertJobTrigger":
astronomer_providers/astronomer/providers/google/cloud/triggers/bigquery.py:46: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/google/cloud/triggers/bigquery.py: note: In member "run" of class "BigQueryCheckTrigger":
astronomer_providers/astronomer/providers/google/cloud/triggers/bigquery.py:99: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/core/triggers/filesystem.py: note: In member "__init__" of class "FileTrigger":
astronomer_providers/astronomer/providers/core/triggers/filesystem.py:31: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/core/triggers/filesystem.py: note: In member "run" of class "FileTrigger":
astronomer_providers/astronomer/providers/core/triggers/filesystem.py:49: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/core/triggers/filesystem.py:57: error: Incompatible types in assignment (expression has type "str", variable has type "float") [assignment]
mod_time = datetime.datetime.fromtimestamp(mod_time).strftime("%Y%m%d%H%M%S")
^
astronomer_providers/astronomer/providers/amazon/aws/hooks/redshift_cluster.py: note: In class "RedshiftHookAsync":
astronomer_providers/astronomer/providers/amazon/aws/hooks/redshift_cluster.py:12: error: Class cannot subclass "AwsBaseHookAsync" (has type "Any") [misc]
class RedshiftHookAsync(AwsBaseHookAsync):
^
astronomer_providers/astronomer/providers/amazon/aws/hooks/redshift_cluster.py: note: In member "__init__" of class "RedshiftHookAsync":
astronomer_providers/astronomer/providers/amazon/aws/hooks/redshift_cluster.py:17: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(self, *args, **kwargs) -> None:
^
astronomer_providers/astronomer/providers/amazon/aws/hooks/redshift_cluster.py: note: In member "get_cluster_status" of class "RedshiftHookAsync":
astronomer_providers/astronomer/providers/amazon/aws/hooks/redshift_cluster.py:87: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
async def get_cluster_status(self, cluster_identifier, expected_state, flag) -> Dict[str, Any]:
^
astronomer_providers/astronomer/providers/google/cloud/triggers/gcs.py: note: In member "__init__" of class "GCSBlobTrigger":
astronomer_providers/astronomer/providers/google/cloud/triggers/gcs.py:29: error: Missing type parameters for generic type "dict" [type-arg]
hook_params: dict,
^
astronomer_providers/astronomer/providers/google/cloud/triggers/gcs.py:31: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/google/cloud/triggers/gcs.py: note: In member "run" of class "GCSBlobTrigger":
astronomer_providers/astronomer/providers/google/cloud/triggers/gcs.py:53: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/snowflake/triggers/snowflake_trigger.py: note: In function "get_db_hook":
astronomer_providers/astronomer/providers/snowflake/triggers/snowflake_trigger.py:10: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def get_db_hook(self) -> SnowflakeHookAsync:
^
astronomer_providers/astronomer/providers/snowflake/triggers/snowflake_trigger.py: note: In member "__init__" of class "SnowflakeTrigger":
astronomer_providers/astronomer/providers/snowflake/triggers/snowflake_trigger.py:29: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/snowflake/triggers/snowflake_trigger.py: note: In member "run" of class "SnowflakeTrigger":
astronomer_providers/astronomer/providers/snowflake/triggers/snowflake_trigger.py:49: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/snowflake/example_dags/example_snowflake.py:45: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
start_date=days_ago(0),
^
astronomer_providers/astronomer/providers/http/triggers/http.py:4: error: Module "airflow" does not explicitly export attribute "AirflowException"; implicit reexport disabled [attr-defined]
from airflow import AirflowException
^
astronomer_providers/astronomer/providers/http/triggers/http.py: note: In member "__init__" of class "HttpTrigger":
astronomer_providers/astronomer/providers/http/triggers/http.py:46: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/http/triggers/http.py: note: In member "run" of class "HttpTrigger":
astronomer_providers/astronomer/providers/http/triggers/http.py:71: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/http/hooks/http.py: note: In member "__init__" of class "HttpHookAsync":
astronomer_providers/astronomer/providers/http/hooks/http.py:38: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/http/hooks/http.py: note: In member "_retryable_error_async" of class "HttpHookAsync":
astronomer_providers/astronomer/providers/http/hooks/http.py:143: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def _retryable_error_async(self, exception) -> bool:
^
astronomer_providers/astronomer/providers/http/hooks/http.py:155: error: Returning Any from function declared to return "bool" [no-any-return]
return exception.status >= 500
^
astronomer_providers/astronomer/providers/http/example_dags/example_http.py:6: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
with DAG("example_async_http_sensor", tags=["example", "async"], start_date=days_ago(2)) as dag:
^
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py: note: In member "__init__" of class "GoogleBaseHookAsync":
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py:10: error: Function is missing a type annotation [no-untyped-def]
def __init__(self, **kwargs):
^
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py: note: In member "get_sync_hook" of class "GoogleBaseHookAsync":
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py:14: error: Function is missing a return type annotation [no-untyped-def]
async def get_sync_hook(self):
^
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py: note: In member "service_file_as_context" of class "GoogleBaseHookAsync":
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py:23: error: Function is missing a return type annotation [no-untyped-def]
async def service_file_as_context(self):
^
astronomer_providers/astronomer/providers/google/common/hooks/base_google.py:24: error: Call to untyped function "get_sync_hook" in typed context [no-untyped-call]
sync_hook = await self.get_sync_hook()
^
astronomer_providers/astronomer/providers/google/cloud/sensors/gcs.py: note: In member "__init__" of class "GCSObjectExistenceSensorAsync":
astronomer_providers/astronomer/providers/google/cloud/sensors/gcs.py:55: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(
^
astronomer_providers/astronomer/providers/google/cloud/sensors/gcs.py: note: In member "execute" of class "GCSObjectExistenceSensorAsync":
astronomer_providers/astronomer/providers/google/cloud/sensors/gcs.py:75: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/google/cloud/sensors/gcs.py: note: In member "execute_complete" of class "GCSObjectExistenceSensorAsync":
astronomer_providers/astronomer/providers/google/cloud/sensors/gcs.py:88: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/databricks/triggers/databricks.py:4: error: Module "airflow" does not explicitly export attribute "AirflowException"; implicit reexport disabled [attr-defined]
from airflow import AirflowException
^
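This attr-defined error, and the later ones for `DagRun`, `TaskInstance`, and `BaseOperator`, all have the same cause: with implicit re-export disabled, mypy only accepts names a module exports explicitly, so the import must come from the defining module. The module paths below are the standard Airflow 2.x locations:

```python
# Rejected under --no-implicit-reexport:
#   from airflow import AirflowException
# Accepted:
#   from airflow.exceptions import AirflowException

# The same fix applies to the other attr-defined errors in this run
DEFINING_MODULES = {
    "AirflowException": "airflow.exceptions",
    "DagRun": "airflow.models.dagrun",
    "TaskInstance": "airflow.models.taskinstance",
    "BaseOperator": "airflow.models.baseoperator",
}
```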
astronomer_providers/astronomer/providers/databricks/triggers/databricks.py: note: In member "__init__" of class "DatabricksTrigger":
astronomer_providers/astronomer/providers/databricks/triggers/databricks.py:20: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/databricks/triggers/databricks.py: note: In member "run" of class "DatabricksTrigger":
astronomer_providers/astronomer/providers/databricks/triggers/databricks.py:44: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
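The trigger `run` methods flagged here (and again below for the redshift and kubernetes triggers) are async generators, so the missing return type is `AsyncIterator[TriggerEvent]` — mypy accepts `AsyncIterator[...]` as the annotation for an async generator function. A self-contained sketch with a stand-in `TriggerEvent`:

```python
import asyncio
from typing import Any, AsyncIterator, Dict


class TriggerEvent:
    """Stand-in for airflow.triggers.base.TriggerEvent."""

    def __init__(self, payload: Dict[str, Any]) -> None:
        self.payload = payload


class TriggerSketch:
    async def run(self) -> AsyncIterator[TriggerEvent]:
        # Annotating the async generator clears "Function is missing a
        # return type annotation" on trigger run() methods
        yield TriggerEvent({"status": "success"})


async def first_event() -> Dict[str, Any]:
    async for event in TriggerSketch().run():
        return event.payload
    return {}
```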
astronomer_providers/astronomer/providers/databricks/example_dags/example_databricks.py:24: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
start_date=days_ago(0),
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py:5: error: Module "airflow.models" does not explicitly export attribute "DagRun"; implicit reexport disabled [attr-defined]
from airflow.models import DagRun, TaskInstance
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py:5: error: Module "airflow.models" does not explicitly export attribute "TaskInstance"; implicit reexport disabled [attr-defined]
from airflow.models import DagRun, TaskInstance
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py: note: In member "__init__" of class "TaskStateTrigger":
astronomer_providers/astronomer/providers/core/triggers/external_task.py:21: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py: note: In member "run" of class "TaskStateTrigger":
astronomer_providers/astronomer/providers/core/triggers/external_task.py:43: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py: note: In member "count_tasks" of class "TaskStateTrigger":
astronomer_providers/astronomer/providers/core/triggers/external_task.py:57: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def count_tasks(self, session) -> int:
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py:71: error: Returning Any from function declared to return "int" [no-any-return]
return count
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py: note: In member "__init__" of class "DagStateTrigger":
astronomer_providers/astronomer/providers/core/triggers/external_task.py:82: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py: note: In member "run" of class "DagStateTrigger":
astronomer_providers/astronomer/providers/core/triggers/external_task.py:102: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py: note: In member "count_dags" of class "DagStateTrigger":
astronomer_providers/astronomer/providers/core/triggers/external_task.py:117: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def count_dags(self, session) -> int:
^
astronomer_providers/astronomer/providers/core/triggers/external_task.py:130: error: Returning Any from function declared to return "int" [no-any-return]
return count
^
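Both `count_tasks` and `count_dags` return the result of `session.query(...).count()`, which is `Any` without SQLAlchemy stubs, hence the no-any-return errors. An explicit conversion makes the declared `int` return type hold; sketched with the query result faked as `Any`:

```python
from typing import Any


def count_rows(query_result: Any) -> int:
    # query(...).count() is Any when SQLAlchemy stubs are absent; an
    # explicit int(...) conversion satisfies --warn-return-any
    # (typing.cast(int, ...) works too, but performs no runtime check)
    return int(query_result)
```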
astronomer_providers/astronomer/providers/core/sensors/external_task.py: note: In member "execute" of class "ExternalTaskSensorAsync":
astronomer_providers/astronomer/providers/core/sensors/external_task.py:12: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/core/sensors/external_task.py:17: error: Call to untyped function "get_execution_dates" in typed context [no-untyped-call]
execution_dates = self.get_execution_dates(context)
^
astronomer_providers/astronomer/providers/core/sensors/external_task.py: note: In member "execute_complete" of class "ExternalTaskSensorAsync":
astronomer_providers/astronomer/providers/core/sensors/external_task.py:47: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, session, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/core/sensors/external_task.py:52: error: Call to untyped function "get_execution_dates" in typed context [no-untyped-call]
execution_dates = self.get_execution_dates(context)
^
astronomer_providers/astronomer/providers/core/sensors/external_task.py: note: In member "get_execution_dates" of class "ExternalTaskSensorAsync":
astronomer_providers/astronomer/providers/core/sensors/external_task.py:63: error: Function is missing a type annotation [no-untyped-def]
def get_execution_dates(self, context):
^
astronomer_providers/astronomer/providers/core/example_dags/example_file_sensor.py:6: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
with airflow.DAG("example_async_file_sensor", start_date=days_ago(1), tags=["async"]) as dag:
^
astronomer_providers/astronomer/providers/core/example_dags/example_external_task.py:18: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
default_args = {"start_date": days_ago(0)}
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py: note: In member "__init__" of class "RedshiftClusterTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py:19: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py: note: In member "run" of class "RedshiftClusterTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py:41: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py: note: In member "__init__" of class "RedshiftClusterSensorTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py:80: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py: note: In member "run" of class "RedshiftClusterSensorTrigger":
astronomer_providers/astronomer/providers/amazon/aws/triggers/redshift_cluster.py:102: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/amazon/aws/example_dags/example_s3.py:16: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
start_date=days_ago(3),
^
astronomer_providers/astronomer/providers/amazon/aws/example_dags/example_redshift_cluster_management.py:19: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
start_date=days_ago(1),
^
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py: note: In member "get_run_state_async" of class "DatabricksHookAsync":
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py:24: error: Call to untyped function "_do_api_call_async" in typed context [no-untyped-call]
response = await self._do_api_call_async(GET_RUN_ENDPOINT, json)
^
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py: note: In member "_do_api_call_async" of class "DatabricksHookAsync":
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py:34: error: Function is missing a type annotation [no-untyped-def]
async def _do_api_call_async(self, endpoint_info, json):
^
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py:100: error: Argument 2 to "_log_request_error" of "DatabricksHook" has incompatible type "ClientResponseError"; expected "str" [arg-type]
self._log_request_error(attempt_num, e)
^
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py: note: In member "_retryable_error_async" of class "DatabricksHookAsync":
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py:112: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def _retryable_error_async(self, exception) -> bool:
^
astronomer_providers/astronomer/providers/databricks/hooks/databricks.py:126: error: Returning Any from function declared to return "bool" [no-any-return]
return exception.status >= 500
^
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py: note: In member "__init__" of class "WaitContainerTrigger":
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py:49: error: Missing type parameters for generic type "dict" [type-arg]
hook_params: Optional[dict] = None,
^
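Bare `dict` (and `list`) under `--disallow-any-generics` need explicit type parameters; for hook keyword arguments, `Dict[str, Any]` is the usual choice. A sketch (the function name is illustrative, not the trigger's actual API):

```python
from typing import Any, Dict, Optional


def init_trigger(hook_params: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
    # Parameterising the generic (Dict[str, Any] rather than bare dict)
    # clears the type-arg error without changing runtime behaviour
    return dict(hook_params or {})
```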
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py: note: In member "wait_for_pod_start" of class "WaitContainerTrigger":
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py:81: error: Function is missing a return type annotation [no-untyped-def]
async def wait_for_pod_start(self, v1_api: CoreV1Api):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py: note: In member "wait_for_container_completion" of class "WaitContainerTrigger":
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py:95: error: Function is missing a return type annotation [no-untyped-def]
async def wait_for_container_completion(self, v1_api: CoreV1Api):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py:99: error: Argument "container_name" to "container_is_running" has incompatible type "Optional[str]"; expected "str" [arg-type]
if not container_is_running(pod=pod, container_name=self.container_name):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py: note: In member "run" of class "WaitContainerTrigger":
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py:103: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py: note: In member "_format_exception_description" of class "WaitContainerTrigger":
astronomer_providers/astronomer/providers/cncf/kubernetes/triggers/wait_container.py:123: error: Function is missing a return type annotation [no-untyped-def]
def _format_exception_description(self, exc: Exception):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:2: error: Module "airflow" does not explicitly export attribute "AirflowException"; implicit reexport disabled [attr-defined]
from airflow import AirflowException
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py: note: In member "_load_config" of class "KubernetesHookAsync":
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:8: error: Function is missing a return type annotation [no-untyped-def]
async def _load_config(self):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:20: error: Call to untyped function "_coalesce_param" of "KubernetesHook" in typed context [no-untyped-call]
in_cluster = self._coalesce_param(
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:23: error: Call to untyped function "_coalesce_param" of "KubernetesHook" in typed context [no-untyped-call]
cluster_context = self._coalesce_param(
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:26: error: Call to untyped function "_coalesce_param" of "KubernetesHook" in typed context [no-untyped-call]
kubeconfig_path = self._coalesce_param(
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:53: error: Module has no attribute "tempfile" [attr-defined]
async with aiofiles.tempfile.NamedTemporaryFile() as temp_config:
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py: note: In member "get_api_client_async" of class "KubernetesHookAsync":
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:70: error: Function is missing a return type annotation [no-untyped-def]
async def get_api_client_async(self):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:71: error: Call to untyped function "_load_config" in typed context [no-untyped-call]
await self._load_config()
^
astronomer_providers/astronomer/providers/amazon/aws/hooks/base_aws_async.py: note: In member "get_client_async" of class "AwsBaseHookAsync":
astronomer_providers/astronomer/providers/amazon/aws/hooks/base_aws_async.py:15: error: Function is missing a return type annotation [no-untyped-def]
async def get_client_async(self):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py: note: In member "run" of class "SnowflakeHookAsync":
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: error: Function is missing a return type annotation [no-untyped-def]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: error: Signature of "run" incompatible with supertype "SnowflakeHook" [override]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: note: Superclass:
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Union[str, List[Any]], autocommit: bool = ..., parameters: Union[Dict[Any, Any], Iterable[Any], None] = ..., handler: Optional[Callable[..., Any]] = ...) -> Any
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: note: Subclass:
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Union[str, List[Any]], autocommit: bool = ..., parameters: Optional[Dict[Any, Any]] = ...) -> Any
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: error: Signature of "run" incompatible with supertype "DbApiHook" [override]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Any, autocommit: Any = ..., parameters: Any = ..., handler: Any = ...) -> Any
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Union[str, List[Any]], autocommit: bool = ..., parameters: Optional[Dict[Any, Any]] = ...) -> Any
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: error: Missing type parameters for generic type "list" [type-arg]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:14: error: Missing type parameters for generic type "dict" [type-arg]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:56: error: Call to untyped function "get_autocommit" in typed context [no-untyped-call]
if not self.get_autocommit(conn):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py: note: In member "check_query_output" of class "SnowflakeHookAsync":
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:60: error: Function is missing a return type annotation [no-untyped-def]
def check_query_output(self, query_ids: List[str]):
^
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py: note: In member "get_query_status" of class "SnowflakeHookAsync":
astronomer_providers/astronomer/providers/snowflake/hooks/snowflake.py:72: error: Function is missing a return type annotation [no-untyped-def]
async def get_query_status(self, query_ids: List[str]):
^
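The override errors come from `SnowflakeHookAsync.run` narrowing `parameters` and dropping `handler` relative to both supertypes. Matching the wider supertype signature (and simply not using `handler` if it is irrelevant) restores Liskov compatibility. A condensed sketch with a stand-in base class:

```python
from typing import Any, Callable, Dict, Iterable, List, Optional, Union


class DbApiHookSketch:
    """Stand-in carrying the supertype's run() signature from the log above."""

    def run(
        self,
        sql: Union[str, List[Any]],
        autocommit: bool = False,
        parameters: Union[Dict[Any, Any], Iterable[Any], None] = None,
        handler: Optional[Callable[..., Any]] = None,
    ) -> Any:
        return sql


class SnowflakeHookSketch(DbApiHookSketch):
    # Widening `parameters` and accepting `handler` matches the supertype,
    # which resolves both "Signature of run incompatible" errors
    def run(
        self,
        sql: Union[str, List[Any]],
        autocommit: bool = False,
        parameters: Union[Dict[Any, Any], Iterable[Any], None] = None,
        handler: Optional[Callable[..., Any]] = None,
    ) -> List[str]:
        return ["qid-1"]  # hypothetical query-id list, as the async hook returns
```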
astronomer_providers/astronomer/providers/http/operators/http.py: note: In member "execute" of class "HttpSensorAsync":
astronomer_providers/astronomer/providers/http/operators/http.py:7: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/http/operators/http.py: note: In member "execute_complete" of class "HttpSensorAsync":
astronomer_providers/astronomer/providers/http/operators/http.py:30: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/databricks/operators/databricks.py: note: In member "execute" of class "DatabricksSubmitRunOperatorAsync":
astronomer_providers/astronomer/providers/databricks/operators/databricks.py:12: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/databricks/operators/databricks.py: note: In member "execute_complete" of class "DatabricksSubmitRunOperatorAsync":
astronomer_providers/astronomer/providers/databricks/operators/databricks.py:47: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/databricks/operators/databricks.py: note: In member "execute" of class "DatabricksRunNowOperatorAsync":
astronomer_providers/astronomer/providers/databricks/operators/databricks.py:58: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/databricks/operators/databricks.py: note: In member "execute_complete" of class "DatabricksRunNowOperatorAsync":
astronomer_providers/astronomer/providers/databricks/operators/databricks.py:92: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
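The many `execute`/`execute_complete` errors across operators and sensors share one shape: untyped `context` and `event` parameters. Annotating them as `Dict[str, Any]` and `Optional[Dict[str, Any]]` (the Airflow 2.0-era context behaves as a dict) fixes the whole family. Sketched outside any operator class, with illustrative error handling:

```python
from typing import Any, Dict, Optional


class OperatorSketch:
    def execute_complete(
        self, context: Dict[str, Any], event: Optional[Dict[str, Any]] = None
    ) -> Optional[str]:
        # Typing context and the optional trigger-event payload clears the
        # recurring no-untyped-def errors on execute_complete methods
        if event and event.get("status") == "error":
            raise RuntimeError(event.get("message", "task failed"))
        return event.get("status") if event else None
```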
astronomer_providers/astronomer/providers/core/sensors/filesystem.py: note: In member "execute" of class "FileSensorAsync":
astronomer_providers/astronomer/providers/core/sensors/filesystem.py:29: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/core/sensors/filesystem.py:30: error: Call to untyped function "poke" in typed context [no-untyped-call]
if not self.poke(context=context):
^
astronomer_providers/astronomer/providers/core/sensors/filesystem.py:31: error: Call to untyped function "FSHook" in typed context [no-untyped-call]
hook = FSHook(self.fs_conn_id)
^
astronomer_providers/astronomer/providers/core/sensors/filesystem.py: note: In member "execute_complete" of class "FileSensorAsync":
astronomer_providers/astronomer/providers/core/sensors/filesystem.py:46: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/core/example_dags/example_external_task_wait_for_me.py:9: error: Call to untyped function "days_ago" in typed context [no-untyped-call]
default_args = {"start_date": days_ago(0)}
^
astronomer_providers/astronomer/providers/core/example_dags/example_external_task_wait_for_me.py:15: error: Call to untyped function "TimeSensorAsync" in typed context [no-untyped-call]
wait_for_me = TimeSensorAsync(
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:1: error: Module "airflow" does not explicitly export attribute "AirflowException"; implicit reexport disabled [attr-defined]
from airflow import AirflowException
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py: note: In class "PodNotFoundException":
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:12: error: Class cannot subclass "AirflowException" (has type "Any") [misc]
class PodNotFoundException(AirflowException):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py: note: In member "__init__" of class "KubernetesPodOperatorAsync":
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:23: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(self, *, poll_interval: int = 5, **kwargs):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py: note: In member "raise_for_trigger_status" of class "KubernetesPodOperatorAsync":
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:28: error: Function is missing a type annotation [no-untyped-def]
def raise_for_trigger_status(event):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py: note: In member "execute" of class "KubernetesPodOperatorAsync":
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:37: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:38: error: Call to untyped function "build_pod_request_obj" in typed context [no-untyped-call]
self.pod_request_obj = self.build_pod_request_obj(context)
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py: note: In member "execute_complete" of class "KubernetesPodOperatorAsync":
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:57: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None):
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:60: error: Call to untyped function "build_pod_request_obj" in typed context [no-untyped-call]
self.pod_request_obj = self.build_pod_request_obj(context)
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:67: error: Call to untyped function "raise_for_trigger_status" of "KubernetesPodOperatorAsync" in typed context [no-untyped-call]
self.raise_for_trigger_status(event)
^
astronomer_providers/astronomer/providers/cncf/kubernetes/operators/kubernetes_pod.py:78: error: Call to untyped function "extract_xcom" in typed context [no-untyped-call]
result = self.extract_xcom(pod=self.pod)
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:6: error: Module "airflow.models" does not explicitly export attribute "BaseOperator"; implicit reexport disabled [attr-defined]
from airflow.models import BaseOperator
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py: note: In class "S3KeySensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:14: error: Class cannot subclass "BaseOperator" (has type "Any") [misc]
class S3KeySensorAsync(BaseOperator):
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py: note: In member "__init__" of class "S3KeySensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:45: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py: note: In member "_resolve_bucket_and_key" of class "S3KeySensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:64: error: Function is missing a return type annotation [no-untyped-def]
def _resolve_bucket_and_key(self):
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:64: note: Use "-> None" if function does not return a value
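mypy's own note gives the fix for `_resolve_bucket_and_key`: the method only mutates attributes, so it should be annotated `-> None`. A sketch (the parsing logic is illustrative, not the sensor's actual code; `str.removeprefix` needs Python 3.9+):

```python
class S3KeySensorSketch:
    def __init__(self, bucket_key: str) -> None:
        self.bucket_key = bucket_key
        self.bucket_name = ""

    def _resolve_bucket_and_key(self) -> None:
        # The method mutates state and returns nothing, so "-> None" is
        # exactly the annotation mypy's note asks for
        path = self.bucket_key.removeprefix("s3://")
        self.bucket_name, _, self.bucket_key = path.partition("/")
```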
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py: note: In member "execute" of class "S3KeySensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:73: error: Missing type parameters for generic type "Dict" [type-arg]
def execute(self, context: Dict) -> Any:
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:74: error: Call to untyped function "_resolve_bucket_and_key" in typed context [no-untyped-call]
self._resolve_bucket_and_key()
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py: note: At top level:
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:78: error: Unused "type: ignore" comment
bucket_name=self.bucket_name, # type: ignore
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py: note: In member "execute_complete" of class "S3KeySensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:87: error: Function is missing a return type annotation [no-untyped-def]
def execute_complete(self, context: Dict, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:87: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def execute_complete(self, context: Dict, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/s3.py:87: error: Missing type parameters for generic type "Dict" [type-arg]
def execute_complete(self, context: Dict, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/snowflake/operators/snowflake.py: note: In member "__init__" of class "SnowflakeOperatorAsync":
astronomer_providers/astronomer/providers/snowflake/operators/snowflake.py:12: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(self, *, poll_interval: int = 5, **kwargs):
^
astronomer_providers/astronomer/providers/snowflake/operators/snowflake.py: note: In member "execute" of class "SnowflakeOperatorAsync":
astronomer_providers/astronomer/providers/snowflake/operators/snowflake.py:20: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer_providers/astronomer/providers/snowflake/operators/snowflake.py: note: In member "execute_complete" of class "SnowflakeOperatorAsync":
astronomer_providers/astronomer/providers/snowflake/operators/snowflake.py:44: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py: note: In member "__init__" of class "RedshiftClusterSensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:25: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py: note: In member "execute" of class "RedshiftClusterSensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:34: error: Function is missing a return type annotation [no-untyped-def]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:34: error: Argument 1 of "execute" is incompatible with supertype "BaseSensorOperator"; supertype defines the argument type as "Dict[Any, Any]" [override]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:34: note: This violates the Liskov substitution principle
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:34: note: See https://mypy.readthedocs.io/en/stable/common_issues.html#incompatible-overrides
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py: note: In member "execute_complete" of class "RedshiftClusterSensorAsync":
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:47: error: Function is missing a return type annotation [no-untyped-def]
def execute_complete(self, context: "Context", event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/amazon/aws/sensors/redshift_cluster_sensor.py:47: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def execute_complete(self, context: "Context", event=None): # pylint: disable=unused-argument
^
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py: note: In member "__init__" of class "RedshiftResumeClusterOperatorAsync":
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py:26: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(
^
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py: note: In member "execute" of class "RedshiftResumeClusterOperatorAsync":
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py:35: error: Function is missing a return type annotation [no-untyped-def]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py: note: In member "execute_complete" of class "RedshiftResumeClusterOperatorAsync":
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py:55: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None):
^
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py: note: In member "__init__" of class "RedshiftPauseClusterOperatorAsync":
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py:82: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(
^
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py: note: In member "execute" of class "RedshiftPauseClusterOperatorAsync":
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py:91: error: Function is missing a return type annotation [no-untyped-def]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py: note: In member "execute_complete" of class "RedshiftPauseClusterOperatorAsync":
astronomer_providers/astronomer/providers/amazon/aws/operators/redshift_cluster.py:111: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None):
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:24: error: Module "airflow.models" does not explicitly export attribute "BaseOperator"; implicit reexport disabled [attr-defined]
from airflow.models import BaseOperator
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py: note: In class "BigQueryInsertJobOperatorAsync":
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:48: error: Class cannot subclass "BaseOperator" (has type "Any") [misc]
class BigQueryInsertJobOperatorAsync(BigQueryInsertJobOperator, BaseOperator):
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py: note: In member "execute" of class "BigQueryInsertJobOperatorAsync":
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:113: error: Function is missing a return type annotation [no-untyped-def]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:122: error: Call to untyped function "_job_id" in typed context [no-untyped-call]
job_id = self._job_id(context)
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py: note: In member "execute_complete" of class "BigQueryInsertJobOperatorAsync":
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:158: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None):
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py: note: In member "execute" of class "BigQueryCheckOperatorAsync":
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:186: error: Function is missing a return type annotation [no-untyped-def]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:186: error: Signature of "execute" incompatible with supertype "SQLCheckOperator" [override]
def execute(self, context: "Context"):
^
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:186: note: Superclass:
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:186: note: def execute(self, context: Any = ...) -> Any
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:186: note: Subclass:
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:186: note: def execute(self, context: Context) -> Any
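The `[override]` error is a Liskov-compatibility complaint: the subclass narrows `context` to a required `Context` while the supertype accepts `Any` with a default. One way to sketch a compatible override (hypothetical, simplified classes; the real fix might instead keep the precise annotation and use a targeted `# type: ignore[override]`):

```python
from typing import Any


class SQLCheckOperatorSketch:
    # Supertype signature as mypy reports it: `context` is Any with a default.
    def execute(self, context: Any = None) -> Any:
        return "base"


class BigQueryCheckOperatorAsyncSketch(SQLCheckOperatorSketch):
    # Matching the supertype's parameter type and default keeps the override
    # compatible; the precise Context type can still be documented or
    # validated inside the body.
    def execute(self, context: Any = None) -> Any:
        return "async"
```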
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py: note: In member "execute_complete" of class "BigQueryCheckOperatorAsync":
astronomer_providers/astronomer/providers/google/cloud/operators/bigquery.py:202: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None):
^
astronomer_providers/astronomer/providers/google/cloud/example_dags/example_gcs.py:44: error: Module has no attribute "DAG" [attr-defined]
with models.DAG(
^
astronomer_providers/astronomer/providers/google/cloud/example_dags/example_bigquery_queries.py:62: error: Module has no attribute "DAG" [attr-defined]
with models.DAG(
^
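Both the `[attr-defined]` error on `models.DAG` and the implicit-reexport error on `BaseOperator` earlier have the same shape: the name is defined in a submodule and only re-exported by the package `__init__`, which strict mypy rejects. The fix is to import from the defining module. The runnable sketch below reproduces the situation with a throwaway package (`fakeairflow` is invented purely for illustration):

```python
import sys
import tempfile
from pathlib import Path

# Build a tiny package whose __init__ re-exports a class from a submodule,
# mirroring airflow.models re-exporting BaseOperator/DAG.
root = Path(tempfile.mkdtemp())
models = root / "fakeairflow" / "models"
models.mkdir(parents=True)
(root / "fakeairflow" / "__init__.py").write_text("")
(models / "baseoperator.py").write_text("class BaseOperator:\n    pass\n")
(models / "__init__.py").write_text(
    "from fakeairflow.models.baseoperator import BaseOperator\n"
)
sys.path.insert(0, str(root))

# Both imports work at runtime, but with implicit re-export disabled mypy
# only accepts the first form, which imports from the defining module.
from fakeairflow.models.baseoperator import BaseOperator
from fakeairflow.models import BaseOperator as Reexported

assert BaseOperator is Reexported
```

So the errors above should clear by switching to imports from the defining modules (for Airflow 2.x that would be along the lines of `airflow.models.baseoperator` and `airflow.models.dag`; exact module paths assumed from that layout).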
astronomer_providers/astronomer/providers/google/cloud/hooks/gcs.py: note: In class "GCSHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/gcs.py:29: error: Class cannot subclass "GoogleBaseHookAsync" (has type "Any") [misc]
class GCSHookAsync(GoogleBaseHookAsync):
^
astronomer_providers/astronomer/providers/google/cloud/hooks/gcs.py: note: In member "get_storage_client" of class "GCSHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/gcs.py:32: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
async def get_storage_client(self, session) -> Storage:
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "insert_job" of class "_BigQueryHook":
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:40: error: Missing type parameters for generic type "Dict" [type-arg]
configuration: Dict,
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py: note: In class "BigQueryHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:98: error: Class cannot subclass "GoogleBaseHookAsync" (has type "Any") [misc]
class BigQueryHookAsync(GoogleBaseHookAsync):
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "get_job_instance" of class "BigQueryHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:101: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
async def get_job_instance(self, project_id, job_id, session) -> Job:
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "get_job_status" of class "BigQueryHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:106: error: Function is missing a return type annotation [no-untyped-def]
async def get_job_status(
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:118: error: Argument 1 to "result" of "Job" has incompatible type "ClientSession"; expected "Optional[Session]" [arg-type]
job_status_response = await job_client.result(s)
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "get_job_output" of class "BigQueryHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:132: error: Missing type parameters for generic type "Dict" [type-arg]
) -> Dict:
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:140: error: Argument 1 to "get_query_results" of "Job" has incompatible type "ClientSession"; expected "Optional[Session]" [arg-type]
job_query_response = await job_client.get_query_results(session)
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "get_records" of class "BigQueryHookAsync":
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:143: error: Missing type parameters for generic type "Dict" [type-arg]
def get_records(self, query_results: Dict, nocast: bool = True) -> list:
^
astronomer_providers/astronomer/providers/google/cloud/hooks/bigquery.py:143: error: Missing type parameters for generic type "list" [type-arg]
def get_records(self, query_results: Dict, nocast: bool = True) -> list:
^
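The `[type-arg]` errors are fixed by parameterizing the bare generics (`Dict` → `Dict[str, Any]`, `list` → `List[...]`). A hedged sketch of `get_records` with the annotations filled in (the nested `rows`/`f`/`v` shape is the BigQuery REST response layout, assumed here for illustration; `nocast` is kept only to mirror the reported signature, and the casting it controls is elided):

```python
from typing import Any, Dict, List


def get_records(query_results: Dict[str, Any], nocast: bool = True) -> List[List[Any]]:
    """Return query rows as plain lists, with the generics parameterized."""
    # BigQuery's REST response nests each row's cells under "f" and each
    # cell's value under "v".
    rows = query_results.get("rows", [])
    return [[cell["v"] for cell in row["f"]] for row in rows]
```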
Found 163 errors in 42 files (checked 83 source files)
make: *** [run-mypy] Error 1
Completed testing the Databricks DAG with both async operators after resolving the mypy errors. Logs below.
Task logs
[2022-02-28, 16:13:17 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: databricks_dag.submit_run manual__2022-02-28T16:11:32.990347+00:00 [queued]>
[2022-02-28, 16:13:17 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: databricks_dag.submit_run manual__2022-02-28T16:11:32.990347+00:00 [queued]>
[2022-02-28, 16:13:17 UTC] {taskinstance.py:1243} INFO -
[2022-02-28, 16:13:17 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 2
[2022-02-28, 16:13:17 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-02-28, 16:13:17 UTC] {taskinstance.py:1264} INFO - Executing <Task(DatabricksSubmitRunOperatorAsync): submit_run> on 2022-02-28 16:11:32.990347+00:00
[2022-02-28, 16:13:17 UTC] {standard_task_runner.py:52} INFO - Started process 54054 to run task
[2022-02-28, 16:13:17 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'databricks_dag', 'submit_run', 'manual__2022-02-28T16:11:32.990347+00:00', '--job-id', '64', '--raw', '--subdir', 'DAGS_FOLDER/example_databricks.py', '--cfg-path', '/tmp/tmpzcbmi87v', '--error-file', '/tmp/tmp2y3imh0a']
[2022-02-28, 16:13:17 UTC] {standard_task_runner.py:77} INFO - Job 64: Subtask submit_run
[2022-02-28, 16:13:17 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: databricks_dag.submit_run manual__2022-02-28T16:11:32.990347+00:00 [running]> on host 1f9768a12b94
[2022-02-28, 16:13:17 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=databricks_dag
AIRFLOW_CTX_TASK_ID=submit_run
AIRFLOW_CTX_EXECUTION_DATE=2022-02-28T16:11:32.990347+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-02-28T16:11:32.990347+00:00
[2022-02-28, 16:13:17 UTC] {databricks.py:56} INFO - submit_run completed successfully.
[2022-02-28, 16:13:17 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=databricks_dag, task_id=submit_run, execution_date=20220228T161132, start_date=20220228T161317, end_date=20220228T161317
[2022-02-28, 16:13:18 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-02-28, 16:13:18 UTC] {local_task_job.py:264} INFO - 1 downstream tasks scheduled from follow-on schedule check
*** Reading local file: /usr/local/airflow/logs/databricks_dag/run_now/2022-02-28T16:11:32.990347+00:00/1.log
[2022-02-28, 16:13:58 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: databricks_dag.run_now manual__2022-02-28T16:11:32.990347+00:00 [queued]>
[2022-02-28, 16:13:58 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: databricks_dag.run_now manual__2022-02-28T16:11:32.990347+00:00 [queued]>
[2022-02-28, 16:13:58 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-02-28, 16:13:58 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 2
[2022-02-28, 16:13:58 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-02-28, 16:13:58 UTC] {taskinstance.py:1264} INFO - Executing <Task(DatabricksRunNowOperatorAsync): run_now> on 2022-02-28 16:11:32.990347+00:00
[2022-02-28, 16:13:58 UTC] {standard_task_runner.py:52} INFO - Started process 54097 to run task
[2022-02-28, 16:13:58 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'databricks_dag', 'run_now', 'manual__2022-02-28T16:11:32.990347+00:00', '--job-id', '66', '--raw', '--subdir', 'DAGS_FOLDER/example_databricks.py', '--cfg-path', '/tmp/tmpox_2n7lx', '--error-file', '/tmp/tmprgy199vo']
[2022-02-28, 16:13:58 UTC] {standard_task_runner.py:77} INFO - Job 66: Subtask run_now
[2022-02-28, 16:13:58 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: databricks_dag.run_now manual__2022-02-28T16:11:32.990347+00:00 [running]> on host 1f9768a12b94
[2022-02-28, 16:13:59 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=databricks_dag
AIRFLOW_CTX_TASK_ID=run_now
AIRFLOW_CTX_EXECUTION_DATE=2022-02-28T16:11:32.990347+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-02-28T16:11:32.990347+00:00
[2022-02-28, 16:13:59 UTC] {databricks.py:101} INFO - run_now completed successfully.
[2022-02-28, 16:13:59 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=databricks_dag, task_id=run_now, execution_date=20220228T161132, start_date=20220228T161358, end_date=20220228T161359
[2022-02-28, 16:13:59 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-02-28, 16:13:59 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
Triggerer logs
[2022-02-28 17:26:42,892] {databricks.py:57} INFO - Using token auth.
[2022-02-28 17:26:44,668] {databricks.py:31} INFO - Getting run state.
[2022-02-28 17:26:44,671] {databricks.py:60} INFO - submit_run in run state: {'life_cycle_state': 'PENDING', 'result_state': None, 'state_message': 'Waiting for cluster'}
[2022-02-28 17:26:44,675] {databricks.py:61} INFO - Sleeping for 30 seconds.
[2022-02-28 17:27:14,678] {databricks.py:57} INFO - Using token auth.
[2022-02-28 17:27:16,395] {databricks.py:31} INFO - Getting run state.
[2022-02-28 17:27:16,396] {databricks.py:60} INFO - submit_run in run state: {'life_cycle_state': 'PENDING', 'result_state': None, 'state_message': 'Waiting for cluster'}
[2022-02-28 17:27:16,397] {databricks.py:61} INFO - Sleeping for 30 seconds.
[2022-02-28 17:27:34,003] {triggerer_job.py:251} INFO - 1 triggers currently running
[2022-02-28 17:27:46,398] {databricks.py:57} INFO - Using token auth.
[2022-02-28 17:27:48,490] {databricks.py:31} INFO - Getting run state.
[2022-02-28 17:27:48,491] {databricks.py:60} INFO - submit_run in run state: {'life_cycle_state': 'PENDING', 'result_state': None, 'state_message': 'Waiting for cluster'}
[2022-02-28 17:27:48,492] {databricks.py:61} INFO - Sleeping for 30 seconds.
[2022-02-28 17:28:18,494] {databricks.py:57} INFO - Using token auth.
[2022-02-28 17:28:20,302] {databricks.py:31} INFO - Getting run state.
[2022-02-28 17:28:20,303] {databricks.py:60} INFO - submit_run in run state: {'life_cycle_state': 'PENDING', 'result_state': None, 'state_message': 'Waiting for cluster'}
[2022-02-28 17:28:20,303] {databricks.py:61} INFO - Sleeping for 30 seconds.
[2022-02-28 17:28:34,123] {triggerer_job.py:251} INFO - 1 triggers currently running
[2022-02-28 17:28:50,292] {databricks.py:57} INFO - Using token auth.
[2022-02-28 17:28:51,805] {databricks.py:31} INFO - Getting run state.
[2022-02-28 17:28:51,822] {databricks.py:60} INFO - submit_run in run state: {'life_cycle_state': 'PENDING', 'result_state': None, 'state_message': 'Waiting for cluster'}
[2022-02-28 17:28:51,823] {databricks.py:61} INFO - Sleeping for 30 seconds.
[2022-02-28 17:29:21,828] {databricks.py:57} INFO - Using token auth.
[2022-02-28 17:29:23,693] {databricks.py:31} INFO - Getting run state.
[2022-02-28 17:29:23,694] {databricks.py:60} INFO - submit_run in run state: {'life_cycle_state': 'PENDING', 'result_state': None, 'state_message': 'Waiting for cluster'}
[2022-02-28 17:29:23,695] {databricks.py:61} INFO - Sleeping for 30 seconds.
[2022-02-28 17:29:34,220] {triggerer_job.py:251} INFO - 1 triggers currently running
[2022-02-28 17:29:53,695] {databricks.py:57} INFO - Using token auth.
[2022-02-28 17:29:55,498] {databricks.py:31} INFO - Getting run state.
[2022-02-28 17:29:55,499] {databricks.py:60} INFO - submit_run in run state: {'life_cycle_state': 'PENDING', 'result_state': None, 'state_message': 'Waiting for cluster'}
[2022-02-28 17:29:55,499] {databricks.py:61} INFO - Sleeping for 30 seconds.
[2022-02-28 17:30:25,502] {databricks.py:57} INFO - Using token auth.
[2022-02-28 17:30:27,241] {databricks.py:31} INFO - Getting run state.
[2022-02-28 17:30:27,242] {databricks.py:60} INFO - submit_run in run state: {'life_cycle_state': 'RUNNING', 'result_state': None, 'state_message': 'In run'}
[2022-02-28 17:30:27,243] {databricks.py:61} INFO - Sleeping for 30 seconds.
[2022-02-28 17:30:34,346] {triggerer_job.py:251} INFO - 1 triggers currently running
[2022-02-28 17:30:57,250] {databricks.py:57} INFO - Using token auth.
[2022-02-28 17:30:59,039] {databricks.py:31} INFO - Getting run state.
[2022-02-28 17:30:59,040] {triggerer_job.py:359} INFO - Trigger <astronomer.providers.databricks.triggers.databricks.DatabricksTrigger conn_id=databricks_default, task_id=submit_run, run_id=316, retry_limit=3, retry_delay=1, polling_period_seconds=30> (ID 21) fired: TriggerEvent<True>
[2022-02-28 17:31:00,400] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.databricks.triggers.databricks.DatabricksTrigger conn_id=databricks_default, task_id=submit_run, run_id=316, retry_limit=3, retry_delay=1, polling_period_seconds=30> (ID 21) starting
[2022-02-28 17:31:00,428] {base.py:70} INFO - Using connection to: id: databricks_default. Host: dbc-dd61443a-746c.cloud.databricks.com, Port: None, Schema: , Login: [email protected], Password: ****, extra: {'token': '***'}
[2022-02-28 17:31:00,429] {databricks.py:57} INFO - Using token auth.
[2022-02-28 17:31:06,451] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.databricks.triggers.databricks.DatabricksTrigger conn_id=databricks_default, task_id=run_now, run_id=386, retry_limit=3, retry_delay=1, polling_period_seconds=30> (ID 22) starting
[2022-02-28 17:31:06,463] {base.py:70} INFO - Using connection to: id: databricks_default. Host: dbc-dd61443a-746c.cloud.databricks.com, Port: None, Schema: , Login: [email protected], Password: ****, extra: {'token': '***'}
[2022-02-28 17:31:06,464] {databricks.py:57} INFO - Using token auth.
[2022-02-28 17:31:07,896] {databricks.py:31} INFO - Getting run state.
[2022-02-28 17:31:07,897] {databricks.py:60} INFO - run_now in run state: {'life_cycle_state': 'PENDING', 'result_state': None, 'state_message': ''}
[2022-02-28 17:31:07,898] {databricks.py:61} INFO - Sleeping for 30 seconds.
[2022-02-28 17:31:07,897] {databricks.py:60} INFO - run_now in run state: {'life_cycle_state': 'PENDING', 'result_state': None, 'state_message': ''}
[2022-02-28 17:31:07,898] {databricks.py:61} INFO - Sleeping for 30 seconds.
[2022-02-28 17:31:34,522] {triggerer_job.py:251} INFO - 1 triggers currently running
[2022-02-28 17:31:37,900] {databricks.py:57} INFO - Using token auth.
[2022-02-28 17:31:39,292] {databricks.py:31} INFO - Getting run state.
[2022-02-28 17:31:39,293] {triggerer_job.py:359} INFO - Trigger <astronomer.providers.databricks.triggers.databricks.DatabricksTrigger conn_id=databricks_default, task_id=run_now, run_id=386, retry_limit=3, retry_delay=1, polling_period_seconds=30> (ID 22) fired: TriggerEvent<True>
Completed testing the HTTPSensorAsync example DAG:
[2022-02-28, 13:43:41 UTC] {warnings.py:109} WARNING - /usr/local/lib/python3.9/site-packages/***/utils/context.py:152: AirflowContextDeprecationWarning: Accessing 'yesterday_ds_nodash' from the template is deprecated and will be removed in a future version.
warnings.warn(_create_deprecation_warning(key, self._deprecation_replacements[key]))
[2022-02-28, 13:43:46 UTC] {http.py:101} INFO - Poking:
[2022-02-28, 13:43:46 UTC] {base.py:70} INFO - Using connection to: id: http_default. Host: randomuser.me, Port: None, Schema: , Login: , Password: None, extra: {}
[2022-02-28, 13:43:46 UTC] {http.py:140} INFO - Sending 'GET' to url: http://randomuser.me
[2022-02-28, 13:43:52 UTC] {http.py:101} INFO - Poking:
[2022-02-28, 13:43:52 UTC] {base.py:70} INFO - Using connection to: id: http_default. Host: randomuser.me, Port: None, Schema: , Login: , Password: None, extra: {}
[2022-02-28, 13:43:52 UTC] {http.py:140} INFO - Sending 'GET' to url: http://randomuser.me
[2022-02-28, 13:43:57 UTC] {http.py:101} INFO - Poking:
[2022-02-28, 13:43:57 UTC] {base.py:70} INFO - Using connection to: id: http_default. Host: randomuser.me, Port: None, Schema: , Login: , Password: None, extra: {}
[2022-02-28, 13:43:58 UTC] {http.py:140} INFO - Sending 'GET' to url: http://randomuser.me
[2022-02-28, 13:44:03 UTC] {http.py:101} INFO - Poking:
[2022-02-28, 13:44:03 UTC] {base.py:70} INFO - Using connection to: id: http_default. Host: randomuser.me, Port: None, Schema: , Login: , Password: None, extra: {}
[2022-02-28, 13:44:03 UTC] {http.py:140} INFO - Sending 'GET' to url: http://randomuser.me
Completed testing the S3KeySensorAsync example DAG:
*** Reading local file: /usr/local/airflow/logs/example_s3_key_sensor/waiting_for_s3_key/2022-02-28T20:46:05.103221+00:00/1.log
[2022-02-28, 20:46:08 UTC] {taskinstance.py:1032} INFO - Dependencies all met for <TaskInstance: example_s3_key_sensor.waiting_for_s3_key manual__2022-02-28T20:46:05.103221+00:00 [queued]>
[2022-02-28, 20:46:08 UTC] {taskinstance.py:1032} INFO - Dependencies all met for <TaskInstance: example_s3_key_sensor.waiting_for_s3_key manual__2022-02-28T20:46:05.103221+00:00 [queued]>
[2022-02-28, 20:46:08 UTC] {taskinstance.py:1238} INFO -
--------------------------------------------------------------------------------
[2022-02-28, 20:46:08 UTC] {taskinstance.py:1239} INFO - Starting attempt 1 of 1
[2022-02-28, 20:46:08 UTC] {taskinstance.py:1240} INFO -
--------------------------------------------------------------------------------
[2022-02-28, 20:46:08 UTC] {taskinstance.py:1259} INFO - Executing <Task(S3KeySensorAsync): waiting_for_s3_key> on 2022-02-28 20:46:05.103221+00:00
[2022-02-28, 20:46:08 UTC] {standard_task_runner.py:52} INFO - Started process 165 to run task
[2022-02-28, 20:46:08 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_s3_key_sensor', 'waiting_for_s3_key', 'manual__2022-02-28T20:46:05.103221+00:00', '--job-id', '206', '--raw', '--subdir', 'DAGS_FOLDER/example_s3.py', '--cfg-path', '/tmp/tmpsotc9m7d', '--error-file', '/tmp/tmpgl61blbm']
[2022-02-28, 20:46:08 UTC] {standard_task_runner.py:77} INFO - Job 206: Subtask waiting_for_s3_key
[2022-02-28, 20:46:08 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_s3_key_sensor.waiting_for_s3_key manual__2022-02-28T20:46:05.103221+00:00 [running]> on host 16261eee155b
[2022-02-28, 20:46:09 UTC] {taskinstance.py:1424} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_s3_key_sensor
AIRFLOW_CTX_TASK_ID=waiting_for_s3_key
AIRFLOW_CTX_EXECUTION_DATE=2022-02-28T20:46:05.103221+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-02-28T20:46:05.103221+00:00
[2022-02-28, 20:46:09 UTC] {taskinstance.py:1337} INFO - Pausing task as DEFERRED. dag_id=example_s3_key_sensor, task_id=waiting_for_s3_key, execution_date=20220228T204605, start_date=20220228T204608
[2022-02-28, 20:46:09 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-02-28, 20:46:09 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
[2022-02-28, 20:46:17 UTC] {taskinstance.py:1032} INFO - Dependencies all met for <TaskInstance: example_s3_key_sensor.waiting_for_s3_key manual__2022-02-28T20:46:05.103221+00:00 [queued]>
[2022-02-28, 20:46:17 UTC] {taskinstance.py:1032} INFO - Dependencies all met for <TaskInstance: example_s3_key_sensor.waiting_for_s3_key manual__2022-02-28T20:46:05.103221+00:00 [queued]>
[2022-02-28, 20:46:17 UTC] {taskinstance.py:1238} INFO -
--------------------------------------------------------------------------------
[2022-02-28, 20:46:17 UTC] {taskinstance.py:1239} INFO - Starting attempt 1 of 1
[2022-02-28, 20:46:17 UTC] {taskinstance.py:1240} INFO -
--------------------------------------------------------------------------------
[2022-02-28, 20:46:17 UTC] {taskinstance.py:1259} INFO - Executing <Task(S3KeySensorAsync): waiting_for_s3_key> on 2022-02-28 20:46:05.103221+00:00
[2022-02-28, 20:46:17 UTC] {standard_task_runner.py:52} INFO - Started process 179 to run task
[2022-02-28, 20:46:17 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_s3_key_sensor', 'waiting_for_s3_key', 'manual__2022-02-28T20:46:05.103221+00:00', '--job-id', '207', '--raw', '--subdir', 'DAGS_FOLDER/example_s3.py', '--cfg-path', '/tmp/tmpizko8f8z', '--error-file', '/tmp/tmp6ee4f1re']
[2022-02-28, 20:46:17 UTC] {standard_task_runner.py:77} INFO - Job 207: Subtask waiting_for_s3_key
[2022-02-28, 20:46:17 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_s3_key_sensor.waiting_for_s3_key manual__2022-02-28T20:46:05.103221+00:00 [running]> on host 16261eee155b
[2022-02-28, 20:46:18 UTC] {taskinstance.py:1424} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_s3_key_sensor
AIRFLOW_CTX_TASK_ID=waiting_for_s3_key
AIRFLOW_CTX_EXECUTION_DATE=2022-02-28T20:46:05.103221+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-02-28T20:46:05.103221+00:00
[2022-02-28, 20:46:18 UTC] {taskinstance.py:1267} INFO - Marking task as SUCCESS. dag_id=example_s3_key_sensor, task_id=waiting_for_s3_key, execution_date=20220228T204605, start_date=20220228T204617, end_date=20220228T204618
[2022-02-28, 20:46:18 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-02-28, 20:46:18 UTC] {local_task_job.py:264} INFO - 1 downstream tasks scheduled from follow-on schedule check
Task logs after the Google BigQuery operator changes:
[2022-03-01, 10:24:57 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.create-dataset manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:24:57 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.create-dataset manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:24:57 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:24:57 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:24:57 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:24:57 UTC] {taskinstance.py:1264} INFO - Executing <Task(BigQueryCreateEmptyDatasetOperator): create-dataset> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:24:57 UTC] {standard_task_runner.py:52} INFO - Started process 3535 to run task
[2022-03-01, 10:24:57 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'create-dataset', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '95', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmprv8aqfw_', '--error-file', '/tmp/tmp6_w71yxu']
[2022-03-01, 10:24:57 UTC] {standard_task_runner.py:77} INFO - Job 95: Subtask create-dataset
[2022-03-01, 10:24:57 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.create-dataset manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:24:58 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=create-dataset
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:24:58 UTC] {bigquery.py:457} INFO - datasetId was not specified in `dataset_reference`. Will use default value astro_dataset.
[2022-03-01, 10:24:58 UTC] {bigquery.py:457} INFO - projectId was not specified in `dataset_reference`. Will use default value astronomer-***-providers.
[2022-03-01, 10:24:58 UTC] {bigquery.py:467} INFO - Creating dataset: astro_dataset in project: astronomer-***-providers
[2022-03-01, 10:24:59 UTC] {bigquery.py:469} INFO - Dataset created successfully.
[2022-03-01, 10:24:59 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=create-dataset, execution_date=20220301T102453, start_date=20220301T102457, end_date=20220301T102459
[2022-03-01, 10:24:59 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:24:59 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
[2022-03-01, 10:25:02 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.create_table_1 manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:02 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.create_table_1 manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:02 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:02 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:25:02 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:02 UTC] {taskinstance.py:1264} INFO - Executing <Task(BigQueryCreateEmptyTableOperator): create_table_1> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:25:02 UTC] {standard_task_runner.py:52} INFO - Started process 3539 to run task
[2022-03-01, 10:25:02 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'create_table_1', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '96', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmp42teb0qt', '--error-file', '/tmp/tmpr5g2lzkt']
[2022-03-01, 10:25:02 UTC] {standard_task_runner.py:77} INFO - Job 96: Subtask create_table_1
[2022-03-01, 10:25:03 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.create_table_1 manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:25:03 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=create_table_1
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:25:03 UTC] {bigquery.py:961} INFO - Creating table
[2022-03-01, 10:25:04 UTC] {bigquery.py:976} INFO - Table astronomer-***-providers.astro_dataset.table1 created successfully
[2022-03-01, 10:25:04 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=create_table_1, execution_date=20220301T102453, start_date=20220301T102502, end_date=20220301T102504
[2022-03-01, 10:25:04 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:25:05 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
[2022-03-01, 10:25:16 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.insert_query_job manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:16 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.insert_query_job manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:16 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:16 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:25:16 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:16 UTC] {taskinstance.py:1264} INFO - Executing <Task(BigQueryInsertJobOperatorAsync): insert_query_job> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:25:16 UTC] {standard_task_runner.py:52} INFO - Started process 3560 to run task
[2022-03-01, 10:25:16 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'insert_query_job', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '98', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmpl6md91_j', '--error-file', '/tmp/tmptgbkti9h']
[2022-03-01, 10:25:16 UTC] {standard_task_runner.py:77} INFO - Job 98: Subtask insert_query_job
[2022-03-01, 10:25:17 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.insert_query_job manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:25:17 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=insert_query_job
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:25:17 UTC] {bigquery.py:160} INFO - insert_query_job completed with response Job completed
[2022-03-01, 10:25:17 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=insert_query_job, execution_date=20220301T102453, start_date=20220301T102516, end_date=20220301T102517
[2022-03-01, 10:25:17 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:25:17 UTC] {local_task_job.py:264} INFO - 4 downstream tasks scheduled from follow-on schedule check
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.select_query_job manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.select_query_job manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:37 UTC] {taskinstance.py:1264} INFO - Executing <Task(BigQueryInsertJobOperatorAsync): select_query_job> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:25:37 UTC] {standard_task_runner.py:52} INFO - Started process 3606 to run task
[2022-03-01, 10:25:37 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'select_query_job', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '105', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmpxirl95sb', '--error-file', '/tmp/tmptki9evkv']
[2022-03-01, 10:25:37 UTC] {standard_task_runner.py:77} INFO - Job 105: Subtask select_query_job
[2022-03-01, 10:25:37 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.select_query_job manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:25:38 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=select_query_job
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:25:38 UTC] {bigquery.py:160} INFO - select_query_job completed with response Job completed
[2022-03-01, 10:25:39 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=select_query_job, execution_date=20220301T102453, start_date=20220301T102536, end_date=20220301T102539
[2022-03-01, 10:25:39 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:25:39 UTC] {local_task_job.py:264} INFO - 1 downstream tasks scheduled from follow-on schedule check
[2022-03-01, 10:25:58 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.check_count manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:58 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.check_count manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:58 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:58 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:25:58 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:58 UTC] {taskinstance.py:1264} INFO - Executing <Task(BigQueryCheckOperatorAsync): check_count> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:25:58 UTC] {standard_task_runner.py:52} INFO - Started process 3636 to run task
[2022-03-01, 10:25:58 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'check_count', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '109', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmp9qx479py', '--error-file', '/tmp/tmprxbypoit']
[2022-03-01, 10:25:58 UTC] {standard_task_runner.py:77} INFO - Job 109: Subtask check_count
[2022-03-01, 10:25:59 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.check_count manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:25:59 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=check_count
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:25:59 UTC] {bigquery.py:209} INFO - Record: [2]
[2022-03-01, 10:25:59 UTC] {bigquery.py:210} INFO - Success.
[2022-03-01, 10:25:59 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=check_count, execution_date=20220301T102453, start_date=20220301T102558, end_date=20220301T102559
[2022-03-01, 10:26:00 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:26:00 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.execute_query_save manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.execute_query_save manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1264} INFO - Executing <Task(BigQueryInsertJobOperatorAsync): execute_query_save> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:25:36 UTC] {standard_task_runner.py:52} INFO - Started process 3603 to run task
[2022-03-01, 10:25:36 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'execute_query_save', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '104', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmpk8qh921c', '--error-file', '/tmp/tmph1421pw4']
[2022-03-01, 10:25:36 UTC] {standard_task_runner.py:77} INFO - Job 104: Subtask execute_query_save
[2022-03-01, 10:25:37 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.execute_query_save manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:25:38 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=execute_query_save
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:25:38 UTC] {bigquery.py:160} INFO - execute_query_save completed with response Job completed
[2022-03-01, 10:25:38 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=execute_query_save, execution_date=20220301T102453, start_date=20220301T102536, end_date=20220301T102538
[2022-03-01, 10:25:38 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:25:39 UTC] {local_task_job.py:264} INFO - 1 downstream tasks scheduled from follow-on schedule check
[2022-03-01, 10:25:59 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.execute_multi_query manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:59 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.execute_multi_query manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:59 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:59 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:25:59 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:59 UTC] {taskinstance.py:1264} INFO - Executing <Task(BigQueryInsertJobOperatorAsync): execute_multi_query> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:25:59 UTC] {standard_task_runner.py:52} INFO - Started process 3638 to run task
[2022-03-01, 10:25:59 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'execute_multi_query', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '110', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmp3dizxx6i', '--error-file', '/tmp/tmp8voqfmdz']
[2022-03-01, 10:25:59 UTC] {standard_task_runner.py:77} INFO - Job 110: Subtask execute_multi_query
[2022-03-01, 10:25:59 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.execute_multi_query manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:26:00 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=execute_multi_query
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:26:00 UTC] {bigquery.py:160} INFO - execute_multi_query completed with response Job completed
[2022-03-01, 10:26:00 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=execute_multi_query, execution_date=20220301T102453, start_date=20220301T102559, end_date=20220301T102600
[2022-03-01, 10:26:00 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:26:00 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
[2022-03-01, 10:27:15 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.delete_dataset manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:27:16 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.delete_dataset manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:27:16 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:27:16 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:27:16 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:27:16 UTC] {taskinstance.py:1264} INFO - Executing <Task(BigQueryDeleteDatasetOperator): delete_dataset> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:27:16 UTC] {standard_task_runner.py:52} INFO - Started process 3728 to run task
[2022-03-01, 10:27:16 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'delete_dataset', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '116', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmp3l5v67w2', '--error-file', '/tmp/tmpdu3e4p9l']
[2022-03-01, 10:27:16 UTC] {standard_task_runner.py:77} INFO - Job 116: Subtask delete_dataset
[2022-03-01, 10:27:16 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.delete_dataset manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:27:16 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=delete_dataset
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:27:16 UTC] {bigquery.py:1324} INFO - Dataset id: astro_dataset Project id: None
[2022-03-01, 10:27:17 UTC] {bigquery.py:526} INFO - Deleting from project: astronomer-***-providers Dataset:astro_dataset
[2022-03-01, 10:27:18 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=delete_dataset, execution_date=20220301T102453, start_date=20220301T102715, end_date=20220301T102718
[2022-03-01, 10:27:18 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:27:18 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.get_data manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.get_data manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:36 UTC] {taskinstance.py:1264} INFO - Executing <Task(BigQueryGetDataOperatorAsync): get_data> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:25:36 UTC] {standard_task_runner.py:52} INFO - Started process 3601 to run task
[2022-03-01, 10:25:36 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'get_data', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '103', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmprds72zc9', '--error-file', '/tmp/tmp5y0i9fb1']
[2022-03-01, 10:25:36 UTC] {standard_task_runner.py:77} INFO - Job 103: Subtask get_data
[2022-03-01, 10:25:36 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.get_data manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:25:37 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=get_data
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:25:37 UTC] {bigquery.py:317} INFO - Total extracted rows: 2
[2022-03-01, 10:25:37 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=get_data, execution_date=20220301T102453, start_date=20220301T102536, end_date=20220301T102537
[2022-03-01, 10:25:38 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:25:38 UTC] {local_task_job.py:264} INFO - 1 downstream tasks scheduled from follow-on schedule check
[2022-03-01, 10:25:43 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.get_data_result manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:43 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.get_data_result manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:25:43 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:43 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:25:43 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:25:43 UTC] {taskinstance.py:1264} INFO - Executing <Task(BashOperator): get_data_result> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:25:43 UTC] {standard_task_runner.py:52} INFO - Started process 3614 to run task
[2022-03-01, 10:25:43 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'get_data_result', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '106', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmp1fo457p_', '--error-file', '/tmp/tmpq2jp0zj0']
[2022-03-01, 10:25:43 UTC] {standard_task_runner.py:77} INFO - Job 106: Subtask get_data_result
[2022-03-01, 10:25:44 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.get_data_result manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:25:45 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=get_data_result
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:25:45 UTC] {subprocess.py:62} INFO - Tmp dir root location:
/tmp
[2022-03-01, 10:25:45 UTC] {subprocess.py:74} INFO - Running command: ['bash', '-c', "echo [['42', 'fishy fish'], ['42', 'monthy python']]"]
[2022-03-01, 10:25:45 UTC] {subprocess.py:85} INFO - Output:
[2022-03-01, 10:25:45 UTC] {subprocess.py:89} INFO - [[42, fishy fish], [42, monthy python]]
[2022-03-01, 10:25:45 UTC] {subprocess.py:93} INFO - Command exited with return code 0
[2022-03-01, 10:25:45 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=get_data_result, execution_date=20220301T102453, start_date=20220301T102543, end_date=20220301T102545
[2022-03-01, 10:25:45 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:25:46 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
[2022-03-01, 10:26:36 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.execute_long_running_query manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:26:36 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.execute_long_running_query manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:26:36 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:26:36 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:26:36 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:26:36 UTC] {taskinstance.py:1264} INFO - Executing <Task(BigQueryInsertJobOperatorAsync): execute_long_running_query> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:26:36 UTC] {standard_task_runner.py:52} INFO - Started process 3680 to run task
[2022-03-01, 10:26:36 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'execute_long_running_query', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '111', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmp8lk9b3zp', '--error-file', '/tmp/tmpm0kcrhyz']
[2022-03-01, 10:26:36 UTC] {standard_task_runner.py:77} INFO - Job 111: Subtask execute_long_running_query
[2022-03-01, 10:26:37 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.execute_long_running_query manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:26:37 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=execute_long_running_query
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:26:37 UTC] {bigquery.py:160} INFO - execute_long_running_query completed with response Job completed
[2022-03-01, 10:26:37 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=execute_long_running_query, execution_date=20220301T102453, start_date=20220301T102636, end_date=20220301T102637
[2022-03-01, 10:26:37 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:26:38 UTC] {local_task_job.py:264} INFO - 1 downstream tasks scheduled from follow-on schedule check
[2022-03-01, 10:26:49 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.check_value manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:26:49 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.check_value manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:26:49 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:26:49 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:26:49 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:26:49 UTC] {taskinstance.py:1264} INFO - Executing <Task(BigQueryValueCheckOperatorAsync): check_value> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:26:49 UTC] {standard_task_runner.py:52} INFO - Started process 3692 to run task
[2022-03-01, 10:26:49 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'check_value', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '113', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmpckb99vq0', '--error-file', '/tmp/tmpcuutadsg']
[2022-03-01, 10:26:49 UTC] {standard_task_runner.py:77} INFO - Job 113: Subtask check_value
[2022-03-01, 10:26:49 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.check_value manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:26:50 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=check_value
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:26:50 UTC] {bigquery.py:456} INFO - check_value completed with response Job completed
[2022-03-01, 10:26:50 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=check_value, execution_date=20220301T102453, start_date=20220301T102649, end_date=20220301T102650
[2022-03-01, 10:26:50 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:26:50 UTC] {local_task_job.py:264} INFO - 1 downstream tasks scheduled from follow-on schedule check
[2022-03-01, 10:27:09 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.check_interval manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:27:09 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: example_async_bigquery_queries.check_interval manual__2022-03-01T10:24:53.297432+00:00 [queued]>
[2022-03-01, 10:27:09 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:27:09 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 10:27:09 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 10:27:10 UTC] {taskinstance.py:1264} INFO - Executing <Task(BigQueryIntervalCheckOperatorAsync): check_interval> on 2022-03-01 10:24:53.297432+00:00
[2022-03-01, 10:27:10 UTC] {standard_task_runner.py:52} INFO - Started process 3714 to run task
[2022-03-01, 10:27:10 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'example_async_bigquery_queries', 'check_interval', 'manual__2022-03-01T10:24:53.297432+00:00', '--job-id', '115', '--raw', '--subdir', 'DAGS_FOLDER/example1.py', '--cfg-path', '/tmp/tmpblx6a5fz', '--error-file', '/tmp/tmpjes5si2g']
[2022-03-01, 10:27:10 UTC] {standard_task_runner.py:77} INFO - Job 115: Subtask check_interval
[2022-03-01, 10:27:10 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: example_async_bigquery_queries.check_interval manual__2022-03-01T10:24:53.297432+00:00 [running]> on host 6243985b0564
[2022-03-01, 10:27:10 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=example_async_bigquery_queries
AIRFLOW_CTX_TASK_ID=check_interval
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T10:24:53.297432+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T10:24:53.297432+00:00
[2022-03-01, 10:27:10 UTC] {bigquery.py:403} INFO - check_interval completed with response success
[2022-03-01, 10:27:10 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=example_async_bigquery_queries, task_id=check_interval, execution_date=20220301T102453, start_date=20220301T102709, end_date=20220301T102710
[2022-03-01, 10:27:10 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 10:27:11 UTC] {local_task_job.py:264} INFO - 1 downstream tasks scheduled from follow-on schedule check
Triggerer logs
[2022-03-01 10:24:40,436] {triggerer_job.py:251} INFO - 0 triggers currently running
[2022-03-01 10:25:10,014] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:25:10,482] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger conn_id=google_cloud_default, job_id=airflow_example_async_bigquery_queries_insert_query_job_2022_03_01T10_24_53_297432_00_00_cbe99ba771e5ea5c701e1aba1803ae6c, dataset_id=None, project_id=None, table_id=None, poll_interval=4.0> (ID 19) starting
[2022-03-01 10:25:10,483] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:11,592] {triggerer_job.py:359} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger conn_id=google_cloud_default, job_id=airflow_example_async_bigquery_queries_insert_query_job_2022_03_01T10_24_53_297432_00_00_cbe99ba771e5ea5c701e1aba1803ae6c, dataset_id=None, project_id=None, table_id=None, poll_interval=4.0> (ID 19) fired: TriggerEvent<{'status': 'success', 'message': 'Job completed'}>
[2022-03-01 10:25:26,399] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:25:26,400] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:25:26,523] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger conn_id=google_cloud_default, job_id=airflow_example_async_bigquery_queries_execute_long_running_query_2022_03_01T10_24_53_297432_00_00_e9802b7cf06b42f3786cb93bcbccf5b8, dataset_id=None, project_id=None, table_id=None, poll_interval=4.0> (ID 20) starting
[2022-03-01 10:25:26,524] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:26,532] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryGetDataTrigger conn_id=google_cloud_default, job_id=airflow_1646130324708330_c8cb047416bbc4ad89958c23acb4fe9e, dataset_id=astro_dataset, project_id=astronomer-airflow-providers, table_id=table1, poll_interval=4.0> (ID 21) starting
[2022-03-01 10:25:26,537] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:27,504] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:25:27,507] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:25:27,549] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger conn_id=google_cloud_default, job_id=airflow_example_async_bigquery_queries_execute_query_save_2022_03_01T10_24_53_297432_00_00_9bc62a163a87e5d3f881ae593d617acd, dataset_id=None, project_id=None, table_id=None, poll_interval=4.0> (ID 22) starting
[2022-03-01 10:25:27,550] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:27,552] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger conn_id=google_cloud_default, job_id=airflow_example_async_bigquery_queries_select_query_job_2022_03_01T10_24_53_297432_00_00_575e9bc47aab14e138795fff3ac2f8e2, dataset_id=None, project_id=None, table_id=None, poll_interval=4.0> (ID 23) starting
[2022-03-01 10:25:27,553] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:27,792] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:25:27,794] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:25:27,916] {bigquery.py:138} INFO - Executing get_job_output..
[2022-03-01 10:25:28,918] {triggerer_job.py:359} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger conn_id=google_cloud_default, job_id=airflow_example_async_bigquery_queries_select_query_job_2022_03_01T10_24_53_297432_00_00_575e9bc47aab14e138795fff3ac2f8e2, dataset_id=None, project_id=None, table_id=None, poll_interval=4.0> (ID 23) fired: TriggerEvent<{'status': 'success', 'message': 'Job completed'}>
[2022-03-01 10:25:28,931] {triggerer_job.py:359} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryGetDataTrigger conn_id=google_cloud_default, job_id=airflow_1646130324708330_c8cb047416bbc4ad89958c23acb4fe9e, dataset_id=astro_dataset, project_id=astronomer-airflow-providers, table_id=table1, poll_interval=4.0> (ID 21) fired: TriggerEvent<{'status': 'success', 'message': 'success', 'records': [['42', 'fishy fish'], ['42', 'monthy python']]}>
[2022-03-01 10:25:28,969] {triggerer_job.py:359} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger conn_id=google_cloud_default, job_id=airflow_example_async_bigquery_queries_execute_query_save_2022_03_01T10_24_53_297432_00_00_9bc62a163a87e5d3f881ae593d617acd, dataset_id=None, project_id=None, table_id=None, poll_interval=4.0> (ID 22) fired: TriggerEvent<{'status': 'success', 'message': 'Job completed'}>
[2022-03-01 10:25:29,664] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:25:29,665] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:25:29,666] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:25:30,560] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryGetDataTrigger conn_id=google_cloud_default, job_id=airflow_1646130324708330_c8cb047416bbc4ad89958c23acb4fe9e, dataset_id=astro_dataset, project_id=astronomer-airflow-providers, table_id=table1, poll_interval=4.0> (ID 21) starting
[2022-03-01 10:25:30,561] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:30,574] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger conn_id=google_cloud_default, job_id=airflow_example_async_bigquery_queries_execute_query_save_2022_03_01T10_24_53_297432_00_00_9bc62a163a87e5d3f881ae593d617acd, dataset_id=None, project_id=None, table_id=None, poll_interval=4.0> (ID 22) starting
[2022-03-01 10:25:30,576] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:30,590] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger conn_id=google_cloud_default, job_id=airflow_example_async_bigquery_queries_select_query_job_2022_03_01T10_24_53_297432_00_00_575e9bc47aab14e138795fff3ac2f8e2, dataset_id=None, project_id=None, table_id=None, poll_interval=4.0> (ID 23) starting
[2022-03-01 10:25:30,591] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:30,994] {triggerer_job.py:341} ERROR - Triggerer's async thread was blocked for 0.32 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
[2022-03-01 10:25:31,797] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:32,607] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:25:32,609] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:25:36,613] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:37,612] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:25:37,616] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:25:40,630] {triggerer_job.py:251} INFO - 1 triggers currently running
[2022-03-01 10:25:41,618] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:41,787] {triggerer_job.py:341} ERROR - Triggerer's async thread was blocked for 0.21 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
[2022-03-01 10:25:42,471] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:25:42,477] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:25:46,480] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:47,278] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:25:47,296] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:25:49,489] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:25:49,490] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:25:49,801] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger conn_id=google_cloud_default, job_id=airflow_example_async_bigquery_queries_execute_multi_query_2022_03_01T10_24_53_297432_00_00_4f6d6122e138698a50131c55a2eecd2e, dataset_id=None, project_id=None, table_id=None, poll_interval=4.0> (ID 24) starting
[2022-03-01 10:25:49,803] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:49,805] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryCheckTrigger conn_id=google_cloud_default, job_id=airflow_1646130347261003_88bf3477e8845a92411f89eee2f78452, dataset_id=None, project_id=astronomer-airflow-providers, table_id=None, poll_interval=4.0> (ID 25) starting
[2022-03-01 10:25:49,810] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:51,058] {bigquery.py:138} INFO - Executing get_job_output..
[2022-03-01 10:25:51,217] {triggerer_job.py:341} ERROR - Triggerer's async thread was blocked for 0.25 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
[2022-03-01 10:25:51,223] {triggerer_job.py:359} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger conn_id=google_cloud_default, job_id=airflow_example_async_bigquery_queries_execute_multi_query_2022_03_01T10_24_53_297432_00_00_4f6d6122e138698a50131c55a2eecd2e, dataset_id=None, project_id=None, table_id=None, poll_interval=4.0> (ID 24) fired: TriggerEvent<{'status': 'success', 'message': 'Job completed'}>
[2022-03-01 10:25:51,310] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:51,438] {triggerer_job.py:341} ERROR - Triggerer's async thread was blocked for 0.22 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
[2022-03-01 10:25:52,169] {triggerer_job.py:359} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryCheckTrigger conn_id=google_cloud_default, job_id=airflow_1646130347261003_88bf3477e8845a92411f89eee2f78452, dataset_id=None, project_id=astronomer-airflow-providers, table_id=None, poll_interval=4.0> (ID 25) fired: TriggerEvent<{'status': 'success', 'records': [2]}>
[2022-03-01 10:25:52,235] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:25:52,240] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:25:52,855] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:25:53,832] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryCheckTrigger conn_id=google_cloud_default, job_id=airflow_1646130347261003_88bf3477e8845a92411f89eee2f78452, dataset_id=None, project_id=astronomer-airflow-providers, table_id=None, poll_interval=4.0> (ID 25) starting
[2022-03-01 10:25:53,834] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:54,015] {triggerer_job.py:341} ERROR - Triggerer's async thread was blocked for 0.23 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
[2022-03-01 10:25:54,762] {bigquery.py:138} INFO - Executing get_job_output..
[2022-03-01 10:25:56,245] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:25:57,043] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:25:57,045] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:26:01,048] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:26:01,945] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:26:01,956] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:26:05,959] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:26:06,861] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:26:06,861] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:26:10,863] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:26:11,557] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:26:11,560] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:26:15,563] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:26:16,426] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:26:16,426] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:26:20,431] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:26:21,395] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:26:21,399] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:26:25,403] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:26:26,308] {bigquery.py:64} INFO - Query is still running...
[2022-03-01 10:26:26,309] {bigquery.py:65} INFO - Sleeping for 4.0 seconds.
[2022-03-01 10:26:30,311] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:26:32,090] {triggerer_job.py:359} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryInsertJobTrigger conn_id=google_cloud_default, job_id=airflow_example_async_bigquery_queries_execute_long_running_query_2022_03_01T10_24_53_297432_00_00_e9802b7cf06b42f3786cb93bcbccf5b8, dataset_id=None, project_id=None, table_id=None, poll_interval=4.0> (ID 20) fired: TriggerEvent<{'status': 'success', 'message': 'Job completed'}>
[2022-03-01 10:26:41,018] {triggerer_job.py:251} INFO - 0 triggers currently running
[2022-03-01 10:26:43,259] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:26:44,023] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryValueCheckTrigger conn_id=google_cloud_default, pass_value=2, job_id=airflow_1646130401654320_0d8e8b6fd8d76d336d87ea12ae8d2534, dataset_id=None, project_id=astronomer-airflow-providers, sql=SELECT COUNT(*) FROM astro_dataset.table1, table_id=None, tolerance=None, poll_interval=4.0> (ID 26) starting
[2022-03-01 10:26:44,025] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:26:45,123] {bigquery.py:138} INFO - Executing get_job_output..
[2022-03-01 10:26:45,967] {triggerer_job.py:359} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryValueCheckTrigger conn_id=google_cloud_default, pass_value=2, job_id=airflow_1646130401654320_0d8e8b6fd8d76d336d87ea12ae8d2534, dataset_id=None, project_id=astronomer-airflow-providers, sql=SELECT COUNT(*) FROM astro_dataset.table1, table_id=None, tolerance=None, poll_interval=4.0> (ID 26) fired: TriggerEvent<{'status': 'success', 'message': 'Job completed', 'records': [2]}>
[2022-03-01 10:26:46,490] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:26:47,035] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryValueCheckTrigger conn_id=google_cloud_default, pass_value=2, job_id=airflow_1646130401654320_0d8e8b6fd8d76d336d87ea12ae8d2534, dataset_id=None, project_id=astronomer-airflow-providers, sql=SELECT COUNT(*) FROM astro_dataset.table1, table_id=None, tolerance=None, poll_interval=4.0> (ID 26) starting
[2022-03-01 10:26:47,037] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:26:47,936] {bigquery.py:138} INFO - Executing get_job_output..
[2022-03-01 10:26:48,095] {triggerer_job.py:341} ERROR - Triggerer's async thread was blocked for 0.22 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
[2022-03-01 10:27:00,765] {bigquery.py:19} INFO - Using the connection google_cloud_default .
[2022-03-01 10:27:01,127] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryIntervalCheckTrigger conn_id=google_cloud_default, first_job_id=airflow_1646130416660263_365d7aeffad0f1f41b5fa6c75debd183, second_job_id=airflow_1646130418444703_cd458f6e307c0ee164eec5673fdfc75f, project_id=astronomer-airflow-providers, table=astro_dataset.table1, metrics_thresholds={'COUNT(*)': 1.5}, date_filter_column=ds, days_back=-1, ratio_formula=max_over_min, ignore_zero=True> (ID 27) starting
[2022-03-01 10:27:01,130] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:27:02,457] {bigquery.py:116} INFO - Executing get_job_status...
[2022-03-01 10:27:03,535] {bigquery.py:138} INFO - Executing get_job_output..
[2022-03-01 10:27:04,468] {bigquery.py:138} INFO - Executing get_job_output..
[2022-03-01 10:27:05,568] {bigquery.py:283} INFO - Current metric for COUNT(*): 0.0
Past metric for COUNT(*): 0.0
Ratio for COUNT(*): None
Threshold: 1.5
[2022-03-01 10:27:05,569] {bigquery.py:315} INFO - All tests have passed
[2022-03-01 10:27:05,571] {triggerer_job.py:359} INFO - Trigger <astronomer.providers.google.cloud.triggers.bigquery.BigQueryIntervalCheckTrigger conn_id=google_cloud_default, first_job_id=airflow_1646130416660263_365d7aeffad0f1f41b5fa6c75debd183, second_job_id=airflow_1646130418444703_cd458f6e307c0ee164eec5673fdfc75f, project_id=astronomer-airflow-providers, table=astro_dataset.table1, metrics_thresholds={'COUNT(*)': 1.5}, date_filter_column=ds, days_back=-1, ratio_formula=max_over_min, ignore_zero=True> (ID 27) fired: TriggerEvent<{'status': 'success', 'message': 'Job completed', 'first_row_data': [0], 'second_row_data': [0]}>
CNCF Kubernetes example logs
Task logs
*** Reading local file: /usr/local/airflow/logs/kpo_async/simple_async/2022-03-01T11:55:19.474039+00:00/1.log
[2022-03-01, 11:56:50 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: kpo_async.simple_async manual__2022-03-01T11:55:19.474039+00:00 [queued]>
[2022-03-01, 11:56:50 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: kpo_async.simple_async manual__2022-03-01T11:55:19.474039+00:00 [queued]>
[2022-03-01, 11:56:50 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 11:56:50 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 1
[2022-03-01, 11:56:50 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-01, 11:56:50 UTC] {taskinstance.py:1264} INFO - Executing <Task(KubernetesPodOperatorAsync): simple_async> on 2022-03-01 11:55:19.474039+00:00
[2022-03-01, 11:56:50 UTC] {standard_task_runner.py:52} INFO - Started process 8217 to run task
[2022-03-01, 11:56:50 UTC] {standard_task_runner.py:76} INFO - Running: ['***', 'tasks', 'run', 'kpo_async', 'simple_async', 'manual__2022-03-01T11:55:19.474039+00:00', '--job-id', '195', '--raw', '--subdir', 'DAGS_FOLDER/example.py', '--cfg-path', '/tmp/tmpoe64pw1k', '--error-file', '/tmp/tmpjpivk2yy']
[2022-03-01, 11:56:50 UTC] {standard_task_runner.py:77} INFO - Job 195: Subtask simple_async
[2022-03-01, 11:56:50 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: kpo_async.simple_async manual__2022-03-01T11:55:19.474039+00:00 [running]> on host 6243985b0564
[2022-03-01, 11:56:50 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=***
AIRFLOW_CTX_DAG_ID=kpo_async
AIRFLOW_CTX_TASK_ID=simple_async
AIRFLOW_CTX_EXECUTION_DATE=2022-03-01T11:55:19.474039+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2022-03-01T11:55:19.474039+00:00
[2022-03-01, 11:56:50 UTC] {kubernetes_pod.py:524} INFO - Creating pod simple-async.99ae921a74a241f999cd1ad77bf35d68 with labels: {'dag_id': 'kpo_async', 'task_id': 'simple_async', 'execution_date': '2022-03-01T115519.4740390000-3f6b6093a', 'try_number': '1'}
[2022-03-01, 11:56:50 UTC] {kubernetes_pod.py:335} INFO - Found matching pod simple-async.cdfca01c1ead40b5b5f62f106dbe35e4 with labels {'airflow_version': '2.2.4', 'dag_id': 'kpo_async', 'execution_date': '2022-03-01T115519.4740390000-3f6b6093a', 'kubernetes_pod_operator': 'True', 'task_id': 'simple_async', 'try_number': '1'}
[2022-03-01, 11:56:50 UTC] {kubernetes_pod.py:336} INFO - `try_number` of task_instance: 1
[2022-03-01, 11:56:50 UTC] {kubernetes_pod.py:337} INFO - `try_number` of pod: 1
[2022-03-01, 11:56:51 UTC] {kubernetes_pod.py:417} INFO - Deleting pod: simple-async.cdfca01c1ead40b5b5f62f106dbe35e4
[2022-03-01, 11:56:51 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=kpo_async, task_id=simple_async, execution_date=20220301T115519, start_date=20220301T115650, end_date=20220301T115651
[2022-03-01, 11:56:51 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-01, 11:56:51 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
Triggerer logs
[2022-03-01 11:54:59,293] {triggerer_job.py:101} INFO - Starting the triggerer
[2022-03-01 11:55:24,638] {triggerer_job.py:341} ERROR - Triggerer's async thread was blocked for 0.21 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
[2022-03-01 11:55:25,339] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.cncf.kubernetes.triggers.wait_container.WaitContainerTrigger kubernetes_conn_id=None, hook_params={'cluster_context': None, 'config_file': '/usr/local/airflow/dags/config', 'in_cluster': False}, pod_name=simple-async.084f913960024139b0bf543372a9da07, container_name=base, pod_namespace=default, pending_phase_timeout=120, poll_interval=5> (ID 35) starting
[2022-03-01 11:55:25,460] {base_events.py:1738} ERROR - Unclosed client session
client_session: <aiohttp.client.ClientSession object at 0x4030d129a0>
[2022-03-01 11:55:25,476] {triggerer_job.py:341} ERROR - Triggerer's async thread was blocked for 0.23 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
[2022-03-01 11:55:59,536] {triggerer_job.py:251} INFO - 1 triggers currently running
[2022-03-01 11:56:01,322] {triggerer_job.py:359} INFO - Trigger <astronomer.providers.cncf.kubernetes.triggers.wait_container.WaitContainerTrigger kubernetes_conn_id=None, hook_params={'cluster_context': None, 'config_file': '/usr/local/airflow/dags/config', 'in_cluster': False}, pod_name=simple-async.084f913960024139b0bf543372a9da07, container_name=base, pod_namespace=default, pending_phase_timeout=120, poll_interval=5> (ID 35) fired: TriggerEvent<{'status': 'done'}>
[2022-03-01 11:56:02,549] {triggerer_job.py:356} INFO - Trigger <astronomer.providers.cncf.kubernetes.triggers.wait_container.WaitContainerTrigger kubernetes_conn_id=None, hook_params={'cluster_context': None, 'config_file': '/usr/local/airflow/dags/config', 'in_cluster': False}, pod_name=simple-async.084f913960024139b0bf543372a9da07, container_name=base, pod_namespace=default, pending_phase_timeout=120, poll_interval=5> (ID 35) starting
[2022-03-01 11:56:02,718] {base_events.py:1738} ERROR - Unclosed client session
@rajaths010494 to check the latest on the mypy-strict branch and post an update here.
astronomer/providers/core/triggers/filesystem.py: note: In member "__init__" of class "FileTrigger":
astronomer/providers/core/triggers/filesystem.py:31: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer/providers/core/triggers/filesystem.py: note: In member "run" of class "FileTrigger":
astronomer/providers/core/triggers/filesystem.py:49: error: The return type of an async generator function should be "AsyncGenerator" or one of its supertypes [misc]
async def run(self) -> None:
^
astronomer/providers/core/triggers/filesystem.py:49: error: Return type "None" of "run" incompatible with return type "Coroutine[Any, Any, AsyncIterator[TriggerEvent]]" in supertype "BaseTrigger" [override]
async def run(self) -> None:
^
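A minimal sketch of the fix mypy is asking for here: an async generator's return type must be annotated as `AsyncIterator[...]` (or `AsyncGenerator`), not `None`. The classes below are stand-ins, not the real Airflow `FileTrigger`/`TriggerEvent`:

```python
import asyncio
from typing import Any, AsyncIterator, Dict, List


class TriggerEvent:
    """Stand-in for airflow.triggers.base.TriggerEvent."""

    def __init__(self, payload: Dict[str, Any]) -> None:
        self.payload = payload


class FileTriggerSketch:
    # mypy [misc]/[override]: annotating the async generator with
    # AsyncIterator[TriggerEvent] instead of None resolves both errors.
    async def run(self) -> AsyncIterator[TriggerEvent]:
        yield TriggerEvent({"status": "success"})


async def collect() -> List[Dict[str, Any]]:
    # an async generator function returns the generator when called,
    # so no await is needed before iterating
    return [event.payload async for event in FileTriggerSketch().run()]


print(asyncio.run(collect()))  # [{'status': 'success'}]
```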
astronomer/providers/core/triggers/external_task.py:5: error: Module "airflow.models" does not explicitly export attribute "DagRun"; implicit reexport disabled [attr-defined]
from airflow.models import DagRun, TaskInstance
^
astronomer/providers/core/triggers/external_task.py:5: error: Module "airflow.models" does not explicitly export attribute "TaskInstance"; implicit reexport disabled [attr-defined]
from airflow.models import DagRun, TaskInstance
^
astronomer/providers/core/triggers/external_task.py: note: In member "__init__" of class "TaskStateTrigger":
astronomer/providers/core/triggers/external_task.py:21: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer/providers/core/triggers/external_task.py: note: In member "run" of class "TaskStateTrigger":
astronomer/providers/core/triggers/external_task.py:43: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer/providers/core/triggers/external_task.py: note: In member "count_tasks" of class "TaskStateTrigger":
astronomer/providers/core/triggers/external_task.py:57: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def count_tasks(self, session) -> int:
^
astronomer/providers/core/triggers/external_task.py:71: error: Returning Any from function declared to return "int" [no-any-return]
return count
^
astronomer/providers/core/triggers/external_task.py: note: In member "__init__" of class "DagStateTrigger":
astronomer/providers/core/triggers/external_task.py:82: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer/providers/core/triggers/external_task.py: note: In member "run" of class "DagStateTrigger":
astronomer/providers/core/triggers/external_task.py:102: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer/providers/core/triggers/external_task.py: note: In member "count_dags" of class "DagStateTrigger":
astronomer/providers/core/triggers/external_task.py:117: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def count_dags(self, session) -> int:
^
astronomer/providers/core/triggers/external_task.py:130: error: Returning Any from function declared to return "int" [no-any-return]
return count
^
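For the `[no-untyped-def]` and `[no-any-return]` pair above, one possible pattern (a sketch with stand-in names — the real `session` is a SQLAlchemy `Session`, mimicked here with `Any` and a fake object):

```python
from typing import Any


class TaskStateTriggerSketch:
    # mypy [no-untyped-def]: every parameter needs an annotation;
    # typing.Any is the minimal fix when the precise type (a SQLAlchemy
    # Session) is awkward to import in the signature.
    def count_tasks(self, session: Any) -> int:
        count = session.query_count  # stands in for the ORM query result
        # mypy [no-any-return]: wrapping the Any value in int() makes the
        # declared "int" return type actually hold.
        return int(count)


class FakeSession:
    """Hypothetical stand-in for a SQLAlchemy session."""

    query_count = 3


print(TaskStateTriggerSketch().count_tasks(FakeSession()))  # 3
```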
astronomer/providers/core/sensors/external_task.py: note: In member "execute" of class "ExternalTaskSensorAsync":
astronomer/providers/core/sensors/external_task.py:12: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer/providers/core/sensors/external_task.py:17: error: Call to untyped function "get_execution_dates" in typed context [no-untyped-call]
execution_dates = self.get_execution_dates(context)
^
astronomer/providers/core/sensors/external_task.py: note: In member "execute_complete" of class "ExternalTaskSensorAsync":
astronomer/providers/core/sensors/external_task.py:47: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, session, event=None): # pylint: disable=unused-argument
^
astronomer/providers/core/sensors/external_task.py:52: error: Call to untyped function "get_execution_dates" in typed context [no-untyped-call]
execution_dates = self.get_execution_dates(context)
^
astronomer/providers/core/sensors/external_task.py: note: In member "get_execution_dates" of class "ExternalTaskSensorAsync":
astronomer/providers/core/sensors/external_task.py:63: error: Function is missing a type annotation [no-untyped-def]
def get_execution_dates(self, context):
^
astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:1: error: Unused "type: ignore" comment
import aiofiles # type: ignore[import]
^
astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py: note: In member "_load_config" of class "KubernetesHookAsync":
astronomer/providers/cncf/kubernetes/hooks/kubernetes_async.py:53: error: Module has no attribute "tempfile" [attr-defined]
async with aiofiles.tempfile.NamedTemporaryFile() as temp_config:
^
astronomer/providers/amazon/aws/hooks/redshift_data.py: note: In member "get_conn_params" of class "RedshiftDataHook":
astronomer/providers/amazon/aws/hooks/redshift_data.py:20: error: Argument 1 to "get_connection" of "BaseHook" has incompatible type "Optional[str]"; expected "str" [arg-type]
connection_object = self.get_connection(self.aws_conn_id)
^
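The `[arg-type]` error above is the usual `Optional` mismatch: `aws_conn_id` may be `None`, but `get_connection` requires `str`. A hedged sketch of narrowing it before the call (the functions below are stand-ins, not the real hook API):

```python
from typing import Optional


def get_connection(conn_id: str) -> str:
    # stands in for BaseHook.get_connection, which requires a str
    return f"connection:{conn_id}"


def get_conn_params(aws_conn_id: Optional[str]) -> str:
    # mypy [arg-type]: narrow Optional[str] to str explicitly rather
    # than silencing the error with a cast or type: ignore.
    if aws_conn_id is None:
        raise ValueError("aws_conn_id is not set")
    return get_connection(aws_conn_id)


print(get_conn_params("aws_default"))  # connection:aws_default
```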
astronomer/providers/amazon/aws/hooks/redshift_data.py: note: In member "execute_query" of class "RedshiftDataHook":
astronomer/providers/amazon/aws/hooks/redshift_data.py:67: error: Function is missing a return type annotation [no-untyped-def]
def execute_query(self, sql: Union[str, Iterable[str]], params: Optional[Dict]):
^
astronomer/providers/amazon/aws/hooks/redshift_data.py:67: error: Missing type parameters for generic type "Dict" [type-arg]
def execute_query(self, sql: Union[str, Iterable[str]], params: Optional[Dict]):
^
astronomer/providers/snowflake/hooks/snowflake.py: note: In member "run" of class "SnowflakeHookAsync":
astronomer/providers/snowflake/hooks/snowflake.py:14: error: Function is missing a return type annotation [no-untyped-def]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer/providers/snowflake/hooks/snowflake.py:14: error: Signature of "run" incompatible with supertype "SnowflakeHook" [override]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer/providers/snowflake/hooks/snowflake.py:14: note: Superclass:
astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Union[str, List[Any]], autocommit: bool = ..., parameters: Union[Dict[Any, Any], Iterable[Any], None] = ..., handler: Optional[Callable[..., Any]] = ...) -> Any
astronomer/providers/snowflake/hooks/snowflake.py:14: note: Subclass:
astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Union[str, List[Any]], autocommit: bool = ..., parameters: Optional[Dict[Any, Any]] = ...) -> Any
astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Union[str, List[Any]], autocommit: bool = ..., parameters: Union[Dict[Any, Any], Iterable[Any], None] = ..., handler: Optional[Callable[..., Any]] = ...) -> Any
astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Union[str, List[Any]], autocommit: bool = ..., parameters: Optional[Dict[Any, Any]] = ...) -> Any
astronomer/providers/snowflake/hooks/snowflake.py:14: error: Signature of "run" incompatible with supertype "DbApiHook" [override]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Any, autocommit: Any = ..., parameters: Any = ..., handler: Any = ...) -> Any
astronomer/providers/snowflake/hooks/snowflake.py:14: note: def run(self, sql: Union[str, List[Any]], autocommit: bool = ..., parameters: Optional[Dict[Any, Any]] = ...) -> Any
astronomer/providers/snowflake/hooks/snowflake.py:14: error: Missing type parameters for generic type "list" [type-arg]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer/providers/snowflake/hooks/snowflake.py:14: error: Missing type parameters for generic type "dict" [type-arg]
def run(self, sql: Union[str, list], autocommit: bool = False, parameters: Optional[dict] = None):
^
astronomer/providers/snowflake/hooks/snowflake.py:56: error: Call to untyped function "get_autocommit" in typed context [no-untyped-call]
if not self.get_autocommit(conn):
^
astronomer/providers/snowflake/hooks/snowflake.py: note: In member "check_query_output" of class "SnowflakeHookAsync":
astronomer/providers/snowflake/hooks/snowflake.py:60: error: Function is missing a return type annotation [no-untyped-def]
def check_query_output(self, query_ids: List[str]):
^
astronomer/providers/snowflake/hooks/snowflake.py: note: In member "get_query_status" of class "SnowflakeHookAsync":
astronomer/providers/snowflake/hooks/snowflake.py:72: error: Function is missing a return type annotation [no-untyped-def]
async def get_query_status(self, query_ids: List[str]):
^
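The `[override]` errors on `SnowflakeHookAsync.run` come from the Liskov rule: the subclass must accept at least everything the supertype accepts, including the `handler` parameter it currently drops. A simplified sketch (the base class below mimics the `DbApiHook.run` signature mypy prints; it is not the real class):

```python
from typing import Any, Callable, Dict, Iterable, List, Optional, Union


class DbApiHookSketch:
    """Simplified stand-in for DbApiHook with the signature mypy shows."""

    def run(
        self,
        sql: Union[str, List[str]],
        autocommit: bool = False,
        parameters: Union[Dict[Any, Any], Iterable[Any], None] = None,
        handler: Optional[Callable[..., Any]] = None,
    ) -> Any:
        return None


class SnowflakeHookAsyncSketch(DbApiHookSketch):
    # mypy [override]: keeping "handler" (and the wider "parameters"
    # union) in the override makes the signatures compatible.
    def run(
        self,
        sql: Union[str, List[str]],
        autocommit: bool = False,
        parameters: Union[Dict[Any, Any], Iterable[Any], None] = None,
        handler: Optional[Callable[..., Any]] = None,
    ) -> Any:
        return sql


print(SnowflakeHookAsyncSketch().run("select 1"))  # select 1
```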
astronomer/providers/core/sensors/filesystem.py: note: In member "execute" of class "FileSensorAsync":
astronomer/providers/core/sensors/filesystem.py:29: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer/providers/core/sensors/filesystem.py:30: error: Call to untyped function "poke" in typed context [no-untyped-call]
if not self.poke(context=context):
^
astronomer/providers/core/sensors/filesystem.py:31: error: Call to untyped function "FSHook" in typed context [no-untyped-call]
hook = FSHook(self.fs_conn_id)
^
astronomer/providers/core/sensors/filesystem.py: note: In member "execute_complete" of class "FileSensorAsync":
astronomer/providers/core/sensors/filesystem.py:46: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
astronomer/providers/core/example_dags/example_external_task_wait_for_me.py:11: error: Call to untyped function "datetime" in typed context [no-untyped-call]
start_date=datetime(2022, 1, 1),
^
astronomer/providers/core/example_dags/example_external_task_wait_for_me.py:13: error: Call to untyped function "TimeSensorAsync" in typed context [no-untyped-call]
wait_for_me = TimeSensorAsync(
^
astronomer/providers/core/example_dags/example_external_task.py:18: error: Call to untyped function "datetime" in typed context [no-untyped-call]
with DAG("test_external_task_async", schedule_interval="@daily", start_date=datetime(2022, 1, 1)) as dag:
^
astronomer/providers/cncf/kubernetes/triggers/wait_container.py: note: In member "__init__" of class "WaitContainerTrigger":
astronomer/providers/cncf/kubernetes/triggers/wait_container.py:50: error: Incompatible default for argument "container_name" (default has type "None", argument has type "str") [assignment]
container_name: str = None,
^
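The `[assignment]` error above is strict mypy rejecting implicit `Optional`: a parameter defaulting to `None` must be annotated `Optional[str]`, not plain `str`. A minimal sketch (class name is a stand-in for the real trigger):

```python
from typing import Optional


class WaitContainerTriggerSketch:
    # mypy [assignment]: under no_implicit_optional, "container_name:
    # str = None" is an error; Optional[str] states the intent.
    def __init__(self, container_name: Optional[str] = None) -> None:
        self.container_name = container_name


print(WaitContainerTriggerSketch().container_name)        # None
print(WaitContainerTriggerSketch("base").container_name)  # base
```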
astronomer/providers/snowflake/triggers/snowflake_trigger.py: note: In function "get_db_hook":
astronomer/providers/snowflake/triggers/snowflake_trigger.py:10: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def get_db_hook(self) -> SnowflakeHookAsync:
^
astronomer/providers/snowflake/triggers/snowflake_trigger.py: note: In member "__init__" of class "SnowflakeTrigger":
astronomer/providers/snowflake/triggers/snowflake_trigger.py:29: error: Call to untyped function "__init__" in typed context [no-untyped-call]
super().__init__()
^
astronomer/providers/snowflake/triggers/snowflake_trigger.py: note: In member "run" of class "SnowflakeTrigger":
astronomer/providers/snowflake/triggers/snowflake_trigger.py:49: error: Function is missing a return type annotation [no-untyped-def]
async def run(self):
^
astronomer/providers/http/example_dags/example_http.py:6: error: Call to untyped function "datetime" in typed context [no-untyped-call]
with DAG("example_async_http_sensor", tags=["example", "async"], start_date=datetime(2022, 1, 1)) as dag:
^
astronomer/providers/core/example_dags/example_file_sensor.py:8: error: Call to untyped function "datetime" in typed context [no-untyped-call]
start_date=datetime(2022, 1, 1),
^
astronomer/providers/core/example_dags/example_file_sensor.py:12: error: Call to untyped function "FileSensorAsync" in typed context [no-untyped-call]
sensor_task = FileSensorAsync(
^
astronomer/providers/snowflake/operators/snowflake.py: note: In member "__init__" of class "SnowflakeOperatorAsync":
astronomer/providers/snowflake/operators/snowflake.py:12: error: Function is missing a type annotation for one or more arguments [no-untyped-def]
def __init__(self, *, poll_interval: int = 5, **kwargs):
^
astronomer/providers/snowflake/operators/snowflake.py: note: In member "execute" of class "SnowflakeOperatorAsync":
astronomer/providers/snowflake/operators/snowflake.py:20: error: Function is missing a type annotation [no-untyped-def]
def execute(self, context):
^
astronomer/providers/snowflake/operators/snowflake.py: note: In member "execute_complete" of class "SnowflakeOperatorAsync":
astronomer/providers/snowflake/operators/snowflake.py:44: error: Function is missing a type annotation [no-untyped-def]
def execute_complete(self, context, event=None): # pylint: disable=unused-argument
^
astronomer/providers/databricks/example_dags/example_databricks.py:24: error: Call to untyped function "datetime" in typed context [no-untyped-call]
start_date=datetime(2022, 1, 1),
^
astronomer/providers/databricks/example_dags/example_databricks.py:41: error: Argument "job_id" to "DatabricksRunNowOperatorAsync" has incompatible type "int"; expected "Optional[str]" [arg-type]
job_id=1003,
^
astronomer/providers/amazon/aws/operators/redshift_sql.py: note: In member "execute" of class "RedshiftSQLOperatorAsync":
astronomer/providers/amazon/aws/operators/redshift_sql.py:29: error: Argument "sql" to "execute_query" of "RedshiftDataHook" has incompatible type "Union[Dict[Any, Any], Iterable[Any], None]"; expected "Union[str, Iterable[str]]" [arg-type]
query_ids = redshift_data_hook.execute_query(sql=self.sql, params=self.params)
^
astronomer/providers/snowflake/example_dags/example_snowflake.py:28: error: Call to untyped function "datetime" in typed context [no-untyped-call]
start_date=datetime(2022, 1, 1),
^
astronomer/providers/google/cloud/hooks/gcs.py: note: In member "get_storage_client" of class "GCSHookAsync":
astronomer/providers/google/cloud/hooks/gcs.py:20: error: Argument "session" to "Storage" has incompatible type "ClientSession"; expected "Optional[Session]" [arg-type]
return Storage(service_file=file, session=session)
^
astronomer/providers/amazon/aws/example_dags/example_s3.py:16: error: Call to untyped function "datetime" in typed context [no-untyped-call]
start_date=datetime(2021, 1, 1),
^
astronomer/providers/amazon/aws/example_dags/example_redshift_sql.py:10: error: Call to untyped function "datetime" in typed context [no-untyped-call]
start_date=datetime(2021, 1, 1),
^
astronomer/providers/amazon/aws/example_dags/example_redshift_cluster_management.py:19: error: Call to untyped function "datetime" in typed context [no-untyped-call]
start_date=datetime(2021, 1, 1),
^
astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "get_job_instance" of class "BigQueryHookAsync":
astronomer/providers/google/cloud/hooks/bigquery.py:88: error: Argument "session" to "Job" has incompatible type "ClientSession"; expected "Optional[Session]" [arg-type]
return Job(job_id=job_id, project=project_id, service_file=f, session=session)
^
astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "get_job_status" of class "BigQueryHookAsync":
astronomer/providers/google/cloud/hooks/bigquery.py:102: error: Argument 1 to "result" of "Job" has incompatible type "ClientSession"; expected "Optional[Session]" [arg-type]
job_status_response = await job_client.result(s)
^
astronomer/providers/google/cloud/hooks/bigquery.py: note: In member "get_job_output" of class "BigQueryHookAsync":
astronomer/providers/google/cloud/hooks/bigquery.py:124: error: Argument 1 to "get_query_results" of "Job" has incompatible type "ClientSession"; expected "Optional[Session]" [arg-type]
job_query_response = await job_client.get_query_results(session)
^
astronomer/providers/google/cloud/example_dags/example_gcs.py:27: error: Module has no attribute "DAG" [attr-defined]
with models.DAG(
^
astronomer/providers/google/cloud/example_dags/example_bigquery_queries.py:47: error: Module has no attribute "DAG" [attr-defined]
with models.DAG(
^
Found 61 errors in 24 files (checked 88 source files)
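Most of the 61 errors fall into two buckets: untyped `*args`/`**kwargs` and `context`/`event` parameters ([no-untyped-def]), and example-DAG calls into that untyped code ([no-untyped-call]), which disappear once the callee is annotated. A hedged sketch of the first bucket (class and parameter handling are illustrative, not the repository's exact code; Airflow also ships a richer `Context` type that could replace `Dict[str, Any]`):

```python
from typing import Any, Dict, Optional

class SnowflakeOperatorAsyncSketch:
    """Illustrative annotations for the recurring [no-untyped-def] cases."""

    def __init__(self, *args: Any, poll_interval: int = 5, **kwargs: Any) -> None:
        self.poll_interval = poll_interval

    def execute_complete(
        self, context: Dict[str, Any], event: Optional[Dict[str, Any]] = None
    ) -> None:
        # Deferred operators resume here; fail loudly on an error event.
        if event and event.get("status") == "error":
            raise RuntimeError(str(event.get("message")))
```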
from astronomer-providers.
Fixed the mypy errors for the amazon provider and merged the changes into the amazon-s3-operators branch.
from astronomer-providers.
Related Issues (20)
- Deprecate core async sensors HOT 4
- Deprecate dbt async provider HOT 1
- Deprecate snowflake async provider HOT 4
- Deprecate databricks async provider HOT 2
- Deprecate hive provider HOT 1
- Deprecate kubernetes async provider HOT 5
- Deprecate sftp async provider HOT 1
- Deprecate http async provider HOT 3
- Deprecate livy async provider HOT 1
- Fix broken "Test providers RC releases" action HOT 2
- task "resume_redshift_cluster" in DAG "example_async_redshift_cluster_management" failed HOT 1
- Update public documentation regarding deprecation of operators HOT 2
- Fix example_external_deployment_task_sensor DAG HOT 3
- Release astronomer-providers 1.19.0 HOT 1
- Test unpinning of pendulum to 2.1.2
- Deploy job failing for astronomer-providers
- Invalid dependency graph for tasks HOT 1
- fix unit test and static checks failure on auto-created PRs
- Deploy failed due to the latest wave provider release HOT 3
- Release astronomer-providers 1.19.1