Comments (6)
Hi @Treasure-u, do you mean that both the resource files and the workflow were created successfully, but when you run the workflow and check the workflow/task instance log, it shows the host is empty?
If that is the situation, could you:
- check your task definition to see whether the resource in your task looks correct?
- check your master and worker to see whether they exist and are working fine?
- check your master or worker logs for a detailed error message?
from dolphinscheduler-sdk-python.
Duplicate of #101.
In fact, I have been trying this example and ran into the same error.
- Check your task definition and see whether the resource in your task looks good: the resources are not added to the task definition.
- Check your master and worker and see whether they exist and work fine: yes, both are working fine.
- Check your master or worker log for a detailed error: here is the error from the master log:
```
[ERROR] 2023-09-22 18:14:52.460 +0000 TaskLogLogger-class org.apache.dolphinscheduler.server.master.runner.task.CommonTaskProcessor:[128] - [WorkflowInstance-109][TaskInstance-148] - Task use-resource is submitted to priority queue error
java.lang.NullPointerException: null
	at org.apache.dolphinscheduler.service.process.ProcessServiceImpl.queryTenantCodeByResName(ProcessServiceImpl.java:2151)
	at org.apache.dolphinscheduler.service.process.ProcessServiceImpl$$FastClassBySpringCGLIB$$9d3e18f9.invoke(<generated>)
	at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218)
	at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:793)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
	at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:763)
	at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:97)
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
	at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:763)
	at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:708)
	at org.apache.dolphinscheduler.service.process.ProcessServiceImpl$$EnhancerBySpringCGLIB$$f5037904.queryTenantCodeByResName(<generated>)
	at org.apache.dolphinscheduler.server.master.runner.task.BaseTaskProcessor.lambda$getResourceFullNames$4(BaseTaskProcessor.java:629)
	at java.base/java.lang.Iterable.forEach(Iterable.java:75)
	at org.apache.dolphinscheduler.server.master.runner.task.BaseTaskProcessor.getResourceFullNames(BaseTaskProcessor.java:628)
	at org.apache.dolphinscheduler.server.master.runner.task.BaseTaskProcessor.getTaskExecutionContext(BaseTaskProcessor.java:316)
	at org.apache.dolphinscheduler.server.master.runner.task.CommonTaskProcessor.dispatchTask(CommonTaskProcessor.java:116)
	at org.apache.dolphinscheduler.server.master.runner.task.BaseTaskProcessor.dispatch(BaseTaskProcessor.java:241)
	at org.apache.dolphinscheduler.server.master.runner.task.BaseTaskProcessor.action(BaseTaskProcessor.java:212)
	at org.apache.dolphinscheduler.server.master.runner.WorkflowExecuteRunnable.submitTaskExec(WorkflowExecuteRunnable.java:990)
	at org.apache.dolphinscheduler.server.master.runner.WorkflowExecuteRunnable.submitStandByTask(WorkflowExecuteRunnable.java:1845)
	at org.apache.dolphinscheduler.server.master.runner.WorkflowExecuteRunnable.submitPostNode(WorkflowExecuteRunnable.java:1367)
	at org.apache.dolphinscheduler.server.master.runner.WorkflowExecuteRunnable.call(WorkflowExecuteRunnable.java:703)
	at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1771)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:830)
```
Yes, I have the same problem while using the pydolphinscheduler CLI to create a workflow from a YAML file. The pydolphinscheduler version I'm using is 4.0.3 and the DolphinScheduler version is 3.1.7. Part of the code is as follows:
```yaml
- name: child2
  task_type: Shell
  command: sh test1.sh
  fail_retry_times: 2
  fail_retry_interval: 3
  worker_group: worker252
  environment_name: worker-[252]-environment
  resource_list: [test1.sh]
  deps:
    - parent
```
I used this code to generate the workflow, and the program raised an error like the one below:
```
Auth token is default token, highly recommend add a token in production, especially you deploy in public network.
/home/hadoop/.local/lib/python3.8/site-packages/pydolphinscheduler/java_gateway.py:324: UserWarning: Using unmatched version of pydolphinscheduler (version 4.0.3) and Java gateway (version 3.1.7) may cause errors. We strongly recommend you to find the matched version (check: https://pypi.org/project/apache-dolphinscheduler)
  gateway = GatewayEntryPoint()
Traceback (most recent call last):
  File "/home/hadoop/.local/bin/pydolphinscheduler", line 8, in <module>
    sys.exit(cli())
  File "/home/hadoop/.local/lib/python3.8/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/home/hadoop/.local/lib/python3.8/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/home/hadoop/.local/lib/python3.8/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/hadoop/.local/lib/python3.8/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/hadoop/.local/lib/python3.8/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/home/hadoop/.local/lib/python3.8/site-packages/pydolphinscheduler/cli/commands.py", line 106, in yaml
    create_workflow(yaml_file)
  File "/home/hadoop/.local/lib/python3.8/site-packages/pydolphinscheduler/core/yaml_workflow.py", line 494, in create_workflow
    YamlWorkflow.parse(yaml_file)
  File "/home/hadoop/.local/lib/python3.8/site-packages/pydolphinscheduler/core/yaml_workflow.py", line 235, in parse
    workflow_name = cls(yaml_file).create_workflow()
  File "/home/hadoop/.local/lib/python3.8/site-packages/pydolphinscheduler/core/yaml_workflow.py", line 179, in create_workflow
    task = self.parse_task(task_data, name2task)
  File "/home/hadoop/.local/lib/python3.8/site-packages/pydolphinscheduler/core/yaml_workflow.py", line 290, in parse_task
    task = task_cls(**task_params)
  File "/home/hadoop/.local/lib/python3.8/site-packages/pydolphinscheduler/tasks/shell.py", line 58, in __init__
    super().__init__(name, TaskType.SHELL, *args, **kwargs)
  File "/home/hadoop/.local/lib/python3.8/site-packages/pydolphinscheduler/core/task.py", line 222, in __init__
    self.get_content()
  File "/home/hadoop/.local/lib/python3.8/site-packages/pydolphinscheduler/core/task.py", line 332, in get_content
    res = self.get_plugin()
  File "/home/hadoop/.local/lib/python3.8/site-packages/pydolphinscheduler/core/task.py", line 319, in get_plugin
    raise PyResPluginException(
pydolphinscheduler.exceptions.PyResPluginException: The execution command of this task is a file, but the resource plugin is empty
```
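The `PyResPluginException` at the bottom of that traceback can be anticipated with a small pre-flight check on the YAML task mapping. The sketch below is not pydolphinscheduler's own code: the extension set, function name, and dict layout are assumptions for illustration. Note that in 4.0.3 the SDK raised this exception even when `resource_list` was set (the behavior reported in this thread); the sketch treats either `resource_plugin` or `resource_list` as sufficient. It also catches a subtle YAML pitfall: `deps:- parent` parses as the string `"- parent"`, not a one-element list.

```python
# Hypothetical pre-flight validation of a single task entry from the YAML
# workflow file, mirroring (not reproducing) the SDK's resource-plugin check.

SCRIPT_EXTENSIONS = (".sh", ".py", ".sql")  # assumed set, for illustration only


def validate_task(task):
    """Return a list of problems found in one YAML task mapping (a dict)."""
    problems = []

    # `deps:- parent` in YAML yields the string "- parent", silently
    # breaking dependency wiring; deps must be a list of task names.
    deps = task.get("deps")
    if deps is not None and not isinstance(deps, list):
        problems.append("deps must be a list, got %s" % type(deps).__name__)

    # If the command references a script file, some resource source
    # (resource_plugin or resource_list) must be configured.
    command = task.get("command", "")
    runs_script = any(word.endswith(SCRIPT_EXTENSIONS) for word in command.split())
    if runs_script and not task.get("resource_plugin") and not task.get("resource_list"):
        problems.append(
            "command references a script file but no "
            "resource_plugin/resource_list is configured"
        )
    return problems


bad_task = {
    "name": "child2",
    "task_type": "Shell",
    "command": "sh test1.sh",
    "deps": "- parent",  # malformed: should be ["parent"]
}
print(validate_task(bad_task))  # two problems reported
```

Running such a check before calling the CLI surfaces both issues locally instead of deep inside `task.get_plugin()`.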
The error reminded me to add a resource plugin to the YAML file. My resource center is built on HDFS, and after checking the source code and the official docs, pydolphinscheduler supports Local, GitHub, GitLab, OSS, and S3 as resource plugins, but not HDFS. So should I create a new resource plugin for HDFS, or is there some other way to create a workflow that uses resource center files from a YAML file?
Looking forward to your reply.
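On the "create a new resource plugin for HDFS" idea: the existing plugins (Local, GitHub, GitLab, OSS, S3) each expose file content to the SDK, so an HDFS variant could plausibly follow the same shape. Below is a rough sketch only, assuming the WebHDFS REST endpoint that HDFS ships with and treating the plugin interface as "a class with a `read_file(path)` method"; the class name, constructor arguments, and interface assumption are not part of pydolphinscheduler, so check the SDK's resource-plugin base class before relying on this.

```python
from urllib.request import urlopen


class HDFS:
    """Sketch of an HDFS resource plugin in the spirit of the Local/S3 ones.

    Reads a file through WebHDFS (GET .../webhdfs/v1/<path>?op=OPEN).
    All names and parameters here are hypothetical, not SDK API.
    """

    def __init__(self, namenode, port=9870, prefix="", opener=urlopen):
        self.base = "http://%s:%d/webhdfs/v1" % (namenode, port)
        self.prefix = prefix.rstrip("/")
        self._open = opener  # injectable for testing without a cluster

    def read_file(self, suf):
        # WebHDFS OPEN returns the raw file bytes (urlopen follows the
        # redirect to the datanode automatically).
        url = "%s%s/%s?op=OPEN" % (self.base, self.prefix, suf)
        with self._open(url) as resp:
            return resp.read().decode("utf-8")
```

If something like this were wired into the SDK, a task could point its resource plugin at `HDFS("namenode-host", prefix="/dolphinscheduler/resources")`; until upstream support exists, it remains a local workaround sketch.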
Maybe you can try my fixed dolphinscheduler-sdk: https://github.com/Treasure-u/dolphinscheduler-sdk-python
Hi @Treasure-u, @baratamavinash225, @Andy-xu-007, this issue has already been fixed in #116, and we will release it in version 4.0.4 within 3 days. Thanks for your bug reports!