rgthree-comfy's Issues

maximum recursion depth exceeded

When I want to use OpenPose after an Upscale, I get this error, but if I take the image before the Upscale, I don't get any error.

File "threading.py", line 1016, in _bootstrap_inner
File "threading.py", line 953, in run
File "D:\ComfyUI_windows_portable\ComfyUI\main.py", line 95, in prompt_worker
  e.execute(item[2], prompt_id, item[3], item[4])
File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy\__init__.py", line 173, in rgthree_execute
  return self.old_execute(*args, **kwargs)
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 371, in execute
  to_execute = sorted(list(map(lambda a: (len(recursive_will_execute(prompt, self.outputs, a[-1])), a[-1]), to_execute)))
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 371, in <lambda>
  to_execute = sorted(list(map(lambda a: (len(recursive_will_execute(prompt, self.outputs, a[-1])), a[-1]), to_execute)))
File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy\__init__.py", line 195, in rgthree_recursive_will_execute
  will_execute_value = execution.recursive_will_execute(prompt, outputs, input_unique_id, *args, **kwargs)
File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy\__init__.py", line 195, in rgthree_recursive_will_execute
  will_execute_value = execution.recursive_will_execute(prompt, outputs, input_unique_id, *args, **kwargs)
File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy\__init__.py", line 195, in rgthree_recursive_will_execute
  will_execute_value = execution.recursive_will_execute(prompt, outputs, input_unique_id, *args, **kwargs)
[Previous line repeated 988 more times]
File "D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy\__init__.py", line 179, in rgthree_recursive_will_execute
  will_execute = RgthreePatchRecursiveExecute_Set_patch_recursive_execution_to_false_if_not_working(unique_id)
RecursionError: maximum recursion depth exceeded

Here's the link to my workflow : https://drive.google.com/drive/folders/1-Gt4BCxEeBnFwUrVlITmr6SLwAtIarsG?usp=sharing

image
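For anyone hitting this while the execution patch is being sorted out, one possible stop-gap, assuming the graph is merely very deep rather than genuinely cyclic, is to raise Python's recursion limit before ComfyUI starts processing prompts (for example near the top of ComfyUI's main.py). This is only a hypothetical workaround sketch, not a fix for the patched recursion itself:

# Hypothetical workaround sketch, not an official fix: raise the interpreter's
# recursion ceiling so a very deep (but finite) node graph can still be walked.
# The value 10_000 is an arbitrary assumption; adjust as needed.
import sys

if sys.getrecursionlimit() < 10_000:
    sys.setrecursionlimit(10_000)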

missing nodes in new installation

Hi, I am very new to your extension and ComfyUI.
I just installed it to use your Seed node, but I cannot find it.
I only get Reroute, Node Combiner, Fast Bypasser, and Fast Muter in the node menu.

Where are Seed, Lora Loader Stack, and Context?
When will Power Prompt be released?

I would also suggest that the Reroute node show a quick toggle to change its direction, instead of doing it from the node menu.

IMPORT FAILED?

Just downloaded your plugin for ComfyUI (I'm still new to Comfy, btw) and followed your instructions for the simple git pull, but the import failed... Here is what it shows in the console:

Error:
[WinError 1314] A required privilege is not held by the client: 'F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyLiterals\js' -> 'F:\ComfyUI_windows_portable\ComfyUI\web\extensions\ComfyLiterals'
Failed to create symlink to F:\ComfyUI_windows_portable\ComfyUI\web\extensions\ComfyLiterals. Please copy the folder manually.
Source: F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyLiterals\js
Target: F:\ComfyUI_windows_portable\ComfyUI\web\extensions\ComfyLiterals

Loading: ComfyUI-Manager (V0.30.5)

ComfyUI Revision: 1487 [2381d36e] | Released on '2023-09-25'

INFO:comfyui-prompt-control:Use STYLE:weight_interpretation:normalization at the start of a prompt to use advanced encodings
INFO:comfyui-prompt-control:Weight interpretations available: comfy,A1111,compel,comfy++,down_weight,perp
INFO:comfyui-prompt-control:Normalization types available: none,mean,length,length+mean
Traceback (most recent call last):
  File "F:\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1725, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy\__init__.py", line 92, in <module>
    rgthree_config_user = json.load(file)
  File "json\__init__.py", line 293, in load
  File "json\__init__.py", line 346, in loads
  File "json\decoder.py", line 337, in decode
  File "json\decoder.py", line 355, in raw_decode
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

Cannot import F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy module for custom nodes: Expecting value: line 1 column 1 (char 0)

It says to copy the ComfyLiterals folder manually, but I don't know what it means by that, because all I did was git pull... Do I just re-download that folder from GitHub and put it in there, or...?

Odd error and reroute incompatibilities.

I'm unable to plug rg3 reroutes into normal reroutes, which makes converting old workflows over really fiddly.

Also, this bug just happened on loading a workflow, and it disappoints me that I might have to ditch rg3 just to run my workflow :<
Screenshot 2023-08-24 014201

Consideration - Fast Actions Button - Is the 'action' button necessary?

Is this button really necessary to confirm the choices?
Indeed, I noticed that when the workflow loads, I have to start by pressing this button to validate the displayed choices. If I don't do that, the actions don't proceed as displayed.
Furthermore, I often forget (my fault) to confirm my choices because, for me, it is logical that when I see 'enable,' for example, the action should be triggered without the need to validate it.

So my question is: could we do without this button?

Context Switch not picking the first non-null value

Or better, I probably didn't understand well enough how it should work.

I have the following situation, where 4 input images have to arrive at an upscaler. They come from 4 different functions of the workflow.

At any given time, only one of the 4 inputs is actually carrying an image.

I can't solve this problem with ordinary switches as, I'm told, ComfyUI will not work if inputs are empty. So, I thought your Context Switch would work as "automatically choose the first non-null context to continue onward with".

Hence the configuration:

Screenshot 2023-09-02 at 10 33 55

Now, let's say that the only non-null image in my case is input 2, which carries the Refiner image. I'd expect that to be selected and passed to the Context Switch. Instead, I receive this error:

Traceback (most recent call last):
  File "/Users/xyz/Desktop/AI/Tools/ComfyUI/execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/xyz/Desktop/AI/Tools/ComfyUI/execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/xyz/Desktop/AI/Tools/ComfyUI/execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/xyz/Desktop/AI/Tools/ComfyUI/comfy_extras/nodes_upscale_model.py", line 39, in upscale
    in_img = image.movedim(-1,-3).to(device)
             ^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'movedim'

I also attach the whole workflow for more context:

Context Switch Test
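For reference, the behavior the reporter expected ("automatically choose the first non-null context to continue onward with") boils down to first-non-None selection. Below is a minimal illustrative sketch of that logic as a generic ComfyUI-style node; it is not rgthree-comfy's actual Context Switch code, and the class name, the wildcard "*" input type, and the registration details are assumptions:

# Illustrative sketch only -- not rgthree-comfy's code. Shows the "first non-null
# input wins" behavior the reporter expected from Context Switch.
class FirstNonNullSwitch:
    @classmethod
    def INPUT_TYPES(cls):
        # "*" as a wildcard input type is an assumption borrowed from other custom-node packs.
        return {"optional": {"input_1": ("*",), "input_2": ("*",),
                             "input_3": ("*",), "input_4": ("*",)}}

    RETURN_TYPES = ("*",)
    FUNCTION = "pick"
    CATEGORY = "examples"

    def pick(self, input_1=None, input_2=None, input_3=None, input_4=None):
        # Return the first input that is not None. Downstream nodes such as
        # ImageUpscaleWithModel crash with "'NoneType' object has no attribute
        # 'movedim'" (as in the traceback above) when they receive None instead
        # of an image tensor.
        for value in (input_1, input_2, input_3, input_4):
            if value is not None:
                return (value,)
        return (None,)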

rgthree-comfy module for custom nodes: glob() got an unexpected keyword argument 'root_dir'

Not sure why this is happening, it may be after I installed mtb but I installed a few so it could be anything.. but I wanted to let you know this error is being thrown.

[comfy_mtb] | INFO -> loaded 52 nodes successfuly
[comfy_mtb] | INFO -> Some nodes (3) could not be loaded. This can be ignored, but go to http://none:6006/mtb if you want more information.
Traceback (most recent call last):
  File "/notebooks/ComfyUI/nodes.py", line 1735, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/notebooks/ComfyUI/custom_nodes/rgthree-comfy/__init__.py", line 98, in <module>
    for file in glob.glob('*.py', root_dir=DIR_PY) + glob.glob('*.js', root_dir=os.path.join(DIR_DEV_WEB, 'js')):
TypeError: glob() got an unexpected keyword argument 'root_dir'

Cannot import /notebooks/ComfyUI/custom_nodes/rgthree-comfy module for custom nodes: glob() got an unexpected keyword argument 'root_dir'

I was able to fix this by replacing this line:

for file in glob.glob('*.py', root_dir=DIR_PY) + glob.glob('*.js', root_dir=os.path.join(DIR_DEV_WEB, 'js')):

with this:

for file in glob.glob(os.path.join(DIR_PY, '*.py')) + glob.glob(os.path.join(DIR_DEV_WEB, 'js', '*.js')):

not tested on windows..
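For context, the root_dir keyword argument of glob.glob() was only added in Python 3.10, which is why older environments (such as the notebook above) raise this TypeError. Below is a hedged, version-portable sketch of the same file listing; the list_basenames helper and the placeholder directory values are purely illustrative and not the project's actual code:

# Portable sketch: emulate glob(pattern, root_dir=directory) on Python < 3.10 by
# globbing the joined path and stripping the directory back off, so only relative
# file names are returned (matching what root_dir would have produced).
import os
import glob

DIR_PY = "py"        # placeholder paths for illustration only
DIR_DEV_WEB = "web"

def list_basenames(directory, pattern):
    return [os.path.basename(p) for p in glob.glob(os.path.join(directory, pattern))]

files = list_basenames(DIR_PY, '*.py') + list_basenames(os.path.join(DIR_DEV_WEB, 'js'), '*.js')
print(files)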

Popup Image for LoRAs (in LoRA Stacker)

Hey all,

I have a carp load of LoRAs and sometimes their function escapes me. I was hoping to add a hover-over option when navigating through them in the LoRA Stacker. Hoping I could get help with that (or if that's something RG3 wanted to tackle/add for everyone (=

I tried to figure this out myself, but I'm not an expert coder, no experience with Python...not sure how that plays into things, if at all. Had looked through the html, TS and js files. Particularly the Constants.x files – couldn't find any HTML/js specific to the LoRA Stacker.

Would just copy pasta some "Image Popup" code from the interweb but I couldn't find any specific code with regards to the LoRA Nav. Menu. Figured 1 could just throw in a .jpg with the name attached to the LoRA file to keep things organized/easy.

Thanks for any help! (And thanks for your amazing work RG3, you're a legend, even if your QB counterpart isn't lol)

Edit: Silly me, forgot to use the inspect tool to investigate. I see it is a part of the LLL submenu system, so hopefully now I can toy around with this!

Reroute nodes keep disconnecting

For some reason, two reroute nodes keep appearing disconnected whenever I refresh the browser or reload my workflow. These two, and only these two:

Screenshot 2023-09-04 at 12 04 37

I even tried to delete them and recreate them, but the problem persists. Even more strangely, sometimes they remain connected upon refresh, and the CLIPTextEncodeSDXL node they connect to seems to be working fine even when they are disconnected.

Notice that this issue has existed for a long time, not just in the current version of the workflow. I just forgot to report it.

The JSON is here.

Feature request: Context Big passing checkpoint name

Just like Context Big now passes sampler and scheduler name, it would be great if it could pass the checkpoint name, too, and separately from the model value. So I could address this unfortunate situation:

Screenshot 2023-09-16 at 23 10 47

In reality, that checkpoint name doesn't have to be transported any further than the Loader node right there, but, for the sake of having a separate control panel, it would be extremely useful.

Thank you

Installer issue.

Not a big issue, but when downloading the zip linked here, it saves with an incorrect folder name, which throws errors in ComfyUI because it can't find the path it is trying to run.
Deleting -main from the end of the folder name fixes it.
That's if you're installing it this way instead of following the instructions, btw :P

Fix for multiple DisplayInt nodes

When there are multiple DisplayInt nodes in a graph, whenever any of them gets executed, the text widget of the most recently created node gets updated instead of the text widget of the executing node because the text widget variable showValueWidget in js/display_int.js is being shared between all the nodes. This can be fixed by attaching it to the node object (this) instead:

 import { app } from "../../scripts/app.js";
 import { ComfyWidgets } from "../../scripts/widgets.js";
 import { addConnectionLayoutSupport } from "./utils.js";
 app.registerExtension({
     name: "rgthree.DisplayInt",
     async beforeRegisterNodeDef(nodeType, nodeData, app) {
         if (nodeData.name === "Display Int (rgthree)") {
-            let showValueWidget;
             nodeType.title_mode = LiteGraph.NO_TITLE;
             const onNodeCreated = nodeType.prototype.onNodeCreated;
             nodeType.prototype.onNodeCreated = function () {
                 onNodeCreated ? onNodeCreated.apply(this, []) : undefined;
-                showValueWidget = ComfyWidgets["STRING"](this, "output", ["STRING", { multiline: true }], app).widget;
-                showValueWidget.inputEl.readOnly = true;
-                showValueWidget.serializeValue = async (node, index) => {
+                this.showValueWidget = ComfyWidgets["STRING"](this, "output", ["STRING", { multiline: true }], app).widget;
+                this.showValueWidget.inputEl.readOnly = true;
+                this.showValueWidget.serializeValue = async (node, index) => {
                     node.widgets_values[index] = '';
                     return '';
                 };
             };
             addConnectionLayoutSupport(nodeType, app, [['Left'], ['Right']]);
             const onExecuted = nodeType.prototype.onExecuted;
             nodeType.prototype.onExecuted = function (message) {
                 onExecuted === null || onExecuted === void 0 ? void 0 : onExecuted.apply(this, [message]);
-                showValueWidget.value = message.text[0];
+                this.showValueWidget.value = message.text[0];
             };
         }
     },
 });

Workflow Link Fixer doesn't save the fixed workflow

This explains why so many people on Reddit told me that whenever they load my workflow they find a bunch of nodes disconnected, and why, on random reloads, I experience the same with two nodes:

Screenshot 2023-09-16 at 20 29 42

However, I cannot fix it. When I click the Fix and Save button, nothing happens. Tried on Vivaldi and Safari. No ad blockers.

Thank you so much!

I don't know who you are, but thank you so much for making these nodes, they make ComfyUI much easier to use!

comfyui search become very long

The ComfyUI search becomes very long because the samplers get added to the search options; this happens when I activate this node.
normal (without the node)
image

with the node
image
image

Non-backwards compatible changes & tagging

Hey, thanks for your excellent work! Just recently (3hrs ago), you released some non-backwards compatible changes to the repo. That's fine and you can do what you want. Could you please tag your versions though so I can version lock this repo for my workflow?

KSampler is not working

Error occurred when executing KSampler:

module 'comfy.sample' has no attribute 'broadcast_cond'

File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "C:\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "C:\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1237, in sample
return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
File "C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\AIT\AITemplate\AITemplate.py", line 176, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
return original_sample(*args, **kwargs)
File "C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 109, in animatediff_sample
return orig_comfy_sample(model, *args, **kwargs)
File "C:\ComfyUI_windows_portable\ComfyUI\custom_nodes\AIT\AITemplate\AITemplate.py", line 290, in sample
positive_copy = comfy.sample.broadcast_cond(positive, noise.shape[0], device)

Reroute node compatibility with Save Image w/ Metadata node (Comfy Image Saver suite)

Partially connected to #24 (the quest to transmit the checkpoint name over long distances), I'm encountering challenges in connecting a Checkpoint Selector node, from Comfy Image Saver suite, to their Save Image w/Metadata node, passing through your reroute node.

I opened an Issue with them:

giriss/comfy-image-saver#12

This is for visibility and documentation only. I'll close it once it gets solved on the other side.

add enhancement reroute

Add the possibility to resize the node like in standard ComfyUI, by placing the cursor at the bottom-right corner of the node and resizing it by dragging.
Also, in your implementation there are no additional menus:
image
as well as a severe size limitation:
image

Thank you so much for your custom nodes

Suggestion: Custom Pipe Node

The 'pipe' nodes already exist, but they are predefined. I rarely use them because they don't allow me to transmit all the data I want.

The idea would be to connect any node to the 'custom pipe' node in order to truly have a workflow with as few lines as possible.

custom pipe

Do you think such a node makes sense, and does it appeal to you?

power prompt lora prompt found and loaded problem

There is a problem when I load some LoRAs (it finds another LoRA even though the path is right):

🛈 rgthree Power Prompt: Found "Cloth\Rich.safetensors" for "XQ\ziedRichDota2d" in prompt
✓ rgthree Power Prompt: Loaded "Cloth\Rich.safetensors" from prompt
🛈 rgthree Power Prompt: 1 Loras processed; stripping tags for TEXT output.

title reroutes node

Your 'reroute' node is great, but would it be possible to display the title?

Request: Combine to Mute or Bypass all combined inputs at once

Title says it all. I want to combine a bunch of inputs into one 'button' to press, so I can toggle certain generations and features on or off.

With the current implementation, I don't see a way to do this, as once combined, each input is still individually selectable.

Personally, I would implement it as a toggle to the combiner -> (combine inputs into 1)

Reroute nodes keep disconnecting

Unfortunately, I have to report that the recent change did not fix the issue. After the Fix in Place of last week, I still have a few nodes that keep disconnecting randomly every time I reload the workflow or restart ComfyUI.

What's worse, these nodes would be very hard, if not impossible, to troubleshoot for anyone not intimately familiar with the design decisions I made for my workflow. And the new fix function doesn't even detect the issue.

Version 4.0 is ready to go, but I cannot release it until the reroute nodes issue is solved, and I only use your reroute nodes :)

Happy to share the new workflow, if you need it.

Graphic glitch

This is not really an issue, more of an inconsistency..
Your excellent Nodes don't take into account the "Link Render Mode" setting of the user and draw things like its always in the "Spline" setting.
I use "Linear" and it draws the cable's middle dot at an offset
Bug1a
The reason must be that it thinks the cables are like this
Bug1b
Not a big deal, just a glitch.. 😉

.. and another thanks for your really great Nodes!

Node Disconnection from bypassers when used as a template (possible bug)

image

The left side works as expected, but when I save this as a template, many of the effects nodes become disconnected from the bypassers, requiring a manual re-attachment for the bypasser to work again. At first, I put all the connections into a node collector, but then, thinking that might be the issue, I hooked them right into the bypass repeater itself, which is connected to a "disable all fx" switch at the top, and I also hooked them up so each effect can be bypassed/enabled individually.

All my reroutes are rerouteprimitive/pyssss versus native, as I read that disconnections can happen with the regular reroute.

When saved as a workflow, all is well - no issues. The only difference is that in the upper left there's a connection point for an image, whereas in the full 'workflow' version there's an Input LoadImage (the template version is designed to be chained, hence the difference).

Attached is the JSON from the Template if that helps.
AegisFlow_MaskFX_Template.json

input files:
maskfx
swimsuit_watercolor
horizontal mask

Default size reroute

Hello, is it possible to choose a standard reroute size without changing the code?

bandicam.2023-10-01.14-50-11-527.mp4

At 23 seconds, you can see the problem I encounter with the small reroute.

Feature request: "Hub" node

Almost every day, I find myself wanting a node that:

  1. accepts any input (even multiple inputs of the same type)
  2. picks the first non-null input as output (similarly to what the Context Switch node does)

For example, right now, I'd really need it to pick the first non-null image among the many possibly generated by different parts of my workflow and send that image to another node to be saved with metadata readable by A1111/SD.Next/SD Prompt Reader:

Screenshot 2023-09-17 at 09 50 40

Maybe it's the wrong way to think about the flow, but it's very intuitive to me. And I had other use cases for such a node.

Thanks!

USE CASE question: how to use bypass with a "conditional" workflow

Hi,

I have a node that picks a prompt process based on a random integer. Given an input, the workflow runs the node that corresponds to the integer picked. However, in the console, I see the non-selected nodes are still doing their prompting. Is it possible to use the mute/bypass nodes to turn them off? The workflow is shown below:

image

So really what I want to do is if Wildcards is picked, to bypass/mute the other three, for example.

Bypasser not bypassing? (or me not understanding?)

I'm starting to experiment with the Mute / Bypass Repeater node, using it to bypass the entire Detailer function of my workflow, rather than just some portions of it. It's very useful to save computing cycles, but I'm afraid I've not completely grasped how it works.

In the situation below, I have two main functions generating an image: either the SDXL Base+Refiner or the ReVision. They end up into a context switch which passes the non-null image to the Detailer and the Upscaler, organized in a chain:

Context Switch Test - Detail

I hook all the nodes of the Detailer to a Mute / Bypass Repeater node and that node to a Bypasser node.

At this point, I ask to bypass the Detailer, but not the Upscaler:

Screenshot 2023-09-03 at 01 09 25

I would expect that the generated image (coming either from the Refiner or ReVision) would pass through all the Detailer nodes and end up being processed by the Upscaler. But no.

If the Detailer is bypassed, ComfyUI simply refuses to work. I press Queue Prompt and nothing happens. No errors, no warnings, no console messages. It simply doesn't do anything.

Similarly, if I want to bypass both the Detailer and the Upscaler, and simply keep my image as it comes out of the Refiner or ReVision, I can't. ComfyUI doesn't operate.

For the workflow to work, I have to un-bypass at least the Detailer. Then, it does its thing splendidly. All your nodes are very powerful and useful, thank you.

As usual, the full workflow for reference:

Context Switch Test

'NoneType' object has no attribute 'movedim'

I am new to this, so I don't understand what the problem is. Here is the error when I run Stable Diffusion:
image
I was trying to load a workflow but when I got everything that I needed for it to run correctly it didn't work:
image
What is the reason for this problem and how can I fix this?

Feature Request: Lora Loader Stack view info feature.

It'd be nice if you incorporated the original Lora Loader's View Info feature, which looks up the LoRA's info on Civitai to glean trigger words.

You could have a sub-menu:

  • View Info >
    • View Lora 01 Info
    • View Lora 02 Info
    • View Lora 03 Info
    • View Lora 04 Info

Action button collector

I have an idea: add a node that collects Fast Actions Buttons for centralized control of actions.
It would highlight the most recently run action (or highlight the first action if no action has been run yet), like this:

Screenshot 2023-10-15 205425

and it could be set as a floating panel, like this:

Animation (5)

It would be even better if you could add an animation for the running action:
Animation (1)

Fast muter not working with Lora Stack

image
In the example above, it works only if the first LoRA stacker is enabled. Enabling any other stacker, or any combination of multiple stackers, does not work (or only the first one is applied, if it is enabled).

In fact, when any stacker other than the first is enabled, the output is no longer a lora_stack, as shown by the Apply LoRA Stack node throwing a warning:
image

Error with Context Switch with all null input

Normally I can turn off functionality by breaking the input connection to a part of the circuit I don't want to run. I can't do that with just a Context Switch by disabling its inputs, though: it stops the process and throws an error, so I'm required to put another Context node after the switch and mute that instead. This is overly complicated and frustrating, especially without any logic blocks or the ability to disable multiple blocks with one click to work around it more easily.

Nested context muting

Preface:

Perhaps I'm just oblivious to an obvious logic problem

Issue:

Contexts connected to the output of a muted context are still processed, resulting in errors.

While attempting to expand my workflow with toggle-able options via muting, I tried using context muting at the end of a mutable section of workflow. Once I muted said section, I would receive errors upon attempting to run. Is this something that can be done as is, or would this require additions to the context module on your part? As I prefaced, perhaps I'm just missing something.

Example:

In my use case, I was attempting to create toggles allowing me to save multiple image outputs in multiple locations, using mutable contexts feeding into the WAS Suite's image saver. Here's a mock-up example where the second upscaler (U2) workflow is muted:
Screenshot_2023-08-22_02-30-59

Obviously I could mute each of the contexts corresponding to a muted section, but in a more complex non mock-up workflow this would quickly become tedious.

Would it be possible to add a 'Mute' boolean to the I/O of contexts? If the context were able to read and change its mute state according to the boolean, it would allow for more complex workflows, especially if it could be carried by the pipe and overridden by a new boolean input like other I/O. Here's another basic mock-up with U2 disabled showing how such a workflow could be beneficial:
Screenshot_2023-08-22_02-26-42

This functionality would nicely complement the logic nodes of other custom nodes packs.

Post Script:

Seriously, I cannot thank you enough for the amazing work you've done in releasing these nodes, they really are a game changer!

Feature request: Display Any node printing checkpoint name + passing inputs to other nodes

As the title says: the new Display Any node is very useful for debugging, but it would be tremendously more useful if it would do two additional things:

  1. Print the checkpoint name in the same way it does for the sampler name and the scheduler name. Right now, an attempt to link a Model output in a Checkpoint Loader node to the input of a Display Any node generates something like this: <comfy.model_patcher.ModelPatcher object at 0x2d0f6b950>

  2. Pass any received input to other nodes. The printing within the node is much needed (I use the ttn textDebug for this, but it stopped printing the name inside the node weeks ago and it's not optimal). However, values like checkpoint name, sampler name, and scheduler name I'd like to print to terminal, concat with my labels, and save to files for various reasons.

Thanks!
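On point 2, a pass-through display node is conceptually simple. Here is a minimal illustrative sketch written as a generic ComfyUI-style node; it is not rgthree's Display Any implementation, and the class name, the wildcard "*" input type, and the exact return convention shown here are assumptions about how such a node could be wired up:

# Illustrative sketch only (not rgthree's Display Any): print the received value
# to the terminal, show it in the node's UI, and pass it through unchanged so
# downstream nodes (e.g. a save-with-metadata node) can keep using it.
class DisplayAndPassThrough:
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"value": ("*",)}}  # "*" wildcard input is an assumption

    RETURN_TYPES = ("*",)
    RETURN_NAMES = ("value",)
    FUNCTION = "show"
    OUTPUT_NODE = True
    CATEGORY = "examples"

    def show(self, value):
        text = str(value)
        print(f"[DisplayAndPassThrough] {text}")  # print to the terminal/log
        # Return UI text for the node widget plus the untouched value for outputs.
        return {"ui": {"text": [text]}, "result": (value,)}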
