
split node WD14Tagger|pysssss #80

Open · wants to merge 2 commits into base: main

Conversation

JeryZeng

Split WD14Tagger|pysssss into WD14ModelLoader|pysssss and WD14TaggerOnly|pysssss to use the ComfyUI cache and reduce the cost of repeated tagging runs:

  • WD14ModelLoader|pysssss caches the ONNX inference session, avoiding a model load on every run
  • WD14TaggerOnly|pysssss only performs inference
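The caching idea behind the loader node can be sketched as follows. This is a hypothetical illustration, not the PR's actual code: a module-level cache keyed by model name returns the same session on repeated calls, so only the first run pays the load cost. `create_session` stands in for something like `onnxruntime.InferenceSession(path, providers=...)`.

```python
# Sketch (hypothetical names): cache one inference session per model name
# so repeated tagger runs reuse it instead of reloading the .onnx file.

_SESSION_CACHE = {}  # model name -> loaded session


def load_session(model_name, create_session):
    """Return a cached session for model_name, creating it on first use."""
    if model_name not in _SESSION_CACHE:
        _SESSION_CACHE[model_name] = create_session(model_name)
    return _SESSION_CACHE[model_name]


if __name__ == "__main__":
    calls = []
    fake_create = lambda name: calls.append(name) or f"session:{name}"
    s1 = load_session("wd-v1-4-moat-tagger-v2", fake_create)
    s2 = load_session("wd-v1-4-moat-tagger-v2", fake_create)
    assert s1 is s2 and calls == ["wd-v1-4-moat-tagger-v2"]  # loaded once
```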

@JeryZeng
Author

I didn't remove the WD14Tagger|pysssss node, for compatibility reasons.

@metrosound

metrosound commented Aug 30, 2024

I tried your version and it was great! But I get an error when joining strings with the Join Strings node. It works fine with the original version; I only get the error with yours.

Error occurred when executing JoinStrings:

can only concatenate list (not "str") to list

File "F:\StabilityMatrix\Data\Packages\ComfyUI\execution.py", line 317, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "F:\StabilityMatrix\Data\Packages\ComfyUI\execution.py", line 192, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "F:\StabilityMatrix\Data\Packages\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "F:\StabilityMatrix\Data\Packages\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "F:\StabilityMatrix\Data\Packages\ComfyUI\custom_nodes\ComfyUI-KJNodes\nodes\nodes.py", line 217, in joinstring
joined_string = string1 + delimiter + string2
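The traceback above comes down to the split node returning its tags wrapped in a list rather than as a plain string, so `string1 + delimiter + string2` in JoinStrings concatenates a list with a str. A minimal reproduction (tag values are made up for illustration):

```python
# Minimal repro of the JoinStrings failure: the node output is a
# one-element list (e.g. ["tag1, tag2"]) instead of a plain string.
string1 = ["1girl, solo"]      # list output from the split node
delimiter = ", "
string2 = "masterpiece"

try:
    joined = string1 + delimiter + string2   # same expression as nodes.py line 217
except TypeError as e:
    print(e)  # can only concatenate list (not "str") to list

# Fix on the producing side: emit the string itself, not a one-element list.
joined = string1[0] + delimiter + string2
print(joined)  # 1girl, solo, masterpiece
```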

@t00350320

same error

@t00350320

> (quoting @metrosound's JoinStrings error and traceback above)

A list result will not join with other string prompt content directly; just add a node that converts the list to a string. You can modify the author's wd14tagger.py like this:

defaults = {
    "model": "wd-v1-4-moat-tagger-v2",
    "threshold": 0.35,
    "character_threshold": 0.85,
    "replace_underscore": False,
    "trailing_comma": False,
    "exclude_tags": "",
    "ortProviders": ["CUDAExecutionProvider", "CPUExecutionProvider"],
    "HF_ENDPOINT": "https://huggingface.co",
    "list_string": ""
}
class WD14TaggerListTostring:

    @classmethod
    def INPUT_TYPES(s):
        return {"required": {
            "list_string": ("STRING", {"default": defaults["list_string"]}),
        }}

    RETURN_TYPES = ("STRING",)
    FUNCTION = "list2string"

    CATEGORY = "image"

    def list2string(self, list_string=""):
        # The tagger output arrives as a one-element list, e.g. ['tag1, tag2'].
        string1 = list_string[0]
        # Strip any stray brackets/quotes left over from the list repr.
        cleaned_str = string1.strip('[]"')
        return (cleaned_str,)

NODE_CLASS_MAPPINGS = {
    "WD14Tagger|pysssss": WD14Tagger,
    "WD14ModelLoader|pysssss": WD14ModelLoader,
    "WD14TaggerOnly|pysssss": WD14TaggerOnly,
    "WD14TaggerListTostring": WD14TaggerListTostring,
}

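A more robust variant of the conversion is to join the list elements directly rather than stripping `[]"` characters, which would also eat brackets or quotes that legitimately appear inside tags. A hedged sketch (hypothetical helper, not the repo's code):

```python
# Alternative list-to-string conversion: join elements instead of
# strip('[]"'), so tags containing brackets or quotes survive intact.
def list_to_string(value, delimiter=", "):
    """Accept either a plain string or a list of strings from the tagger."""
    if isinstance(value, str):
        return value
    return delimiter.join(str(item) for item in value)
```

For example, `list_to_string(["1girl, solo"])` returns `"1girl, solo"` unchanged, and `list_to_string(["a", "b"])` returns `"a, b"`.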
@JeryZeng
Author

JeryZeng commented Sep 4, 2024

@metrosound @t00350320 fixed

@samsa-ai-admin

Error occurred when executing WD14Tagger|pysssss:

[ONNXRuntimeError] : 1 : FAIL : Load model from /home/ubuntu/ComfyUI/custom_nodes/ComfyUI-WD14-Tagger/models/wd-swinv2-tagger-v3.onnx failed:/onnxruntime_src/onnxruntime/core/graph/model_load_utils.h:46 void onnxruntime::model_load_utils::ValidateOpsetForDomain(const std::unordered_map, int>&, const onnxruntime::logging::Logger&, bool, const string&, int) ONNX Runtime only guarantees support for models stamped with official released onnx opset versions. Opset 4 is under development and support for this is limited. The operator schemas and or other functionality may change before next ONNX release and in this case ONNX Runtime will not guarantee backward compatibility. Current official support for domain ai.onnx.ml is till opset 3.

File "/home/ubuntu/ComfyUI/execution.py", line 152, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "/home/ubuntu/ComfyUI/execution.py", line 82, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "/home/ubuntu/ComfyUI/execution.py", line 75, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "/home/ubuntu/ComfyUI/custom_nodes/ComfyUI-WD14-Tagger/wd14tagger.py", line 200, in tag
tags.append(wait_for_async(lambda: tag(image, model, threshold, character_threshold, exclude_tags, replace_underscore, trailing_comma)))
File "/home/ubuntu/ComfyUI/custom_nodes/ComfyUI-WD14-Tagger/pysssss.py", line 214, in wait_for_async
loop.run_until_complete(run_async())
File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
return future.result()
File "/home/ubuntu/ComfyUI/custom_nodes/ComfyUI-WD14-Tagger/pysssss.py", line 204, in run_async
r = await async_fn()
File "/home/ubuntu/ComfyUI/custom_nodes/ComfyUI-WD14-Tagger/wd14tagger.py", line 58, in tag
model = InferenceSession(name, providers=defaults["ortProviders"])
File "/opt/tensorflow/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 383, in init
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/opt/tensorflow/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 424, in _create_inference_session
sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)

@JeryZeng
Author

> (quoting @samsa-ai-admin's ONNXRuntimeError and traceback above)

It's maybe an ONNX Runtime version or compatibility issue.
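The "Current official support for domain ai.onnx.ml is till opset 3" message usually means the installed onnxruntime predates the opset the model was exported with, so upgrading onnxruntime (or onnxruntime-gpu) is a common fix. A hedged helper for checking whether the installed version string meets a minimum; the minimum shown is an assumption for illustration, not taken from this thread:

```python
# Hedged sketch: compare onnxruntime version strings numerically to decide
# whether an upgrade is likely needed. The "1.16.0" floor is an assumption.
def version_tuple(v):
    """'1.16.3' -> (1, 16, 3); stops at the first non-numeric component."""
    parts = []
    for piece in v.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        if not digits:
            break
        parts.append(int(digits))
    return tuple(parts)


def needs_upgrade(installed, minimum="1.16.0"):
    return version_tuple(installed) < version_tuple(minimum)
```

In practice you would pass `onnxruntime.__version__` as `installed` and re-test model loading after upgrading.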
