ComfyUI-LLM-Session
by kantan-kanto · 0 stars
Local LLM session nodes for ComfyUI built on GGUF models and llama.cpp, supporting Llama, Mistral, Qwen, DeepSeek, GLM, Gemma, Phi, LLaVA, and gpt-oss. Enables both user-to-model chat and model-to-model dialogue without external runtimes such as Ollama.
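The model-to-model dialogue the description mentions can be sketched as a simple turn-taking loop in which each model sees the other's replies as user messages. This is an illustrative sketch only: the `generate(history) -> str` callables stand in for llama.cpp-backed models (e.g. a wrapper around `llama_cpp.Llama.create_chat_completion`), and none of these names come from the node pack itself.

```python
# Hypothetical sketch of a model-to-model dialogue loop.
# Each "model" is any callable taking a chat history (list of
# {"role", "content"} dicts) and returning a reply string.

def swap_roles(history):
    """Flip user/assistant roles so the second model sees the
    first model's turns as incoming 'user' messages."""
    flip = {"user": "assistant", "assistant": "user"}
    return [{"role": flip.get(m["role"], m["role"]), "content": m["content"]}
            for m in history]

def dialogue(model_a, model_b, opening, turns=3):
    """Run `turns` exchanges between two chat models; return the transcript
    from model_a's point of view."""
    history = [{"role": "user", "content": opening}]
    for _ in range(turns):
        reply_a = model_a(history)
        history.append({"role": "assistant", "content": reply_a})
        reply_b = model_b(swap_roles(history))
        history.append({"role": "user", "content": reply_b})
    return history

# Stub models for demonstration; a real setup would call into llama.cpp.
echo_a = lambda h: "A says: " + h[-1]["content"]
echo_b = lambda h: "B says: " + h[-1]["content"]
transcript = dialogue(echo_a, echo_b, "hello", turns=1)
```

Because the whole loop runs in-process against a locally loaded GGUF model, no separate server process (as Ollama requires) is needed.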
View on GitHub

Nodes (0)
No node definitions found for this pack.