# 🤖 Ollama LLM

Node: `ComfyUI_LLM_Ollama`
Pack: `comfyUI_LLM`
Module: `custom_nodes.comfyUI_LLM`

## Inputs (8)

| Name | Type | Required |
| --- | --- | --- |
| prompt | STRING | required |
| model | COMBO | required |
| temperature | FLOAT | required |
| max_tokens | INT | required |
| hide_thoughts | BOOLEAN | required |
| context | STRING | optional |
| system_message | STRING | optional |
| stop_sequences | STRING | optional |

## Outputs (2)

| Name | Type |
| --- | --- |
| response | STRING |
| context | STRING |
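A minimal sketch of how these inputs might map onto an Ollama `/api/generate` request body. This is an assumption about the node's internals, not the actual implementation: the `build_ollama_payload` helper is hypothetical, `stop_sequences` is assumed to be a comma-separated STRING, and the node's STRING `context` is passed through as-is (Ollama's own `context` field is a token list, so the real node may encode it differently). `hide_thoughts` is omitted here because it would be post-processing on the response, not a request parameter.

```python
import json


def build_ollama_payload(prompt, model, temperature, max_tokens,
                         system_message=None, context=None,
                         stop_sequences=None):
    """Hypothetical mapping of the node's inputs to an Ollama
    /api/generate request body (illustration only)."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {
            "temperature": temperature,
            # Ollama's name for the maximum number of output tokens
            "num_predict": max_tokens,
        },
    }
    if system_message:
        payload["system"] = system_message
    if context:
        # Assumed pass-through of the node's optional context input
        payload["context"] = context
    if stop_sequences:
        # Assumed comma-separated STRING -> list of stop strings
        payload["options"]["stop"] = [s.strip()
                                      for s in stop_sequences.split(",")]
    return payload


payload = build_ollama_payload(
    "Describe this scene.", "llama3", 0.7, 256,
    system_message="You are concise.", stop_sequences="###,END",
)
print(json.dumps(payload, indent=2))
```

The `response` output would then correspond to the `response` field of Ollama's JSON reply, and the `context` output to its returned conversation state for chaining into the next node.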