Get LLM Response

SGD_Get_Llm_Response

Get a response from the provided llama.cpp model.

Pack: Sagado Nodes for ComfyUI

custom_nodes.ComfyUI-Sagado-Nodes

Inputs (8)

Name          Type    Required
model         MODEL   required
prompt        STRING  required
temperature   FLOAT   required
max_tokens    INT     required
top_p         FLOAT   required
seed          INT     required
image_path    STRING  optional
image_base64  STRING  optional
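The two optional image inputs are alternative ways to hand an image to the model: `image_path` points at a file on disk, while `image_base64` takes the image bytes as a base64 string, which suits images produced in-memory earlier in a workflow. A minimal sketch of producing such a string, assuming the node expects a plain base64 payload (not a data URI; check the pack's own docs if in doubt):

```python
import base64

def image_file_to_base64(path: str) -> str:
    # Read the image bytes and encode them as a base64 string
    # suitable for a STRING input like image_base64.
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")
```

The helper name `image_file_to_base64` is illustrative; it is not part of the pack.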

Outputs (2)

Name              Type
response_message  STRING
full_response     STRING
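The split between the two outputs can be sketched as follows: `full_response` carries the whole completion object (serialized), while `response_message` is just the generated text. Everything below is an illustrative assumption, not the pack's actual implementation; the function name `get_llm_response`, the stand-in model callable, and the llama.cpp-style `choices[0]["text"]` response shape are all hypothetical.

```python
import json

def get_llm_response(model, prompt, temperature, max_tokens, top_p, seed,
                     image_path=None, image_base64=None):
    """Hypothetical sketch of the node: run the model with the sampling
    inputs, then split the result into the two STRING outputs."""
    result = model(prompt, temperature=temperature, max_tokens=max_tokens,
                   top_p=top_p, seed=seed,
                   image_path=image_path, image_base64=image_base64)
    # full_response: the raw completion object, serialized as a string.
    full_response = json.dumps(result)
    # response_message: only the generated text from the completion.
    response_message = result["choices"][0]["text"]
    return (response_message, full_response)

# Usage with a stand-in callable; a real workflow passes the MODEL
# object produced by the pack's model-loader node instead.
def fake_model(prompt, **kwargs):
    return {"choices": [{"text": f"echo: {prompt}"}]}

msg, full = get_llm_response(fake_model, "hello", 0.7, 64, 0.9, 42)
```

Having both outputs lets downstream nodes consume the plain text directly while still keeping the full payload available for debugging or metadata extraction.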