anndict.call_llm

anndict.call_llm(messages, **kwargs)
Call the configured LLM model with the given messages and parameters.
- Parameters:
  - messages (list[dict[str, str]])
    List of message dictionaries, where each dictionary contains:
    - 'role' (str)
      The role of the message sender ('system', 'user', or 'assistant')
    - 'content' (str)
      The content of the message
  - **kwargs
    Additional keyword arguments passed to the LLM provider. Common parameters include:
    - supports_system_messages (bool, optional)
      Whether the model supports system messages (default: True)
- Return type:
  str
- Returns:
  The stripped content of the LLM's response.
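A minimal usage sketch of the message format described above. The import and the `call_llm` invocation are shown as comments because they require an installed `anndict` package and a configured LLM backend; the message contents here are illustrative only.

```python
import os

# Message format expected by anndict.call_llm: a list of dicts, each with
# a 'role' ('system', 'user', or 'assistant') and a 'content' string.
messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Summarize this cluster's marker genes."},
]

# The call itself (requires anndict and a configured LLM backend):
#   import anndict as adt
#   reply = adt.call_llm(messages)  # returns the stripped response text
#   # For models without system-message support:
#   reply = adt.call_llm(messages, supports_system_messages=False)

# The response is also written to the file named by RESPONSE_PATH,
# falling back to './response.txt' when the variable is not set.
response_path = os.environ.get("RESPONSE_PATH", "./response.txt")
```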
Notes

The function performs the following steps:

1. Gets the LLM configuration and initializes the provider
2. Converts the messages to the appropriate LangChain message types
3. Calls the LLM with the processed messages
4. Writes the response to the file specified by the RESPONSE_PATH environment variable

The response is written to './response.txt' by default if RESPONSE_PATH is not set.

See also
- LLMManager
  Class handling LLM configuration and use.
- LLMProviders.get_providers()
  To see supported providers.
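The role-to-message-type conversion in step 2 of the Notes can be sketched as follows. The class names `SystemMessage`, `HumanMessage`, and `AIMessage` are LangChain's, but the dispatch logic below (including the downgrade of system messages for models that lack support, per the `supports_system_messages` parameter) is an illustrative assumption, not anndict's actual implementation.

```python
# Hypothetical sketch of role -> LangChain message-type dispatch.
# Type names are represented as strings to keep the sketch dependency-free.
ROLE_TO_TYPE = {
    "system": "SystemMessage",
    "user": "HumanMessage",
    "assistant": "AIMessage",
}

def convert(messages, supports_system_messages=True):
    """Map each {'role', 'content'} dict to a (message_type, content) pair."""
    converted = []
    for m in messages:
        role = m["role"]
        # Assumed behavior: models without system-message support receive
        # the system prompt as an ordinary user message instead.
        if role == "system" and not supports_system_messages:
            role = "user"
        converted.append((ROLE_TO_TYPE[role], m["content"]))
    return converted
```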