anndict.call_llm

anndict.call_llm(messages, **kwargs)

Call the configured LLM with the given messages and parameters.

Parameters:
messages : list[dict[str, str]]

List of message dictionaries, where each dictionary contains:

'role' : str

The role of the message sender ('system', 'user', or 'assistant')

'content' : str

The content of the message

**kwargs

Additional keyword arguments passed to the LLM provider.

Common parameters include:
supports_system_messages : bool, optional

Whether the model supports system messages (default: True)

Return type:

str

Returns:

The stripped content of the LLM’s response.
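As an illustration, the messages argument can be built like this (a minimal sketch; the example contents are assumptions, and the call itself requires an LLM to have been configured via LLMManager, so it is shown commented out):

```python
# Build the messages list in the format call_llm expects: a list of dicts,
# each with a 'role' ('system', 'user', or 'assistant') and a 'content'.
messages = [
    {"role": "system", "content": "You are a concise bioinformatics assistant."},
    {"role": "user", "content": "In one sentence, what is an AnnData object?"},
]

# Sanity-check the shape before sending.
assert all(m.keys() == {"role", "content"} for m in messages)
assert all(m["role"] in {"system", "user", "assistant"} for m in messages)

# With an LLM configured (see LLMManager), the call itself would be:
# from anndict import call_llm
# response = call_llm(messages)  # -> stripped response text (str)
```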

Notes

The function performs the following steps:

  1. Gets LLM configuration and initializes the provider

  2. Converts messages to appropriate LangChain message types

  3. Calls the LLM with the processed messages

  4. Writes the response to the file specified by the RESPONSE_PATH environment variable

The response is written to './response.txt' by default if RESPONSE_PATH is not set.
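Steps 2 and 4 above can be sketched as follows. This is an illustrative stand-in, not the library's implementation: the stub classes mimic the LangChain message types the notes mention (langchain_core.messages.SystemMessage, HumanMessage, AIMessage), and folding system messages into user messages when supports_system_messages is False is an assumption:

```python
import os

# Illustrative stand-ins for the LangChain message classes
# (langchain_core.messages.SystemMessage / HumanMessage / AIMessage).
class SystemMessage:
    def __init__(self, content): self.content = content

class HumanMessage:
    def __init__(self, content): self.content = content

class AIMessage:
    def __init__(self, content): self.content = content

ROLE_MAP = {"system": SystemMessage, "user": HumanMessage, "assistant": AIMessage}

def to_langchain_messages(messages, supports_system_messages=True):
    """Sketch of step 2: map role dicts onto message objects.

    Converting system messages into user messages when the model does not
    support them is an assumption, not confirmed by the docs.
    """
    converted = []
    for m in messages:
        if m["role"] == "system" and not supports_system_messages:
            converted.append(HumanMessage(m["content"]))
        else:
            converted.append(ROLE_MAP[m["role"]](m["content"]))
    return converted

def write_response(text):
    """Sketch of step 4: persist the response where RESPONSE_PATH points,
    falling back to './response.txt' when the variable is unset."""
    path = os.environ.get("RESPONSE_PATH", "./response.txt")
    with open(path, "w", encoding="utf-8") as f:
        f.write(text)
```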

See also

LLMManager

Class handling LLM configuration and use.

LLMProviders.get_providers()

To see supported providers.