anndict.get_llm_config

anndict.get_llm_config()

Retrieves the LLM configuration from environment variables.

This function loads the LLM provider and model from environment variables, validates the provider against the available providers, and constructs a configuration dictionary. Any additional environment variables prefixed with 'LLM_' are also included in the configuration.
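
The sketch below illustrates how such a lookup could be put together with only the standard library. It is a minimal sketch under stated assumptions: the provider registry, class names, and module paths are placeholders, not the package's actual internals.

import os
from typing import Any

# Hypothetical registry of supported providers; anndict maintains its own mapping.
_SUPPORTED_PROVIDERS = {
    'openai': {'class': 'ChatOpenAI', 'module': 'langchain_openai'},
    'anthropic': {'class': 'ChatAnthropic', 'module': 'langchain_anthropic'},
}

def get_llm_config_sketch() -> dict[str, Any]:
    # Read the required settings from the environment.
    provider = os.environ['LLM_PROVIDER']
    model = os.environ['LLM_MODEL']
    if provider not in _SUPPORTED_PROVIDERS:
        raise ValueError(f"Unsupported LLM provider: {provider}")
    config: dict[str, Any] = {
        'provider': provider,
        'model': model,
        'class': _SUPPORTED_PROVIDERS[provider]['class'],
        'module': _SUPPORTED_PROVIDERS[provider]['module'],
    }
    # Fold in any extra LLM_-prefixed variables, e.g. LLM_API_KEY.
    for key, value in os.environ.items():
        if key.startswith('LLM_') and key not in ('LLM_PROVIDER', 'LLM_MODEL'):
            config[key.removeprefix('LLM_').lower()] = value
    return config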

Return type:

dict[str, Any]

Returns:

A dictionary containing LLM configuration with the following keys:

provider : str

The LLM provider name, from the LLM_PROVIDER env var.

model : str

The model identifier, from the LLM_MODEL env var.

class : str

The provider's class name, used for instantiation.

module : str

The path of the provider's module.

Additional keys are included from any environment variables prefixed with 'LLM_', excluding LLM_PROVIDER and LLM_MODEL.
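
For illustration, with LLM_PROVIDER=openai, LLM_MODEL=gpt-3.5-turbo, and LLM_API_KEY set, the returned dictionary could look roughly like the following. The 'class' and 'module' values and the naming of the extra 'api_key' key are assumptions, since they depend on the provider registry and on how the LLM_ prefix is handled.

config = {
    'provider': 'openai',              # from LLM_PROVIDER
    'model': 'gpt-3.5-turbo',          # from LLM_MODEL
    'class': 'ChatOpenAI',             # illustrative class name
    'module': 'langchain_openai',      # illustrative module path
    'api_key': 'your-openai-api-key',  # assumed key name for LLM_API_KEY
}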

Raises:

ValueError – If the specified provider is not in the list of supported providers
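
A minimal way to surface this error is to point the environment at an unsupported provider before calling the function; the provider name below is just a placeholder.

import os
from anndict import get_llm_config

os.environ['LLM_PROVIDER'] = 'not-a-real-provider'
os.environ['LLM_MODEL'] = 'some-model'

try:
    config = get_llm_config()
except ValueError as err:
    print(f"Unsupported provider: {err}")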

Examples

from anndict import configure_llm_backend, get_llm_config

# Set the provider, model, and any additional settings for later retrieval.
configure_llm_backend(
    'openai',
    'gpt-3.5-turbo',
    api_key='your-openai-api-key',
)

config = get_llm_config()
print(config['provider'])
> openai
print(config['model'])
> gpt-3.5-turbo