anndict.get_llm_config
anndict.get_llm_config()
Retrieves the LLM configuration from environment variables.
This function loads the LLM provider and model from environment variables, validates the provider against the list of available providers, and constructs a configuration dictionary. Any additional environment variables prefixed with 'LLM_' are also included in the configuration.

Return type:
    dict[str, Any]

Returns:
    A dictionary containing the LLM configuration with the following keys:
    provider : str
        The LLM provider name from the LLM_PROVIDER env var
    model : str
        The model identifier from the LLM_MODEL env var
    class : str
        The provider's class name for instantiation
    module : str
        The provider module's path

    Additional keys are included from any environment variables prefixed with 'LLM_', excluding LLM_PROVIDER and LLM_MODEL.
- Raises:
ValueError – If the specified provider is not in the list of supported providers
Examples
>>> configure_llm_backend('openai', 'gpt-3.5-turbo', api_key='your-openai-api-key')
>>> config = get_llm_config()
>>> print(config['provider'])
'openai'
>>> print(config['model'])
'gpt-3.5-turbo'
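The behavior described above can be sketched as follows. This is not the anndict implementation, only a minimal illustration of the documented contract: the provider registry, the lowercasing of extra keys, and the function name `get_llm_config_sketch` are all hypothetical assumptions for the example.

```python
import os

# Hypothetical registry mapping provider -> (class name, module path);
# the real package validates against its own list of supported providers.
SUPPORTED_PROVIDERS = {"openai": ("ChatOpenAI", "langchain_openai")}

def get_llm_config_sketch(environ=os.environ):
    """Read LLM_* environment variables into a configuration dict."""
    provider = environ.get("LLM_PROVIDER", "")
    if provider not in SUPPORTED_PROVIDERS:
        raise ValueError(f"Unsupported provider: {provider!r}")
    cls_name, module_path = SUPPORTED_PROVIDERS[provider]
    config = {
        "provider": provider,
        "model": environ.get("LLM_MODEL", ""),
        "class": cls_name,
        "module": module_path,
    }
    # Fold in any extra LLM_-prefixed variables, excluding the two
    # handled explicitly above.
    for key, value in environ.items():
        if key.startswith("LLM_") and key not in ("LLM_PROVIDER", "LLM_MODEL"):
            config[key.removeprefix("LLM_").lower()] = value
    return config
```

Passing a plain dict for `environ` makes the sketch easy to exercise without mutating the real process environment.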