anndict.configure_llm_backend

anndict.configure_llm_backend(provider, model, **kwargs)

Configures the LLM backend by setting environment variables. Call this function before using any LLM integrations.

Parameters:
provider str

The LLM provider name. Run LLMProviders.get_providers() to view the list of supported providers (see the sketch below).

model str

The LLM model name.

**kwargs

Additional configuration parameters passed to LLMManager.configure_llm_backend().

Return type:

None
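
Before configuring, you can inspect which provider names are accepted. A minimal sketch, assuming LLMProviders is importable from the anndict package (the exact import path is an assumption; adjust to your installation):

from anndict import LLMProviders  # import path is an assumption; adjust if needed

# Inspect the supported provider names before configuring a backend.
providers = LLMProviders.get_providers()
print(providers)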

Examples

General (for most providers)

from anndict import configure_llm_backend

configure_llm_backend('your-provider-name',
    'your-provider-model-name',
    api_key='your-provider-api-key'
)

OpenAI

configure_llm_backend('openai',
    'gpt-3.5-turbo',
    api_key='your-openai-api-key'
)

Anthropic

configure_llm_backend('anthropic',
    'claude-3-5-sonnet-20240620',
    api_key='your-anthropic-api-key'
)

Google

configure_llm_backend('google',
    'gemini-1.5-pro',
    api_key='your-google-api-key'
)

Bedrock

configure_llm_backend('bedrock',
    'anthropic.claude-v2',
    region_name='us-west-2',
    aws_access_key_id='your-access-key-id',
    aws_secret_access_key='your-secret-access-key'
)

AzureML Endpoint

configure_llm_backend('azureml_endpoint',
    'llama-2',
    endpoint_name='your-endpoint-name',
    region='your-region',
    api_key='your-api-key'
)
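
Because the configuration is written to environment variables, it only needs to run once per session, before any LLM-backed helper is called. A minimal sketch of this call-before-use pattern; annotate_with_llm is a hypothetical placeholder, not part of the documented API:

from anndict import configure_llm_backend

# 1) Configure the backend once at the start of the session.
configure_llm_backend('anthropic',
    'claude-3-5-sonnet-20240620',
    api_key='your-anthropic-api-key'
)

# 2) Subsequent LLM integrations pick up the configuration from the
#    environment. annotate_with_llm is a hypothetical stand-in for whichever
#    anndict LLM helper you intend to call.
# annotate_with_llm(adata, ...)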