Summary
When OpenAISubscriptionAuth.create_llm() is called multiple times (e.g., once for the main agent LLM and once for the condenser LLM), the LLMRegistry raises a ValueError because both LLMs get the same default usage_id of "default".
Error Message
```
ValueError: Usage ID 'default' already exists in registry. Use a different usage_id on the LLM or call get() to retrieve the existing LLM.
```
Root Cause
- `LLM.usage_id` defaults to `"default"` (see `llm.py:313-320`)
- `create_llm()` passes `**llm_kwargs` to `LLM()` but doesn't set a default `usage_id`
- When the CLI creates two LLMs via `create_llm()` (one for the agent, one for the condenser), both get `usage_id="default"`
- `LLMRegistry.add()` raises `ValueError` on the duplicate `usage_id` (see the simplified sketch below)
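To make the failure concrete, here is a minimal, self-contained sketch using toy stand-ins (not the real OpenHands classes): both LLMs fall back to usage_id "default", and the second add() hits the duplicate check.
```python
# Toy model of the collision; the names mirror the real classes, but this code is illustrative only.
from dataclasses import dataclass


@dataclass
class ToyLLM:
    model: str
    usage_id: str = "default"  # same default as LLM.usage_id


class ToyLLMRegistry:
    def __init__(self) -> None:
        self._llms: dict[str, ToyLLM] = {}

    def add(self, llm: ToyLLM) -> None:
        # The duplicate check that produces the reported ValueError
        if llm.usage_id in self._llms:
            raise ValueError(
                f"Usage ID '{llm.usage_id}' already exists in registry. "
                "Use a different usage_id on the LLM or call get() to retrieve the existing LLM."
            )
        self._llms[llm.usage_id] = llm


registry = ToyLLMRegistry()
registry.add(ToyLLM(model="gpt-5.2-codex"))      # agent LLM -> usage_id="default"
try:
    registry.add(ToyLLM(model="gpt-5.2-codex"))  # condenser LLM -> also "default"
except ValueError as err:
    print(err)  # reproduces the reported error message
```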
Current Workaround
Callers must explicitly pass usage_id:
```python
# Main LLM
llm = chatgpt_auth.create_llm(usage_id="agent")

# Condenser LLM
condenser_llm = chatgpt_auth.create_llm(usage_id="condenser")
```
Proposed Solutions
Option 1: Auto-generate a unique usage_id
`create_llm()` could fall back to a generated identifier (e.g. one built from `uuid4().hex[:8]`) when the caller does not pass `usage_id`; a minimal sketch follows.
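A minimal sketch of what such auto-generation could look like; the helper name and the "openai-subscription" prefix are assumptions, not the actual implementation:
```python
from uuid import uuid4


def generate_usage_id(prefix: str = "openai-subscription") -> str:
    """Hypothetical helper: build a short, unique usage_id such as 'openai-subscription-1f3a9c2e'."""
    return f"{prefix}-{uuid4().hex[:8]}"


# create_llm() could call this whenever the caller does not supply usage_id explicitly.
print(generate_usage_id())
```
Generated IDs would avoid the collision, though (as noted under Additional Context) they are less meaningful than explicit names.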
Option 2: Make usage_id a required parameter
Make `usage_id` a required parameter for `create_llm()`:
```python
def create_llm(
    self,
    usage_id: str,  # Required: no default, so each caller must choose a unique ID
    model: str = "gpt-5.2-codex",
    credentials: OAuthCredentials | None = None,
    ...
) -> LLM:
```
Option 3: Document the requirement
Add clear documentation that callers must pass unique `usage_id` values when creating multiple LLMs.
Additional Context
The usage_id is used for:
- Registry lookups (`LLMRegistry.get()`)
- Telemetry and spend tracking
- Metrics collection (`ConversationStats`)
Having meaningful, unique usage_id values (like "agent", "condenser", "ask-agent-llm") is important for debugging and monitoring, so Option 1 with auto-generation might produce less useful IDs than explicit naming.
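A toy illustration of that point (not the real ConversationStats or telemetry API): when spend is grouped by usage_id, explicit names are far easier to interpret than generated suffixes.
```python
# Hypothetical per-usage spend ledger, keyed by usage_id.
spend_per_usage: dict[str, float] = {}


def record_spend(usage_id: str, cost_usd: float) -> None:
    spend_per_usage[usage_id] = spend_per_usage.get(usage_id, 0.0) + cost_usd


record_spend("agent", 0.42)
record_spend("condenser", 0.07)

# Easy to read in logs/metrics:          {'agent': 0.42, 'condenser': 0.07}
# Harder with auto-generated IDs, e.g.:  {'llm-1f3a9c2e': 0.42, 'llm-77b0d1aa': 0.07}
print(spend_per_usage)
```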
Related
Found while implementing ChatGPT subscription login in OpenHands CLI.
See also: LLMRegistry in llm_registry.py