MonitoredLLM
- class council.llm.MonitoredLLM(name: str, llm: LLMBase)
Bases: Monitored[LLMBase]

A convenience class that wraps an LLM into a Monitor.
- post_chat_request(context: ContextBase, messages: Sequence[LLMMessage], budget: Budget | None = None, **kwargs: Any) → LLMResult
Make a call to the wrapped LLM, managing the creation of the inner context. See LLMBase.post_chat_request().
- Parameters:
context (ContextBase) – the context of the caller
messages (Sequence[LLMMessage]) – see LLMBase.post_chat_request()
budget (Optional[Budget]) – an optional budget. If None, the budget from the given context is used
**kwargs – see LLMBase.post_chat_request()
- Returns:
the result of the chat request
- Return type:
LLMResult
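The delegation pattern this class implements can be sketched as below. This is an illustrative, self-contained sketch only: the stub classes (`Budget`, `Context`, `FakeLLM`, `MonitoredLLMSketch`) are simplified stand-ins for the real council types, not council's actual implementation. It shows the one behavior documented above: `post_chat_request` falls back to the budget carried by the caller's context when no budget is passed, then delegates to the wrapped LLM.

```python
from typing import Any, Optional, Sequence


class Budget:
    """Simplified stand-in for council's Budget."""
    def __init__(self, remaining: float) -> None:
        self.remaining = remaining


class Context:
    """Simplified stand-in for ContextBase: carries the caller's budget."""
    def __init__(self, budget: Budget) -> None:
        self.budget = budget


class FakeLLM:
    """Stand-in for an LLMBase implementation."""
    def post_chat_request(self, context: Context,
                          messages: Sequence[str], **kwargs: Any) -> str:
        return f"echo: {messages[-1]}"


class MonitoredLLMSketch:
    """Sketch of the wrapper: delegates to the inner LLM, defaulting
    the budget to the one found in the caller's context."""
    def __init__(self, name: str, llm: FakeLLM) -> None:
        self.name = name
        self._llm = llm

    def post_chat_request(self, context: Context, messages: Sequence[str],
                          budget: Optional[Budget] = None,
                          **kwargs: Any) -> str:
        # If no budget is given, use the budget from the given context.
        budget = budget if budget is not None else context.budget
        # In council, a new inner context would also be created here so the
        # call is attributed to this monitor; the sketch simply delegates.
        return self._llm.post_chat_request(context, messages, **kwargs)


ctx = Context(Budget(remaining=10.0))
wrapped = MonitoredLLMSketch("my_llm", FakeLLM())
print(wrapped.post_chat_request(ctx, ["hello"]))  # -> echo: hello
```

The point of the wrapper is that callers keep the plain `post_chat_request` interface while monitoring and budget bookkeeping happen transparently inside the wrapper.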