MonitoredLLM#

class council.llm.MonitoredLLM(name: str, llm: LLMBase)[source]#

Bases: Monitored[LLMBase]

A convenience class that wraps an LLM into a Monitor.

post_chat_request(context: ContextBase, messages: Sequence[LLMMessage], budget: Budget | None = None, **kwargs: Any) → LLMResult[source]#

Make a call to the wrapped LLM, managing the creation of the LLM context. See LLMBase.post_chat_request().

Parameters:

    context (ContextBase) – the execution context for the call

    messages (Sequence[LLMMessage]) – the messages to send to the wrapped LLM

    budget (Budget | None) – an optional budget for the call

Returns:

    see LLMBase.post_chat_request()

Return type:

LLMResult
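To illustrate how such a wrapper behaves, here is a minimal, self-contained sketch of the monitored-wrapper pattern. It does not import council; `FakeLLM`, `LLMMessage`, `LLMResult`, and the `MonitoredLLM` class below are simplified stand-ins written for this example, and the actual library's classes differ.

```python
# Minimal sketch of the monitored-wrapper pattern. All classes here are
# simplified stand-ins for illustration, not council's real implementation.
from dataclasses import dataclass
from typing import Any, List, Sequence


@dataclass
class LLMMessage:
    role: str
    content: str


@dataclass
class LLMResult:
    choices: List[str]


class FakeLLM:
    """Stand-in for an LLMBase implementation (hypothetical)."""

    def post_chat_request(
        self, context: Any, messages: Sequence[LLMMessage], **kwargs: Any
    ) -> LLMResult:
        # Echo the last message back, pretending to be a chat model.
        return LLMResult(choices=[f"echo: {messages[-1].content}"])


class MonitoredLLM:
    """Wraps an LLM so each call can be tracked by a monitor."""

    def __init__(self, name: str, llm: FakeLLM) -> None:
        self.name = name
        self._llm = llm
        self.call_count = 0  # simple stand-in for monitoring state

    def post_chat_request(
        self, context: Any, messages: Sequence[LLMMessage], **kwargs: Any
    ) -> LLMResult:
        # In council, a dedicated LLM context would be derived from
        # `context` here before delegating to the wrapped LLM.
        self.call_count += 1
        return self._llm.post_chat_request(context, messages, **kwargs)


monitored = MonitoredLLM("my_llm", FakeLLM())
result = monitored.post_chat_request(None, [LLMMessage("user", "hello")])
print(result.choices[0])     # → echo: hello
print(monitored.call_count)  # → 1
```

The caller interacts with the wrapper exactly as it would with the underlying LLM; the wrapper's only job is to manage context and bookkeeping around each delegated call.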