LLMConfigObject#
```mermaid
classDiagram
    DataObject <|-- LLMConfigObject
```
- class council.llm.LLMConfigObject(kind: str, version: str, metadata: DataObjectMetadata, spec: T)

  Bases: `DataObject[LLMConfigSpec]`

  Helper class to instantiate an LLM from a YAML file.
The following code illustrates how an LLM can be loaded from a YAML file.
```python
from council.llm import OpenAILLM, LLMConfigObject

llm_config = LLMConfigObject.from_yaml("data/openai-llm-model.yaml")
llm = OpenAILLM.from_config(llm_config)
```
```yaml
kind: LLMConfig
version: 0.1
metadata:
  name: an-openai-deployed-model
  labels:
    provider: OpenAI
spec:
  description: "Model used to do ABC"
  provider:
    name: CML-OpenAI
    openAISpec:
      model: gpt-4-1106-preview
      timeout: 60
      apiKey: sk-my-api-key
      # Alternatively, get the api key from an environment variable
      # apiKey:
      #   fromEnvVar: OPENAI_API_KEY
  parameters:
    n: 3
    temperature: 0.5
```
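To clarify the `fromEnvVar` alternative shown in the YAML above, here is a minimal sketch of how such a field could be resolved. This is an illustration only, not council's actual implementation: the `resolve_api_key` helper is a hypothetical name, and the assumption is simply that `apiKey` is either a literal string or a mapping with a `fromEnvVar` key naming an environment variable.

```python
import os


def resolve_api_key(value):
    """Hypothetical helper: resolve an apiKey field that is either a
    literal string or a {"fromEnvVar": NAME} mapping, as in the spec above."""
    if isinstance(value, dict) and "fromEnvVar" in value:
        # Indirection: look the key up in the process environment
        return os.environ[value["fromEnvVar"]]
    # Literal key is returned as-is
    return value


# Literal form
print(resolve_api_key("sk-my-api-key"))

# Environment-variable form
os.environ["OPENAI_API_KEY"] = "sk-from-env"
print(resolve_api_key({"fromEnvVar": "OPENAI_API_KEY"}))
```

Keeping the key out of the YAML file and reading it from the environment avoids committing secrets to version control.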