Documentation

Overview
Package components includes AgentMemory, among other components.
Index
- func ToolCallbacksToAnthropic(src []ToolCallback, dist *anthropic.Message)
- func ToolCallbacksToGemini(src []ToolCallback, dist *gemini.Content)
- func ToolCallbacksToOpenAI(src []ToolCallback) []openai.ChatCompletionMessageParamUnion
- func ToolCallsToAnthropic(src []ToolCall, dist *anthropic.Message)
- func ToolCallsToGemini(src []ToolCall, dist *gemini.Content)
- func ToolCallsToOpenAI(src []ToolCall, dist *openai.ChatCompletionMessageParamUnion)
- type LLMResponse
- type LLMUsage
- type ToolCall
- type ToolCallback
Constants
This section is empty.
Variables
This section is empty.
Functions
func ToolCallbacksToAnthropic added in v1.6.3
func ToolCallbacksToAnthropic(src []ToolCallback, dist *anthropic.Message)
func ToolCallbacksToGemini added in v1.6.3
func ToolCallbacksToGemini(src []ToolCallback, dist *gemini.Content)
func ToolCallbacksToOpenAI added in v1.6.3
func ToolCallbacksToOpenAI(src []ToolCallback) []openai.ChatCompletionMessageParamUnion
func ToolCallsToAnthropic added in v1.6.3
func ToolCallsToAnthropic(src []ToolCall, dist *anthropic.Message)
func ToolCallsToGemini added in v1.6.3
func ToolCallsToGemini(src []ToolCall, dist *gemini.Content)
func ToolCallsToOpenAI added in v1.6.3
func ToolCallsToOpenAI(src []ToolCall, dist *openai.ChatCompletionMessageParamUnion)
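The ToolCall*/ToolCallback* converters above map a provider-agnostic tool-call record into each provider's wire format (src in, provider-specific dist out). The exact field layout of ToolCall is not shown on this page, so the sketch below uses hypothetical local types (myToolCall, openAIToolMessage) to illustrate the conversion pattern; it is not the package's actual implementation.

```go
package main

import "fmt"

// Hypothetical stand-ins for the package's ToolCall and the OpenAI
// message union; the real types live in this package and the openai SDK.
type myToolCall struct {
	ID   string
	Name string
	Args string // JSON-encoded arguments
}

type openAIToolMessage struct {
	Role       string
	ToolCallID string
	Content    string
}

// toolCallsToOpenAI mirrors the shape of ToolCallsToOpenAI: each source
// record becomes one provider-specific message.
func toolCallsToOpenAI(src []myToolCall) []openAIToolMessage {
	out := make([]openAIToolMessage, 0, len(src))
	for _, c := range src {
		out = append(out, openAIToolMessage{
			Role:       "assistant",
			ToolCallID: c.ID,
			Content:    c.Args,
		})
	}
	return out
}

func main() {
	msgs := toolCallsToOpenAI([]myToolCall{{ID: "call_1", Name: "search", Args: `{"q":"go"}`}})
	fmt.Println(len(msgs), msgs[0].ToolCallID) // 1 call_1
}
```

The Anthropic and Gemini variants follow the same loop but write into the provided dist pointer instead of returning a slice.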
Types
type LLMResponse added in v1.1.0
type LLMResponse struct {
	ID        string          `json:"id,omitempty"`
	Role      instructor.Role `json:"role,omitempty"`
	Model     string          `json:"model,omitempty"`
	Usage     *LLMUsage       `json:"usage,omitempty"`
	Timestamp int64           `json:"ts,omitempty"`
	Details   any             `json:"content,omitempty"`
}
LLMResponse is an instructor provider chat response.
func (*LLMResponse) FromAnthropic added in v1.1.0
func (r *LLMResponse) FromAnthropic(v *anthropic.MessagesResponse)
FromAnthropic converts a response from Anthropic.
func (*LLMResponse) FromCohere added in v1.1.0
func (r *LLMResponse) FromCohere(v *cohere.NonStreamedChatResponse)
FromCohere converts a response from Cohere.
func (*LLMResponse) FromGemini added in v1.1.0
func (r *LLMResponse) FromGemini(v *gemini.GenerateContentResponse)
FromGemini converts a response from Gemini.
func (*LLMResponse) FromOpenAI added in v1.1.0
func (r *LLMResponse) FromOpenAI(v *openai.ChatCompletion)
FromOpenAI converts a response from OpenAI.
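The From* methods normalize each provider's native response into the shared LLMResponse struct. Below is a minimal sketch of that normalization pattern, using a hypothetical local response type in place of openai.ChatCompletion and a simplified copy of the struct; it only illustrates the field mapping, not the package's actual code.

```go
package main

import (
	"fmt"
	"time"
)

// Hypothetical stand-in for a provider response such as openai.ChatCompletion.
type providerResponse struct {
	ID      string
	Model   string
	Content string
}

// llmResponse mirrors the exported LLMResponse fields shown above
// (Usage omitted for brevity).
type llmResponse struct {
	ID        string
	Role      string
	Model     string
	Timestamp int64
	Details   any
}

// fromProvider sketches what a From* converter does: copy the provider
// fields into the unified struct and stamp the current time.
func (r *llmResponse) fromProvider(v *providerResponse) {
	r.ID = v.ID
	r.Role = "assistant"
	r.Model = v.Model
	r.Timestamp = time.Now().Unix()
	r.Details = v.Content
}

func main() {
	var r llmResponse
	r.fromProvider(&providerResponse{ID: "resp_1", Model: "gpt-4o", Content: "hello"})
	fmt.Println(r.ID, r.Model) // resp_1 gpt-4o
}
```

Because every provider funnels into the same struct, downstream code (memory, logging, usage accounting) can stay provider-agnostic.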
type LLMUsage added in v1.1.0
Directories

| Path | Synopsis |
|---|---|
| document | Package document contains Document structs and Parsers to prepare for RAG. |
| parsers | Package parsers includes different parser implementations. |
| parsers/docx | Package docx is a parser for docx. |
| parsers/html | Package html is a parser for html. |
| parsers/pdf | Package pdf is a parser for PDF. |
| parsers/pptx | Package pptx is a parser for pptx. |
| parsers/xlsx | Package xlsx is an xlsx parser. |
| embedder | Package embedder contains the Embedder interface and different providers, including openai, voyageai, cohere, gemini, huggingface, etc. |
| splitter | Package splitter defines different chunker splitters. |
| optimizer | Package optimizer provides prompt optimization capabilities for Large Language Models. |
| systemprompt | Package systemprompt contains a system prompt generator and context provider. |
| vectordb | Package vectordb contains the vectordb interface and different engines like memory, chromem, and milvus implementations. |