Releases: parakeet-nest/parakeet
v0.2.9 🥧
v0.2.8 🍩
- Landing of Squawk: a Parakeet DSL
```go
squawk.New().
	Model(model).
	BaseURL(ollamaBaseUrl).
	Provider(provider.Ollama).
	Options(options).
	System("You are a useful AI agent, you are a Star Trek expert.").
	User("Who is James T Kirk?").
	Chat(func(answer llm.Answer, self *squawk.Squawk, err error) {
		fmt.Println(answer.Message.Content)
	})
```
- Improvement of history message management
- Added support for structured output to the Docker Model Runner Chat API
- Added support for structured output to the OpenAI Chat API
🦜 Parakeet v0.2.7 🐳
Addition of Docker Model Runner support (and OpenAI support at the same time), allowing easy development of generative AI applications in Docker containers.
🦜 Parakeet v0.2.6 🍿
- MCP support progress:
  - SSE transport Client
  - new STDIO transport Client
- Added an SSE MCP example using the WASImancer MCP server project: 75-mcp-sse
- Update of the STDIO MCP example: 67-mcp
🦜 Parakeet v0.2.5 🥧
Helpers
Estimate the number of tokens in a text:

```go
content.CountTokens(text string) int
content.CountTokensAdvanced(text string) int
content.EstimateGPTTokens(text string) int
```

This could be useful to estimate the value of `num_ctx`.
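For example, a minimal sketch of sizing the context window from a document (`option.NumCtx` is an assumption here, by analogy with the `option.Temperature` style enums shown in the v0.2.0 notes below):

```go
doc := "Some long document used as context for the model..."

// Estimate the token count, then add headroom for the prompt and the answer.
estimated := content.CountTokens(doc)
fmt.Println("estimated tokens:", estimated)

options := llm.SetOptions(map[string]interface{}{
	option.NumCtx: estimated + 1024, // option.NumCtx is assumed to exist
})
_ = options // pass the options to your chat/generation call
```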
Extract elements from source code:

```go
source.ExtractCodeElements(fileContent string, language string) ([]CodeElement, error)
```

```go
// CodeElement represents a code structure element (class, function, method)
type CodeElement struct {
	Type        string // "class", "function", "method"
	Name        string
	Signature   string
	Description string
	LineNumber  int
	ParentClass string // For methods
	Parameters  []string
	Source      string // Source code of the element
}
```

The `Signature` field could be useful to add context to embeddings.
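For example, a minimal sketch that prefixes each chunk with its signature before embedding (the file name and chunking strategy are illustrative):

```go
fileContent, err := os.ReadFile("main.go")
if err != nil {
	log.Fatal(err)
}

elements, err := source.ExtractCodeElements(string(fileContent), "go")
if err != nil {
	log.Fatal(err)
}

for _, element := range elements {
	// Leading with the signature gives the embedding model the declaration
	// as extra context for the body of the function or method.
	chunk := element.Signature + "\n" + element.Source
	fmt.Println(chunk)
}
```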
Get and cast an environment variable value at the same time:

```go
gear.GetEnvFloat(key string, defaultValue float64) float64
gear.GetEnvInt(key string, defaultValue int) int
gear.GetEnvString(key string, defaultValue string) string
```
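For example (the environment variable names are illustrative):

```go
ollamaUrl := gear.GetEnvString("OLLAMA_HOST", "http://localhost:11434")
temperature := gear.GetEnvFloat("TEMPERATURE", 0.8)
numCtx := gear.GetEnvInt("NUM_CTX", 4096)

fmt.Println(ollamaUrl, temperature, numCtx)
```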
Conversational history + new samples
In Memory
- Added `history.RemoveMessage(id string)`
  - see example: 69-web-chat-bot
- Added `history.SaveMessageWithSession(sessionId string, messagesCounters *map[string]int, message llm.Message)`
  - see example: 70-web-chat-bot-with-session
- Added `history.RemoveTopMessageOfSession(sessionId string, messagesCounters *map[string]int, conversationLength int)`
  - see example: 70-web-chat-bot-with-session
Bbolt Memory
- Added `history.RemoveMessage(id string)`
- Added `history.SaveMessageWithSession(sessionId string, messagesCounters *map[string]int, message llm.Message)`
  - see example: 71-web-chat-bot-with-session
- Added `history.RemoveTopMessageOfSession(sessionId string, messagesCounters *map[string]int, conversationLength int)`
  - see example: 71-web-chat-bot-with-session
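Together, the session helpers make it easy to keep a sliding window of conversation per session. A minimal sketch, assuming `conversation` is an initialized history store (its concrete type and constructor are not shown in these notes):

```go
sessionId := "session-42"
messagesCounters := map[string]int{}
conversationLength := 10 // illustrative sliding-window size

// Save the message under its session id, updating the per-session counters...
conversation.SaveMessageWithSession(sessionId, &messagesCounters, llm.Message{
	Role:    "user",
	Content: "Who is James T Kirk?",
})
// ...then trim the session so it never grows past conversationLength.
conversation.RemoveTopMessageOfSession(sessionId, &messagesCounters, conversationLength)
```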
v0.2.4 🥮
RAG
Improving the RAG example with Elasticsearch: 40-rag-with-elastic-markdown
(🙏 Thank you @codefromthecrypt)
New examples:
- Structured output: 66-structured-outputs
- Experiments with Hypothetical Document Embeddings (HyDE): 65-hyde (🚧 this is a work in progress)
- MCP client: 67-mcp
- How to use DeepSeek R1 (1.5b): 68-deepseek-r1
Error management
ModelNotFoundError
```go
// package completion
type ModelNotFoundError struct {
	Code    int
	Message string
	Model   string
}
```
Usage:
```go
answer, err := completion.Chat(ollamaUrl, query)
if err != nil {
	// test if the model is not found
	if modelErr, ok := err.(*completion.ModelNotFoundError); ok {
		fmt.Printf("💥 Got Model Not Found error: %s\n", modelErr.Message)
		fmt.Printf("😡 Error code: %d\n", modelErr.Code)
		fmt.Printf("🧠 Expected Model: %s\n", modelErr.Model)
	} else {
		log.Fatal("😡:", err)
	}
}
```
See these examples: 04-chat-stream and 66-structured-outputs
NoSuchOllamaHostError
```go
// package completion
type NoSuchOllamaHostError struct {
	Host    string
	Message string
}
```
Usage:
```go
if noHostErr, ok := err.(*completion.NoSuchOllamaHostError); ok {
	fmt.Printf("🦙 Got No Such Ollama Host error: %s\n", noHostErr.Message)
	fmt.Printf("🌍 Expected Host: %s\n", noHostErr.Host)
}
```
First MCP support
Integration of github.com/mark3labs/mcp-go/mcp and github.com/mark3labs/mcp-go/client (this is a work in progress 🚧)
Helpers:

```go
mcphelpers.GetMCPClient(ctx context.Context, command string, env []string, args ...string) (*client.StdioMCPClient, *mcp.InitializeResult, error)
mcphelpers.GetTools(mcpClient *client.StdioMCPClient) ([]llm.Tool, error)
mcphelpers.CallTool(ctx context.Context, mcpClient *client.StdioMCPClient, functionName string, arguments map[string]interface{}) (*mcp.CallToolResult, error)
mcphelpers.GetTextFromResult(mcpResult *mcp.CallToolResult) (string, error)
```

`tools.ConvertMCPTools` converts the MCP tools list to a list compliant with Ollama LLM tools (it is used by `GetTools`).

See this example: 67-mcp (an example of an MCP server is provided)
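A minimal sketch of wiring these helpers together (the server command and the tool name are illustrative):

```go
ctx := context.Background()

// Spawn an MCP server over STDIO and initialize the client.
mcpClient, _, err := mcphelpers.GetMCPClient(ctx, "node", []string{}, "./mcp-server/index.js")
if err != nil {
	log.Fatal(err)
}
defer mcpClient.Close()

// Fetch the MCP tools, already converted to Ollama-compliant llm.Tool values.
tools, err := mcphelpers.GetTools(mcpClient)
if err != nil {
	log.Fatal(err)
}
fmt.Println("tools:", len(tools))

// Call one tool by name and extract the text content of the result.
result, err := mcphelpers.CallTool(ctx, mcpClient, "add", map[string]interface{}{"a": 1, "b": 2})
if err != nil {
	log.Fatal(err)
}
text, _ := mcphelpers.GetTextFromResult(result)
fmt.Println(text)
```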
Error management (specific type errors):
- MCPClientCreationError
- MCPClientInitializationError
- MCPGetToolsError
- MCPToolCallError
- MCPResultExtractionError
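These can be handled with the same type-assertion pattern as the `completion` errors above; a short sketch (the package defining these error types is assumed to be `mcphelpers`):

```go
// mcphelpers is assumed to define these error types
_, err := mcphelpers.GetTools(mcpClient)
if err != nil {
	if _, ok := err.(*mcphelpers.MCPGetToolsError); ok {
		fmt.Println("😡 could not list the MCP tools:", err)
	}
}
```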
v0.2.3 🥧
Update of the Extism dependency.
v0.2.2 🧁
What's new in v0.2.2?
Flock Agents
Inspired by: Swarm by OpenAI
Flock is a Parakeet package for creating and managing AI agents using the Ollama backend. It provides a simple way to create conversational agents, orchestrate interactions between them, and implement function calling capabilities.
Example:

```go
agent := flock.Agent{
	Name:      "Bob",
	Model:     "qwen2.5:3b",
	OllamaUrl: "http://localhost:11434",
	Options: llm.SetOptions(map[string]interface{}{
		option.Temperature: 0.7,
		option.TopK:        40,
		option.TopP:        0.9,
	}),
}

// Setting static instructions
agent.SetInstructions("Help the user with their queries.")

// Setting dynamic instructions with context
agent.SetInstructions(func(contextVars map[string]interface{}) string {
	userName := contextVars["userName"].(string)
	return fmt.Sprintf("Help %s with their queries.", userName)
})

orchestrator := flock.Orchestrator{}

response, _ := orchestrator.Run(
	agent,
	[]llm.Message{
		{Role: "user", Content: "Hello, what's the best pizza?"},
	},
	map[string]interface{}{"userName": "Sam"},
)

fmt.Println(response.GetLastMessage().Content)
```
v0.2.1 🧇
What's new in v0.2.1?
Contextual Retrieval
Inspired by: Introducing Contextual Retrieval
Two new methods are available in the `content` package:
- `CreateChunkContext`
- `CreateChunkContextWithPromptTemplate`

`CreateChunkContext` generates a succinct context for a given chunk within the whole document content. This context is intended to improve search retrieval of the chunk.

`CreateChunkContextWithPromptTemplate` generates a contextual response based on a given prompt template and document content. It interpolates the template with the provided document and chunk content, then uses an LLM to generate a response.
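A minimal sketch of the intended flow (the exact signature of `CreateChunkContext` is not shown in these notes; the argument order below is an assumption):

```go
// hypothetical call shape; the real signature may differ
chunkContext, err := content.CreateChunkContext(ollamaUrl, chatModel, wholeDocument, chunk, options)
if err != nil {
	log.Fatal(err)
}
// Prepend the generated context to the chunk before computing its embedding.
fmt.Println(chunkContext + "\n" + chunk)
```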
UI Helpers
Two new methods are available in the `ui` package. If you use Parakeet to create CLI applications, you can use the `ui` package to create a (very) simple UI.

- `Input` displays a prompt with the specified color and waits for user input.
- `Println` prints the provided strings with the specified color using the lipgloss styling library.
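A sketch of how these could be used (the exact signatures are not shown in these notes; the color argument and return values below are assumptions):

```go
// assumed signatures: ui.Input(color, prompt string) (string, error)
// and ui.Println(color string, values ...string); both are guesses
question, err := ui.Input("cyan", "🤖 what is your question? ")
if err != nil {
	log.Fatal(err)
}
ui.Println("green", "you asked: "+question)
```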
CLI Helpers
Eight new methods are available in the `cli` package:

- `Settings` parses command-line arguments and flags.
- `FlagValue` retrieves the value of a flag by its name from a slice of Flag structs.
- `HasArg` checks if an argument with the specified name exists in the provided slice of arguments.
- `HasFlag` checks if a flag with the specified name exists in the provided slice of flags.
- `ArgsTail` extracts the names from a slice of Arg structs and returns them as a slice of strings.
- `FlagsTail` takes a slice of Flag structs and returns a slice of strings containing the names of those flags.
- `FlagsWithNamesTail` takes a slice of Flag structs and returns a slice of strings, where each string is a formatted pair of the flag's name and value in the form "name=value".
- `HasSubsequence` checks if the given subsequence of strings (subSeq) is present in the tail of the provided arguments (args).
Example:
```go
// default values
ollamaUrl := "http://localhost:11434"
chatModel := "llama3.1:8b"
embeddingsModel := "bge-m3:latest"

args, flags := cli.Settings()

if cli.HasFlag("url", flags) {
	ollamaUrl = cli.FlagValue("url", flags)
}
if cli.HasFlag("chat-model", flags) {
	chatModel = cli.FlagValue("chat-model", flags)
}
if cli.HasFlag("embeddings-model", flags) {
	embeddingsModel = cli.FlagValue("embeddings-model", flags)
}

switch cmd := cli.ArgsTail(args); cmd[0] {
case "create-embeddings":
	fmt.Println(embeddingsModel)
case "chat":
	fmt.Println(chatModel)
default:
	fmt.Println("Unknown command:", cmd[0])
}
```
New samples
- 52-constraints: Preventing an LLM from talking about certain things
- 53-constraints: Preventing an LLM from talking about certain things
- 54-constraints-webapp: Preventing an LLM from talking about certain things
- 55-create-npc: Create an NPC with nemotron-mini and chat with him
- 56-jean-luc-picard: Chat with Jean-Luc Picard
- 57-jean-luc-picard-rag: Chat with Jean-Luc Picard + RAG
- 58-michael-burnham: Chat with Michael Burnham
- 59-jean-luc-picard-contextual-retrieval: Chat with Jean-Luc Picard + Contextual Retrieval
- 60-safety-models: Safety models fine-tuned for content safety classification of LLM inputs and responses
v0.2.0 🍕
What's new in v0.2.0?
New way to set the options

Problem: the `omitempty` tag prevents a field from being serialised if its value is the zero value for the field's type (e.g., 0.0 for float64). That means when `Temperature` equals `0.0`, the field is not serialised, so Ollama will use the `Temperature` default value, which equals `0.8`. The same problem occurs for every value equal to `0` or `0.0`.
Solution(s):

Set all the fields:

```go
options := llm.Options{
	NumPredict:       -1,
	NumKeep:          4,
	Temperature:      0.8,
	TopK:             40,
	TopP:             0.9,
	TFSZ:             1.0,
	TypicalP:         1.0,
	RepeatLastN:      64,
	RepeatPenalty:    1.1,
	PresencePenalty:  0.0,
	FrequencyPenalty: 0.0,
	Mirostat:         0,
	MirostatTau:      5.0,
	MirostatEta:      0.1,
	PenalizeNewline:  true,
	Seed:             -1,
}
```
Default Options + overriding:

```go
options := llm.DefaultOptions()
// override the default value
options.Temperature = 0.5
```
Use the `SetOptions` helper, defining only the fields you want to override:

```go
options := llm.SetOptions(map[string]interface{}{
	"Temperature": 0.5,
})
```

The `SetOptions` helper will set the default values for the fields not defined in the map.
Or use the `SetOptions` helper with the `option` enums:

```go
options := llm.SetOptions(map[string]interface{}{
	option.Temperature: 0.5,
	option.RepeatLastN: 2,
})
```

Note: using the `option` enums avoids typos in the field names, so the resulting options should be more accurate.
New sample:
- 51-genai-webapp: GenAI web application demo