
Echo - Simple LLM Client Library

A lightweight Go library for interacting with various LLM providers with a simple, unified API.

Supported Providers

  • OpenAI
  • Anthropic
  • Google
  • OpenRouter (via OpenAI-compatible API)
  • xAI (Grok)
  • Local CLI tools (Claude Code, Codex, Gemini) - opt-in, see Local CLI Providers

Installation

go get github.com/mkozhukh/echo

Quick Start

Universal Client

The NewCommonClient function creates a client that auto-configures providers from API keys:

package main

import (
    "context"
    "fmt"
    "github.com/mkozhukh/echo"
)

func main() {
    ctx := context.Background()

    // Create client with explicit API keys
    client, err := echo.NewCommonClient(map[string]string{
        "openai": "your-openai-key",
    }, echo.WithModel("openai/gpt-5"))
    if err != nil {
        panic(err)
    }

    // Simple call using QuickMessage helper
    resp, err := client.Complete(ctx, echo.QuickMessage("Hello, how are you?"))
    if err != nil {
        panic(err)
    }
    fmt.Println(resp.Text)
}

Pass nil as keys to auto-detect API keys from environment variables:

client, err := echo.NewCommonClient(nil, echo.WithModel("openai/gpt-5"))

Provider-Specific Clients

If you only need a single provider, use the dedicated constructors:

// OpenAI
client := echo.NewOpenAIClient("your-api-key", "gpt-5")

// Anthropic
client := echo.NewAnthropicClient("your-api-key", "claude-sonnet-4-5")

// Google
client := echo.NewGoogleClient("your-api-key", "gemini-2.5-pro")

// xAI (Grok)
client := echo.NewXAIClient("your-api-key", "grok-4-0709")

// Voyage AI (embeddings & reranking)
client := echo.NewVoyageClient("your-api-key", "voyage-4-large")

These constructors accept the same CallOption values as NewCommonClient:

client := echo.NewOpenAIClient("your-api-key", "gpt-5",
    echo.WithSystemMessage("You are a helpful assistant."),
    echo.WithTemperature(0.7),
)

Model Aliases

Use convenient aliases instead of full model names:

// Quality tiers available for each provider:
// - best: Highest quality model
// - balanced: Good balance of quality and speed
// - light: Fast and economical

client, _ := echo.NewCommonClient(nil, echo.WithModel("openai/best"))      // Uses gpt-5.2
client, _ := echo.NewCommonClient(nil, echo.WithModel("anthropic/balanced")) // Uses claude-opus-4-5
client, _ := echo.NewCommonClient(nil, echo.WithModel("google/light"))      // Uses gemini-2.5-flash
client, _ := echo.NewCommonClient(nil, echo.WithModel("xai/best"))          // Uses grok-4-0709
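Conceptually, alias resolution is just a lookup from a `provider/tier` spec to a concrete model name. A minimal self-contained sketch (the table below is illustrative; the library's real mapping is internal and may change between releases):

```go
package main

import "fmt"

// aliases is a hypothetical alias table mirroring the examples above;
// the library maintains the actual mapping internally.
var aliases = map[string]string{
	"openai/best":        "gpt-5.2",
	"anthropic/balanced": "claude-opus-4-5",
	"google/light":       "gemini-2.5-flash",
	"xai/best":           "grok-4-0709",
}

// resolveModel returns the concrete model behind an alias,
// or the spec unchanged when it is already a full model name.
func resolveModel(spec string) string {
	if m, ok := aliases[spec]; ok {
		return m
	}
	return spec
}

func main() {
	fmt.Println(resolveModel("openai/best"))  // resolves via the table
	fmt.Println(resolveModel("openai/gpt-5")) // passes through unchanged
}
```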

Environment Variables

The library supports flexible environment variable configuration:

// Set default model and API key
os.Setenv("ECHO_MODEL", "anthropic/balanced")
os.Setenv("ECHO_KEY", "your-api-key")

// Create client without parameters - uses env vars
client, _ := echo.NewCommonClient(nil)

// Or use provider-specific API keys
os.Setenv("OPENAI_API_KEY", "your-openai-key")
os.Setenv("ANTHROPIC_API_KEY", "your-anthropic-key")
os.Setenv("GOOGLE_API_KEY", "your-google-key")
os.Setenv("XAI_API_KEY", "your-xai-key")

// API key is automatically selected based on provider
client, _ := echo.NewCommonClient(nil, echo.WithModel("openai/gpt-5"))
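The provider-to-variable selection amounts to extracting the provider prefix from the model spec and looking up the matching environment variable. A sketch of that lookup (function name is illustrative, not part of the library's API):

```go
package main

import (
	"fmt"
	"strings"
)

// keyEnvVar mirrors the provider-specific variables listed above.
var keyEnvVar = map[string]string{
	"openai":    "OPENAI_API_KEY",
	"anthropic": "ANTHROPIC_API_KEY",
	"google":    "GOOGLE_API_KEY",
	"xai":       "XAI_API_KEY",
}

// envForModel takes a "provider/model" spec and returns the
// environment variable the API key would be read from.
func envForModel(spec string) string {
	provider := strings.SplitN(spec, "/", 2)[0]
	return keyEnvVar[provider]
}

func main() {
	fmt.Println(envForModel("openai/gpt-5"))   // OPENAI_API_KEY
	fmt.Println(envForModel("xai/grok-4-0709")) // XAI_API_KEY
}
```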

Message Chains

The library supports three ways to create message chains for conversations:

1. QuickMessage - Simple Single Messages

For basic single-message prompts:

resp, _ := client.Complete(ctx, echo.QuickMessage("Tell me a joke"))

2. TemplateMessage - Multi-Message Templates

For readable multi-turn conversations using a text template:

messages := echo.TemplateMessage(`
@system:
You are a helpful math tutor.

@user:
What is 2+2?

@agent:
2+2 equals 4.

@user:
Can you explain why?
`)

resp, err := client.Complete(ctx, messages)

Template format:

  • @role: markers separate messages (system, user, agent)
  • Content follows until the next marker or end of template
  • Content can be on the same line: @user: Hello there!
  • Multiline content is supported
  • Whitespace is automatically trimmed
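To make the format concrete, here is a simplified stand-alone parser for the `@role:` template syntax. It is a sketch of the rules listed above, not the library's actual implementation (the `msg` type and `parseTemplate` function are illustrative):

```go
package main

import (
	"fmt"
	"strings"
)

// msg mirrors the shape of echo.Message for this illustration.
type msg struct{ Role, Content string }

// parseTemplate splits a template on @role: markers, attaching
// following lines to the current message and trimming whitespace.
func parseTemplate(tpl string) []msg {
	var out []msg
	for _, line := range strings.Split(tpl, "\n") {
		trimmed := strings.TrimSpace(line)
		if strings.HasPrefix(trimmed, "@") {
			if i := strings.Index(trimmed, ":"); i > 0 {
				// New message: role between '@' and ':',
				// same-line content after the ':'.
				out = append(out, msg{
					Role:    trimmed[1:i],
					Content: strings.TrimSpace(trimmed[i+1:]),
				})
				continue
			}
		}
		if len(out) > 0 && trimmed != "" {
			// Continuation line for the current message.
			cur := &out[len(out)-1]
			if cur.Content != "" {
				cur.Content += "\n"
			}
			cur.Content += trimmed
		}
	}
	return out
}

func main() {
	msgs := parseTemplate("@system:\nBe brief.\n\n@user: Hello")
	fmt.Println(len(msgs), msgs[0].Role, msgs[1].Content)
}
```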

3. Manual Message Construction

For programmatic message building:

messages := []echo.Message{
    {Role: echo.System, Content: "You are a helpful assistant."},
    {Role: echo.User, Content: "Hello"},
    {Role: echo.Agent, Content: "Hi! How can I help you today?"},
    {Role: echo.User, Content: "What's the weather like?"},
}

resp, err := client.Complete(ctx, messages)

Message Roles

  • echo.System - System instructions (must be first if present, only one allowed)
  • echo.User - User messages
  • echo.Agent - Assistant/model messages (maps to "assistant" for OpenAI/Anthropic, "model" for Gemini)
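The agent-role mapping described above can be pictured as a small translation step per provider. A hedged sketch (the function and string constants are illustrative, not the library's internals):

```go
package main

import "fmt"

// providerRole translates the library's neutral role names into
// the wire-format role each provider expects.
func providerRole(role, provider string) string {
	if role == "agent" {
		if provider == "google" {
			return "model" // Gemini uses "model"
		}
		return "assistant" // OpenAI, Anthropic, and compatible APIs
	}
	return role // system and user pass through unchanged
}

func main() {
	fmt.Println(providerRole("agent", "openai")) // assistant
	fmt.Println(providerRole("agent", "google")) // model
	fmt.Println(providerRole("user", "openai"))  // user
}
```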

Options and Configuration

Client Creation with Options

// Set defaults at client creation time
client, _ := echo.NewCommonClient(nil,
    echo.WithModel("google/best"),
    echo.WithSystemMessage("You are a creative assistant."),
    echo.WithTemperature(0.8),
)

// Use client defaults
resp, _ := client.Complete(ctx, echo.QuickMessage("Tell me a joke"))

// Override defaults for specific calls
resp, _ = client.Complete(ctx, echo.QuickMessage("Write a formal email"),
    echo.WithTemperature(0.2), // More deterministic
)

Dynamic Provider Switching

The library supports switching providers on a per-call basis using WithModel:

// Create a client with a default provider
client, _ := echo.NewCommonClient(nil, echo.WithModel("openai/gpt-4"))

// Use different providers for different calls
resp1, _ := client.Complete(ctx, echo.QuickMessage("Analyze this text"),
    echo.WithModel("anthropic/claude-3.5-sonnet"), // Use Anthropic for analysis
)

resp2, _ := client.Complete(ctx, echo.QuickMessage("Generate an image description"),
    echo.WithModel("google/gemini-2.5-pro"), // Use Google for creative tasks
)

resp3, _ := client.Complete(ctx, echo.QuickMessage("Quick calculation"),
    echo.WithModel("openai/gpt-5-mini"), // Use a lighter model for simple tasks
)

Per-Call Options

resp, err := client.Complete(ctx, echo.QuickMessage("Write a story"),
    echo.WithTemperature(0.7),
    echo.WithMaxTokens(100),
    echo.WithSystemMessage("You are a creative writer."),
)

Available Options

  • WithModel(string) - Override model for this call
  • WithTemperature(float32) - Control randomness (0.0 - 1.0)
  • WithMaxTokens(int) - Limit response length
  • WithSystemMessage(string) - Set or override system prompt (overrides any system message in the message chain)
  • WithBaseURL(string) - Override the API base URL (useful for custom endpoints)
  • WithEndPoint(string) - Specify endpoint routing (primarily for OpenRouter provider selection)
  • WithStoreData(bool) - Control server-side storage (xAI only, defaults to false for privacy)

Streaming Responses

For real-time streaming of responses, use the StreamComplete method:

Basic Streaming

streamResp, err := client.StreamComplete(ctx, echo.QuickMessage("Write a short story"))
if err != nil {
    panic(err)
}

StreamComplete vs Complete

  • Complete: Returns complete response after generation finishes
  • StreamComplete: Returns chunks as they're generated for real-time display

Both methods support the same options.
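The README does not show how the stream is consumed, so here is the generic consumption pattern, written against a hypothetical channel-of-chunks shape (the `chunk` type and channel interface below are assumptions; check the library's actual stream response type):

```go
package main

import (
	"fmt"
	"strings"
)

// chunk stands in for a streamed response fragment; the library's
// actual stream type may expose a different interface.
type chunk struct{ Text string }

// consume drains a channel of chunks, printing each fragment as it
// arrives and returning the accumulated text - the typical loop
// around any streaming completion API.
func consume(ch <-chan chunk) string {
	var b strings.Builder
	for c := range ch {
		fmt.Print(c.Text) // render incrementally
		b.WriteString(c.Text)
	}
	return b.String()
}

func main() {
	// Simulate a stream with a buffered channel.
	ch := make(chan chunk, 3)
	for _, t := range []string{"Once ", "upon ", "a time"} {
		ch <- chunk{Text: t}
	}
	close(ch)
	full := consume(ch)
	fmt.Println("\nfull:", full)
}
```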

Using in Tests

The "mock" provider can be used in tests; it returns a combined string of all incoming messages:

client, _ := echo.NewCommonClient(nil, echo.WithModel("mock/any"))
mockResp, err := client.Complete(ctx, echo.QuickMessage("test"))
if err != nil {
    panic(err)
}
// outputs: `[user]: test`

Using OpenRouter

OpenRouter provides access to multiple LLM providers through a single API:

// Basic usage with any OpenRouter model
client, _ := echo.NewCommonClient(map[string]string{
    "openrouter": "your-openrouter-key",
}, echo.WithModel("openrouter/claude-3.5-sonnet"))

You can also specify which underlying provider infrastructure to use:

// Specify provider routing with @ syntax in model name
client, _ := echo.NewCommonClient(nil, echo.WithModel("openrouter/claude-3.5-sonnet@aws"))

// Multiple providers for fallback (comma-separated)
client, _ := echo.NewCommonClient(nil, echo.WithModel("openrouter/gpt-4@azure,openai"))
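The `@` routing syntax decomposes into a model name plus an ordered provider list. A sketch of that split (the library parses this internally; the function here is for illustration):

```go
package main

import (
	"fmt"
	"strings"
)

// splitRouting separates "model@provider1,provider2" into the model
// name and the comma-separated infrastructure preference list.
func splitRouting(model string) (name string, providers []string) {
	parts := strings.SplitN(model, "@", 2)
	name = parts[0]
	if len(parts) == 2 {
		providers = strings.Split(parts[1], ",")
	}
	return
}

func main() {
	name, provs := splitRouting("gpt-4@azure,openai")
	fmt.Println(name, provs) // model name, then fallback order
}
```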

Using xAI (Grok)

xAI provides access to Grok models:

// Basic usage
client, _ := echo.NewCommonClient(map[string]string{
    "xai": "your-xai-key",
}, echo.WithModel("xai/grok-4-0709"))

// Server-side storage is disabled by default for privacy
// To explicitly enable storage:
resp, _ := client.Complete(ctx, echo.QuickMessage("Hello"),
    echo.WithStoreData(true),
)

Local CLI Providers

If you have an agentic CLI installed locally (Claude Code, OpenAI Codex, Gemini CLI) and want to reuse its subscription for plain text-in/text-out completions, the library can route requests through the local binary instead of hitting a remote API. The CLI is launched in non-interactive mode, the assembled prompt is piped in, and stdout is returned as the response.

Because this executes local binaries, CLI providers are not registered automatically. You must opt in with an explicit call:

client, _ := echo.NewCommonClient(nil)
echo.EnableLocalCLI(client) // registers claude-cli, codex-cli, gemini-cli

resp, err := client.Complete(ctx, echo.QuickMessage("explain channels"),
    echo.WithModel("claude-cli/opus"),
)

Model naming follows the usual provider/model convention, where the second segment is forwarded to the CLI as the model flag:

  • claude-cli/<model> - runs claude -p --output-format text --model <model>
  • codex-cli/<model> - runs codex exec -m <model> <prompt>
  • gemini-cli/<model> - runs gemini -p -m <model> <prompt>
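The commands above can be pictured as a per-provider argument builder. A sketch matching the listed invocations (note that for claude-cli the prompt is piped via stdin rather than passed as an argument, per the description above; the function itself is illustrative, not the library's internals):

```go
package main

import "fmt"

// cliArgs builds the command line each CLI provider would run,
// forwarding the model segment as the model flag.
func cliArgs(provider, model, prompt string) []string {
	switch provider {
	case "claude-cli":
		// Prompt is piped to stdin, so it is not part of the args.
		return []string{"claude", "-p", "--output-format", "text", "--model", model}
	case "codex-cli":
		return []string{"codex", "exec", "-m", model, prompt}
	case "gemini-cli":
		return []string{"gemini", "-p", "-m", model, prompt}
	}
	return nil
}

func main() {
	fmt.Println(cliArgs("claude-cli", "opus", "explain channels"))
	fmt.Println(cliArgs("codex-cli", "gpt-5", "explain channels"))
}
```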

Binary paths can be overridden via environment variables when the tools aren't on PATH or a specific version is required:

  • ECHO_CLAUDE_CLI_PATH
  • ECHO_CODEX_CLI_PATH
  • ECHO_GEMINI_CLI_PATH

Only the WithModel and WithSystemMessage options have any effect on a CLI call - the CLIs do not expose a uniform way to set temperature, max tokens, or structured output, so those options are silently ignored. Embeddings and reranking are likewise unsupported.

For full control (custom flags, alternative binaries) instantiate CLIProvider directly and register it with any provider name you like:

client.SetProvider("claude-cli", &echo.CLIProvider{
    Binary:    "/opt/claude/claude",
    ModelFlag: "--model",
    ExtraArgs: []string{"-p", "--output-format", "text"},
})

License

MIT
