Conversation Lifecycle Controller for Go. Orchestrate. Don't Implement.

Manage LLM-powered chat interactions with clean separation between conversation plumbing and reasoning logic. History, turn-taking, and streaming are handled for you — your processor supplies the brain.

Get Started
import "github.com/zoobz-io/chit"

// Your reasoning logic — any LLM, any strategy
processor := chit.ProcessorFunc(func(
    ctx context.Context,
    input string,
    session *zyn.Session,
) (chit.Result, error) {
    // Internal reasoning stays internal
    response, err := llm.Complete(ctx, session, input)
    if err != nil {
        return nil, err
    }
    return &chit.Response{Content: response}, nil
})

// Dual-channel emitter — text + structured data
emitter := chit.EmitterFunc(
    func(text string) { stream(text) },       // Emit: conversational text
    func(resource any) { pushToUI(resource) }, // Push: structured resources
)

// Chat handles state, history, streaming
chat := chit.New(processor, emitter)
chat.Handle(ctx, "What's the status of order #42?")

// Multi-turn with Yield — no callback hell
processor = chit.ProcessorFunc(func(ctx context.Context, input string, session *zyn.Session) (chit.Result, error) {
    return &chit.Yield{
        Content: "I need your email to proceed.",
        Continue: func(ctx context.Context, reply string, s *zyn.Session) (chit.Result, error) {
            return &chit.Response{Content: "Confirmed: " + reply}, nil
        },
    }, nil
})
89% Test Coverage
A+ Go Report

Why Chit?

Thin orchestration that separates conversation plumbing from reasoning logic.

Clean History Separation

User-facing conversation stays unpolluted. Internal LLM reasoning, retries, and tool calls are the processor's business.

Yield/Continue Turn-Taking

Multi-step workflows without callback hell. Return a Yield with a Continuation for natural multi-turn flows.

Dual-Channel Emitter

Emit conversational text and push structured resources simultaneously. Rich UIs get typed data alongside the conversation.

Pipeline-Native Reliability

Built on pipz for composable retry, timeout, rate limiting, and circuit breaker. Continuations get the same reliability wrappers.

Observable Lifecycle

Capitan signals for ChatCreated, InputReceived, ProcessingStarted, ProcessingCompleted, ResponseEmitted, TurnYielded, TurnResumed.

Bring Your Own LLM

Works with any LLM client via the Processor interface. Cogito, OpenAI SDK, or your own — chit doesn't care.

Capabilities

Conversation orchestration with pluggable reasoning, multi-turn flows, and built-in reliability.

Feature | Description | Link
Processor Interface | Pluggable reasoning logic. ProcessorFunc for simple cases, full interface for complex workflows. LLM-agnostic. | Concepts
Multi-Turn Workflows | Yield/Continue pattern for natural conversation branching. Continuations run through the same reliability pipeline. | Architecture
Reliability Patterns | Retry with backoff, timeout protection, circuit breakers, and rate limiting via pipz composition. | Reliability
Session Integration | Built on zyn sessions for typed conversation history. Transactional updates — session only changes on success. | Concepts
Testing | Mock processors and emitters for deterministic tests. Processors testable in isolation without chat orchestration. | Testing
Troubleshooting | Common issues with processor wiring, emitter setup, and continuation state management. | Troubleshooting

Articles

Browse the full chit documentation.

Learn

Overview: Conversation lifecycle management for LLM-powered applications
Quickstart: Get up and running with chit in minutes
Concepts: Core abstractions in chit
Architecture: How chit works internally

Guides

Testing Guide: How to test code that uses chit
Troubleshooting Guide: Common issues and solutions when using chit
Reliability Guide: Adding retry, timeout, rate limiting, and middleware to chit

Reference

API Reference: Complete function documentation for chit
Types Reference: Complete type documentation for chit