Official client libraries for the Koder AI Gateway API. Integrate AI capabilities into your applications with idiomatic, type-safe SDKs for Go, TypeScript, Python, and Dart.
```go
// Quick start with Go
import "flow.koder.dev/koder/koder-ai-sdk/go"

client := sdk.New("your-api-key")
resp, _ := client.Chat.Complete(ctx, sdk.ChatRequest{
    Model: "gpt-4o",
    Messages: []sdk.Message{
        {Role: "user", Content: "Hello!"},
    },
})
fmt.Println(resp.Choices[0].Message.Content)
```
Everything you need to integrate AI into your applications, with best practices baked in.
Full type definitions in every language. Catch errors at compile time, not at runtime. Auto-complete works out of the box in your IDE.
Real-time streaming with idiomatic patterns in each language: a Recv() loop in Go, async generators in TypeScript, async iterators in Python, and Streams in Dart.
Specific error types for auth failures, rate limits, validation errors, and server errors. No more guessing what went wrong.
Drop-in replacement for the OpenAI SDK. Just change the base URL and API key. Switch providers without touching your business logic.
First-class SDKs for Go, TypeScript, Python, and Dart. Same capabilities, idiomatic APIs. Use the language your team knows best.
Automatic retries with exponential backoff for transient errors. Configurable retry count, backoff strategy, and retry-on conditions.
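The behavior described above can be sketched in a few lines of plain Python. The real SDKs expose this through configuration rather than a helper function; the code below is only an illustration of the backoff strategy:

```python
import random
import time

def with_retries(call, max_retries=3, base_delay=0.5,
                 retry_on=(ConnectionError, TimeoutError), sleep=time.sleep):
    """Run `call`, retrying transient failures with exponential backoff.

    Delays grow as base_delay * 2**attempt, with jitter so that many
    clients retrying at once do not stampede the server.
    """
    for attempt in range(max_retries + 1):
        try:
            return call()
        except retry_on:
            if attempt == max_retries:
                raise  # retries exhausted: surface the last error
            delay = base_delay * (2 ** attempt)
            sleep(delay * random.uniform(0.5, 1.0))
```

The `retry_on` tuple mirrors the configurable retry conditions mentioned above; injecting `sleep` keeps the backoff testable.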
Idiomatic Go SDK with full type safety, context support, and streaming via a Recv() loop.
```
go get flow.koder.dev/koder/koder-ai-sdk/go
```

```go
// Streaming chat completion in Go
stream, _ := client.Chat.Stream(ctx, sdk.ChatRequest{
    Model: "gpt-4o",
    Messages: []sdk.Message{
        {Role: "user", Content: "Write a haiku"},
    },
})
for {
    chunk, err := stream.Recv()
    if err == io.EOF {
        break
    }
    if err != nil {
        log.Fatal(err) // non-EOF errors must also end the loop
    }
    fmt.Print(chunk.Choices[0].Delta.Content)
}
```
Modern TypeScript SDK with async/await, full generics, and tree-shakeable ESM imports.
```
npm install @koder/ai-sdk
```

```typescript
// Streaming chat completion in TypeScript
const client = new KoderAI("your-api-key");
const stream = await client.chat.stream({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Write a haiku" }],
});
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0].delta.content);
}
```
Async-first Python SDK with Pydantic models, type hints, and native asyncio support.
```
pip install koder-ai-sdk
```

```python
# Streaming chat completion in Python
async with KoderAI("your-api-key") as client:
    stream = await client.chat.stream(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Write a haiku"}],
    )
    async for chunk in stream:
        print(chunk.choices[0].delta.content, end="")
```
Flutter-ready Dart SDK with strong typing, null safety, and native Stream support.
```
dart pub add koder_ai_sdk
```

```dart
// Streaming chat completion in Dart
final client = KoderAI('your-api-key');
final stream = client.chat.stream(
  ChatCompletionRequest(
    model: 'gpt-4o',
    messages: [Message(role: 'user', content: 'Write a haiku')],
  ),
);
await for (final chunk in stream) {
  stdout.write(chunk.choices[0].delta.content);
}
```
See how Koder AI SDK stacks up against other AI client libraries.
| Feature | Koder AI SDK | OpenAI SDK | Anthropic SDK | Google AI SDK | Direct HTTP |
|---|---|---|---|---|---|
| Multi-provider support | ✓ | — | — | — | ✓ |
| Go SDK | ✓ | ✓ | — | ✓ | Manual |
| TypeScript SDK | ✓ | ✓ | ✓ | ✓ | Manual |
| Python SDK | ✓ | ✓ | ✓ | ✓ | Manual |
| Dart / Flutter SDK | ✓ | — | — | — | Manual |
| Streaming support | ✓ | ✓ | ✓ | ✓ | Manual |
| Auto-retry with backoff | ✓ | ✓ | ✓ | Partial | — |
| OpenAI-compatible API | ✓ | ✓ | — | — | ✓ |
| Open-source | ✓ | ✓ | ✓ | ✓ | N/A |
Yes. The Koder AI Gateway exposes an OpenAI-compatible API. You can use these SDKs as a drop-in replacement for the official OpenAI libraries by pointing them to the Koder AI Gateway endpoint.
The available models depend on your Koder AI Gateway configuration. Use the Models.list() method to discover all models available on your instance, including models from OpenAI, Anthropic, Google, Meta, and Mistral.
Yes. All four SDKs support streaming with idiomatic patterns: Go uses a Recv() loop, TypeScript uses async generators, Python uses async iterators, and Dart uses Streams.
Yes. All SDKs accept a custom HTTP client for advanced use cases like custom TLS certificates, proxies, or logging middleware. You can bring your own transport layer.
Yes. The Koder AI SDK is released under the MIT License. Contributions are welcome on the source repository.
Install the Koder AI SDK, change your base URL to the Koder AI Gateway endpoint, and swap your API key. The request and response formats are identical to OpenAI's, so your existing code works as-is.
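At the wire level, the migration really is just a different base URL and key. The sketch below builds an OpenAI-format request using only the Python standard library; the gateway URL is a placeholder, not a real endpoint:

```python
import json
import urllib.request

# Before: base_url = "https://api.openai.com/v1" with an OpenAI key.
# After: the same payload, pointed at your gateway (placeholder URL below).

def build_chat_request(base_url: str, api_key: str,
                       model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request."""
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps({"model": model, "messages": messages}).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "https://gateway.example.com/v1",  # placeholder gateway endpoint
    "your-koder-api-key",
    "gpt-4o",
    [{"role": "user", "content": "Hello!"}],
)
# urllib.request.urlopen(req) would send it; the response body follows
# the same JSON schema as OpenAI's chat completions endpoint.
```

Only the first two arguments change when you switch providers; the model and messages stay exactly as they were.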
Four languages. One API. Type-safe, streaming, production-ready.