Alternative LLM Providers
Overview
You can use Claude Code with alternative LLMs served via Anthropic-compatible APIs. Each provider exposes an endpoint that speaks the Anthropic `/v1/messages` format, so Claude Code works without any code changes: you just set a few environment variables.
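Under the hood this is just the Anthropic Messages API. Here is a minimal sketch of a raw request against the Kimi endpoint used below; the model name is a placeholder you would replace with one your provider accepts, and it assumes the provider honors the Bearer token that Claude Code sends for `ANTHROPIC_AUTH_TOKEN`:

```sh
curl https://api.moonshot.ai/anthropic/v1/messages \
  -H "authorization: Bearer $KIMI_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{
    "model": "YOUR-MODEL-NAME",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```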
The shell functions below use subshells ( ... )
so the environment variables do not leak into your
main shell session. This means you can run multiple
instances of Claude Code simultaneously, each using
a different LLM.
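A quick way to convince yourself of the scoping (`DEMO_VAR` is just a throwaway name for this illustration):

```sh
# Inside the subshell the variable is visible...
( export DEMO_VAR=hello; echo "inside: $DEMO_VAR" )  # prints "inside: hello"
# ...but it never reaches the parent shell.
echo "outside: ${DEMO_VAR:-unset}"                   # prints "outside: unset"
```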
API Key Setup
Before using any provider, export the corresponding API key in your shell (or add it to your `.zshrc` / `.bashrc`):
```sh
export KIMI_API_KEY="your-kimi-key"
export Z_API_KEY="your-z-key"
export DEEPSEEK_API_KEY="your-deepseek-key"
export MINIMAX_API_KEY="your-minimax-key"
```

Shell Functions
Add the function for the provider(s) you want to use to your `~/.zshrc` or `~/.bashrc`, then use it exactly like the `claude` command.
Kimi K2 from Moonshot AI.
```sh
kimi() {
  (
    export ANTHROPIC_BASE_URL=https://api.moonshot.ai/anthropic
    export ANTHROPIC_AUTH_TOKEN=$KIMI_API_KEY
    claude "$@"
  )
}
```

Usage:
```sh
kimi                 # start interactive session
kimi -p "explain X"  # one-shot prompt
```

GLM-4.5 from Zhipu AI (Z.AI).
```sh
zai() {
  (
    export ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic
    export ANTHROPIC_AUTH_TOKEN=$Z_API_KEY
    claude "$@"
  )
}
```

Usage:
```sh
zai                 # start interactive session
zai -p "explain X"  # one-shot prompt
```

DeepSeek v3.1 chat model.
```sh
dseek() {
  (
    export ANTHROPIC_BASE_URL=https://api.deepseek.com/anthropic
    export ANTHROPIC_AUTH_TOKEN=${DEEPSEEK_API_KEY}
    export ANTHROPIC_MODEL=deepseek-chat
    export ANTHROPIC_SMALL_FAST_MODEL=deepseek-chat
    claude "$@"
  )
}
```

DeepSeek requires explicitly setting `ANTHROPIC_MODEL` and `ANTHROPIC_SMALL_FAST_MODEL` because the default Claude model names are not recognized by the DeepSeek API.
Usage:
```sh
dseek                 # start interactive session
dseek -p "explain X"  # one-shot prompt
```

MiniMax M2.1 from MiniMax.
```sh
ccmm() {
  (
    export ANTHROPIC_BASE_URL=https://api.minimax.io/anthropic
    export ANTHROPIC_AUTH_TOKEN=$MINIMAX_API_KEY
    export API_TIMEOUT_MS=3000000
    export CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC=1
    export ANTHROPIC_MODEL=MiniMax-M2.1
    export ANTHROPIC_SMALL_FAST_MODEL=MiniMax-M2.1
    export ANTHROPIC_DEFAULT_SONNET_MODEL=MiniMax-M2.1
    export ANTHROPIC_DEFAULT_OPUS_MODEL=MiniMax-M2.1
    export ANTHROPIC_DEFAULT_HAIKU_MODEL=MiniMax-M2.1
    claude "$@"
  )
}
```

MiniMax requires mapping all model slots to `MiniMax-M2.1` and setting a longer timeout (`API_TIMEOUT_MS`, here 3,000,000 ms, i.e. 50 minutes) since responses can take longer. Non-essential traffic is disabled to avoid unnecessary requests.
Usage:
```sh
ccmm                 # start interactive session
ccmm -p "explain X"  # one-shot prompt
```

How It Works
Each function (a generic template follows the list):

- Opens a subshell so env vars are scoped
- Sets `ANTHROPIC_BASE_URL` to the provider’s Anthropic-compatible endpoint
- Sets `ANTHROPIC_AUTH_TOKEN` to your API key
- Optionally overrides model names (required for DeepSeek and MiniMax)
- Launches `claude` with any arguments you pass
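Put together, the shared pattern looks like this hypothetical template; the provider URL, key variable, and model names are placeholders, not a real provider:

```sh
# Generic template: adapt the URL, key variable, and optional model overrides.
myprovider() {
  (
    export ANTHROPIC_BASE_URL=https://api.example.com/anthropic  # placeholder endpoint
    export ANTHROPIC_AUTH_TOKEN=$MYPROVIDER_API_KEY              # placeholder key variable
    # Only needed when the provider rejects the default Claude model names:
    # export ANTHROPIC_MODEL=provider-model-name
    # export ANTHROPIC_SMALL_FAST_MODEL=provider-model-name
    claude "$@"
  )
}
```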
Because everything runs in a subshell, your main shell environment is unaffected. You can have one terminal running `kimi` and another running `zai` at the same time.
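You can even launch two one-shot prompts from the same shell; a small sketch:

```sh
# Each call gets its own subshell, so the providers don't interfere.
kimi -p "explain X" &
zai -p "explain X" &
wait
```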
See Also
- Local LLMs — run models locally with llama.cpp
- Chutes Integration — use Claude Code with the Chutes provider via a router