Call any LLM from the command line via models.dev.
- API-key based models: works with any provider that uses standard API keys (OpenAI, Anthropic, Google, xAI, Groq, Mistral, etc.)
- Some subscription models work, as long as they use an API key (Z.AI's GLM, Minimax, etc.)
- No OAuth subscription models: does not support models behind an OAuth login (e.g. ChatGPT Plus/Pro, Codex subscriptions)
```sh
npm install -g @blankeos/modelcli  # npm
bun install -g @blankeos/modelcli  # or bun
cargo binstall modelcli            # or cargo-binstall (prebuilt binary, faster)
cargo install modelcli             # or cargo (build from source)
curl -sSL https://raw.githubusercontent.com/Blankeos/modelcli/main/install.sh | sh  # or linux/macos (via curl)
```

```sh
# 1. Connect to a provider (any known provider, thanks to models.dev)
modelcli connect

# 2. Browse models and set a default
modelcli models

# 3. Send a prompt
modelcli "What is the meaning of life?"
```

```
modelcli [OPTIONS] [PROMPT]
```
| Command | Description |
|---|---|
| `connect` | Connect to a provider (add API key) |
| `models` | Browse and manage models |
| Flag | Description |
|---|---|
| `--model <provider/model-id>` | Model to use (overrides default) |
| `--stream` | Stream tokens as they arrive |
| `--thinking` | Show thinking/reasoning tokens |
| `--reasoning-effort <level>` | Reasoning effort: `low`, `medium`, or `high` |
| `--format json` | Output raw JSON instead of human-readable text |
```sh
# Use a specific model
modelcli --model openai/gpt-4o "Explain quicksort"

# Stream the response
modelcli --stream "Write a haiku about Rust"

# Enable reasoning
modelcli --thinking --reasoning-effort high "Prove that √2 is irrational"

# JSON output
modelcli --format json "Hello"
```

You can add any OpenAI-compatible provider not listed on models.dev.
1. Add a credential:

```sh
modelcli connect
# Select "Other (custom provider)" and enter a provider ID and API key
```

2. Configure the provider in `~/.config/modelcli.jsonc` (see the example config below).

Then use it like any other model:

```sh
modelcli --model myprovider/my-model "Hello!"
```

The config file is auto-created the first time you add a custom provider. Both `.jsonc` and `.json` are supported, but not both at the same time.
- Credentials and app data: `~/.local/share/modelcli/`
- Custom provider config: `~/.config/modelcli.jsonc`
modelcli lets you pipe LLM calls directly from your terminal: perfect for generating commit messages in lazygit (see PR #5389), or for powering any other CLI app with AI capabilities. Quickly ask questions, or pipe stdout from other tools to get instant AI-powered responses.
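For instance, pipe-based invocations might look like this (a sketch: it assumes modelcli is installed with a default model set, that you are inside a git repository, and the prompt wording is illustrative):

```shell
# Draft a commit message from the staged diff (hypothetical prompt)
git diff --staged | modelcli "Write a concise commit message for this diff"

# Explain the output of another tool
uname -a | modelcli "Explain what this system info means"
```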
Inspired by OpenCode's seamless multi-provider experience and built on models.dev's unified LLM API.
🦀 Made with Rust: a fast, minimal, and intuitive CLI.
Example `~/.config/modelcli.jsonc` for a custom provider:

```jsonc
{
  "provider": {
    "myprovider": {
      "name": "My AI Provider",
      "baseURL": "https://api.myprovider.com/v1",
      "models": {
        "my-model": {
          "name": "My Model",  // optional display name
          "reasoning": false,  // optional, default false
          "context": 200000,   // optional context window
          "output": 65536      // optional max output tokens
        }
      }
    }
  }
}
```
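Here, "OpenAI-compatible" means the provider accepts the standard chat-completions request shape. A minimal sketch of such a request, using the hypothetical `baseURL`, model ID, and a placeholder key from the example config (nothing is actually sent over the network):

```python
import json

# Hypothetical values mirroring the example config above; placeholder key only.
base_url = "https://api.myprovider.com/v1"
model_id = "my-model"
api_key = "sk-example"  # never hardcode real keys

# Standard OpenAI-compatible chat-completions endpoint and payload.
url = f"{base_url}/chat/completions"
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
payload = {
    "model": model_id,
    "messages": [{"role": "user", "content": "Hello!"}],
}

print(url)
print(json.dumps(payload, indent=2))
```

Any endpoint that answers this shape should work once it is declared in the config and a credential is added via `modelcli connect`.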