
███╗░░░███╗░█████╗░██████╗░███████╗██╗░░░░░░█████╗░██╗░░░░░██╗
████╗░████║██╔══██╗██╔══██╗██╔════╝██║░░░░░██╔══██╗██║░░░░░██║
██╔████╔██║██║░░██║██║░░██║█████╗░░██║░░░░░██║░░╚═╝██║░░░░░██║
██║╚██╔╝██║██║░░██║██║░░██║██╔══╝░░██║░░░░░██║░░██╗██║░░░░░██║
██║░╚═╝░██║╚█████╔╝██████╔╝███████╗███████╗╚█████╔╝███████╗██║
╚═╝░░░░░╚═╝░╚════╝░╚═════╝░╚══════╝╚══════╝░╚════╝░╚══════╝╚═╝

modelcli

Call any LLM from the command line via models.dev.

  • API-key based models: works with any provider that uses standard API keys (OpenAI, Anthropic, Google, xAI, Groq, Mistral, etc.)
  • Some subscription models: supported as long as they expose an API key (Z.AI's GLM, MiniMax, etc.)
  • No OAuth-only subscriptions: models behind an OAuth login (e.g. ChatGPT Plus/Pro, Codex subscriptions) are not supported

Install

npm install -g @blankeos/modelcli   # npm
bun install -g @blankeos/modelcli   # or bun
cargo binstall modelcli             # or cargo-binstall (prebuilt binary, faster)
cargo install modelcli              # or cargo (build from source)
curl -sSL https://raw.githubusercontent.com/Blankeos/modelcli/main/install.sh | sh   # or Linux/macOS (via curl)
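
To check that the binary landed on your PATH (the --version flag is an assumption here, not something this README documents):

modelcli --version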

Quick Start

# 1. Connect to a provider (any known provider, thanks to models.dev)
modelcli connect

# 2. Browse models and set a default
modelcli models

# 3. Send a prompt
modelcli "What is the meaning of life?"

Usage

modelcli [OPTIONS] [PROMPT]

Commands

Command   Description
connect   Connect to a provider (add API key)
models    Browse and manage models

Options

Flag                          Description
--model <provider/model-id>   Model to use (overrides default)
--stream                      Stream tokens as they arrive
--thinking                    Show thinking/reasoning tokens
--reasoning-effort <level>    Reasoning effort: low, medium, or high
--format json                 Output raw JSON instead of human-readable text

Examples

# Use a specific model
modelcli --model openai/gpt-4o "Explain quicksort"

# Stream the response
modelcli --stream "Write a haiku about Rust"

# Enable reasoning
modelcli --thinking --reasoning-effort high "Prove that √2 is irrational"

# JSON output
modelcli --format json "Hello"
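
# Pipe the JSON output into jq for scripting (the exact response
# shape isn't documented here, so this just pretty-prints it)
modelcli --format json "Hello" | jq .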

Custom Providers

You can add any OpenAI-compatible provider not listed on models.dev.

1. Add a credential:

modelcli connect
# Select "Other (custom provider)" → enter a provider ID and API key

2. Configure the provider in ~/.config/modelcli.jsonc:

{
  "provider": {
    "myprovider": {
      "name": "My AI Provider",
      "baseURL": "https://api.myprovider.com/v1",
      "models": {
        "my-model": {
          "name": "My Model", // optional display name
          "reasoning": false, // optional, default false
          "context": 200000, // optional context window
          "output": 65536, // optional max output tokens
        },
      },
    },
  },
}

Then use it like any other model:

modelcli --model myprovider/my-model "Hello!"

The config file is auto-created the first time you add a custom provider. Both .jsonc and .json are supported, but not both at the same time.
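
As a concrete sketch: Ollama serves an OpenAI-compatible API at http://localhost:11434/v1, so a local setup could be described with a config like the one below. The provider ID ollama and model ID llama3.2 are placeholders for whatever you run locally, and this assumes any dummy value works as the API key during connect, since Ollama ignores it.

{
  "provider": {
    "ollama": {
      "name": "Ollama (local)",
      "baseURL": "http://localhost:11434/v1",
      "models": {
        "llama3.2": { "name": "Llama 3.2" }, // any model you've pulled locally
      },
    },
  },
}

modelcli --model ollama/llama3.2 "Hello!"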

Data Storage

  • Credentials and app data: ~/.local/share/modelcli/
  • Custom provider config: ~/.config/modelcli.jsonc

Motivation

modelcli lets you pipe LLM calls directly from your terminal, which is perfect for generating commit messages in lazygit (see PR #5389) or powering any other CLI app with AI capabilities. Quickly ask questions, or pipe stdout from other tools to get instant AI-powered responses.
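
For instance, a pipeline like this (the prompt and diff source are hypothetical; it assumes modelcli reads piped stdin as context for the prompt, as described above):

git diff --staged | modelcli "Write a one-line conventional commit message for this diff"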

Inspired by OpenCode's seamless multi-provider experience and built on models.dev's unified LLM API.

🦀 Made with Rust: a fast, minimal, and intuitive CLI.
