Turn any MCP server into a CLI in one command. Cut the token cost of AI agents by 60–80%, every call, forever.
git clone https://github.com/cli-use/cli-use.git
cd cli-use
pip install -e .
cli-use add fs /tmp # install + register + emit SKILL.md
cli-use fs list_directory --path /tmp

That's it. The filesystem MCP server is now a native CLI, usable by agents, scripts, humans, and anything that can `subprocess.run`.
`cli-use` makes MCP usable from any shell, at a fraction of the tokens.
Every MCP tool call burns tokens on three things an agent shouldn't pay for:
- Schema discovery — a verbose JSON schema per tool, loaded into context every session (~200–500 tokens each).
- Request framing — JSON-RPC envelope + full argument JSON for every call.
- Response parsing — the server answers in JSON content blocks even when the useful part is one line of text.
Replace the whole chain with a terse CLI and you get native Unix composition and real savings.
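To make the framing overhead concrete, compare the JSON-RPC envelope for one call with the equivalent CLI line. This is an illustrative sketch using character counts as a rough proxy; real token counts depend on the tokenizer, and the exact envelope varies by transport.

```python
import json

# A JSON-RPC tools/call request wrapping one filesystem call.
rpc_request = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "list_directory",
        "arguments": {"path": "/tmp"},
    },
})

# The same call as a terse CLI line.
cli_request = "cli-use fs list_directory --path /tmp"

print(len(rpc_request), len(cli_request))  # the envelope is more than twice as long
```

And that is before counting the per-tool schema the agent must hold in context to produce the envelope at all.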
| session size | MCP tokens | cli-use tokens | savings |
|---|---|---|---|
| 1 call | 2,118 | 440 | 79% |
| 5 calls | 2,425 | 565 | 77% |
| 20 calls | 3,577 | 1,033 | 71% |
| 100 calls | 9,721 | 3,529 | 64% |
Fixed discovery cost: 80% cheaper. Per-call I/O: 59% cheaper.
Reproduce with `python examples/real_benchmark.py`.
# 1) Install an MCP server as a cli-use alias
$ cli-use add fs /tmp
cli-use: emitted skill → skills/fs
cli-use: 'fs' installed (14 tools)
# 2) Call any tool like a regular CLI
$ cli-use fs list_directory --path /tmp
notes.md
report.pdf
...
# 3) Compose with the rest of Unix
$ cli-use fs search_files --path /tmp --pattern "*.md" | head
$ cli-use fs read_text_file --path /tmp/notes.md | grep TODO

Every add also drops a Vercel-format SKILL.md in skills/<alias>/ and a pointer in AGENTS.md, so any agent working in that repo learns the CLI automatically, with zero human prompting required.
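Because every tool is now plain argv, any agent runtime that can spawn a subprocess can drive it. A minimal helper might look like this (a sketch that assumes `cli-use` is on PATH; the helper names are hypothetical, not part of the SDK):

```python
import subprocess

def build_cmd(alias: str, tool: str, **flags: str) -> list[str]:
    """Turn alias/tool/flags into the argv shape cli-use expects."""
    cmd = ["cli-use", alias, tool]
    for name, value in flags.items():
        cmd += [f"--{name.replace('_', '-')}", str(value)]
    return cmd

def call(alias: str, tool: str, **flags: str) -> str:
    """Run the CLI and return its plain-text stdout."""
    return subprocess.run(
        build_cmd(alias, tool, **flags),
        capture_output=True, text=True, check=True,
    ).stdout

# e.g. call("fs", "list_directory", path="/tmp") runs:
#   cli-use fs list_directory --path /tmp
```

Plain-text stdout means the caller needs no JSON parsing for the common case.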
For interactive workflows, keep an MCP server hot in the background instead of paying the process-spawn tax on every call.
| Mode | Latency per call |
|---|---|
| One-shot (default) | ~300–500 ms |
| Daemon | < 10 ms |
Start a daemon for any alias:
$ cli-use daemon start fs
Daemon for fs started on 127.0.0.1:49231.

From now on, every `cli-use fs ...` call reuses the same stdio connection automatically: no flags needed, no JSON-RPC re-handshake.
$ cli-use fs list_directory --path /tmp
notes.md
report.pdf
$ cli-use fs read_text_file --path /tmp/notes.md
# instant, no process spawn

Manage background processes:
$ cli-use daemon list
fs 127.0.0.1:49231
$ cli-use daemon stop fs

If the daemon is not running, commands fall back to the default one-shot behavior. The daemon is pure stdlib Python and talks HTTP over localhost, with zero extra dependencies.
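The fallback decision can be approximated in a few lines of stdlib Python: probe the daemon's localhost port, and if nothing answers, take the one-shot path. This is a sketch of the idea, not cli-use's actual implementation (the real CLI would record each alias's port rather than hard-code one):

```python
import socket

def daemon_alive(host: str, port: int, timeout: float = 0.2) -> bool:
    """Probe the daemon's localhost port; a refused connection means no daemon."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def choose_mode(host: str = "127.0.0.1", port: int = 49231) -> str:
    # Spawn a fresh MCP server process when no daemon answers.
    return "daemon" if daemon_alive(host, port) else "one-shot"
```

A failed probe costs milliseconds, so the check is essentially free relative to the ~300–500 ms one-shot spawn it falls back to.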
10 MCP servers are pre-wired — see them with cli-use list:
fs Read, write, list, and search files on disk.
memory Knowledge-graph memory: entities, relations, observations.
gh Interact with GitHub: issues, PRs, repos.
git Git log, diff, status, blame.
sqlite Query and inspect SQLite databases.
time Current time and timezone conversions.
fetch Fetch a URL and return HTML/markdown.
puppeteer Headless-browser automation.
brave Web search via Brave Search API.
slack Send messages, list channels, read history.
Install sources: npm, pip, pipx, local commands. Add your own in one line:
cli-use add my-server --from "npm:@acme/my-mcp-server"
cli-use add weather --from "pip:weather-mcp-server"
cli-use add dev-tools --from "local:python /opt/mcp/server.py"
cli-use add fs /tmp                 # → cli-use fs <tool>

Everything is installed, cached, and exposed as `cli-use <alias> <tool> --flag value`.
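The `--from` spec is just a prefixed string, so parsing it is trivial. A hypothetical sketch of how the installer prefix could be split out (the function name and fallback behavior are assumptions, not cli-use's API):

```python
def parse_source(spec: str) -> tuple[str, str]:
    """Split an install source like 'npm:@acme/my-mcp-server' into
    (installer, package). Bare names fall back to the built-in registry."""
    installers = ("npm", "pip", "pipx", "local")
    head, sep, rest = spec.partition(":")
    if sep and head in installers:
        return head, rest
    return "registry", spec
```

Note that the `local:` payload keeps its spaces intact, since only the first colon is significant.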
# hello_cli.py
from cli_use import agent_tool, run_cli

@agent_tool
def greet(name: str, shout: bool = False) -> str:
    """Greet someone by name."""
    msg = f"hello {name}"
    return msg.upper() if shout else msg

if __name__ == "__main__":
    run_cli(emit_skill=True, alias="hello")

$ python hello_cli.py greet --name world
hello world

emit_skill=True drops a skills/hello/SKILL.md and an AGENTS.md entry alongside the script, so your custom CLI is also self-documenting to agents.
Boolean flags support both forms: --flag and --no-flag.
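One plausible way to get that paired-flag behavior is argparse's `BooleanOptionalAction` (Python 3.9+), which generates `--flag` / `--no-flag` from a single declaration. This is an assumption about how it could be implemented, shown as a sketch:

```python
import argparse

# BooleanOptionalAction registers both --shout and --no-shout automatically.
parser = argparse.ArgumentParser(prog="hello")
parser.add_argument("--shout", action=argparse.BooleanOptionalAction, default=False)

def parse_shout(argv: list[str]) -> bool:
    """Parse an argv fragment and return the resolved boolean."""
    return parser.parse_args(argv).shout
```

The default applies when neither form is given, so `greet --name world` still works unshouted.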
| Protocol | Universal shell client |
|---|---|
| HTTP | curl |
| Kubernetes API | kubectl |
| Docker daemon | docker |
| MCP | cli-use |
- Zero runtime deps. Pure Python stdlib once installed from the repo.
- Terse `--help` by design. An agent learns a 14-tool CLI in ~400 tokens instead of ~2,000.
- Plain-text stdout. Pipes to `jq`, `grep`, `awk`, and `xargs` without ceremony.
- Persistent aliases. Install once; every project in your shell inherits it.
- Skills by default. Agents pick up the CLI from SKILL.md/AGENTS.md with no hand-holding.
- Smoke-tested on Linux, macOS, and Windows.
Repeated identical calls are served from disk (TTL 5 min), bypassing even the daemon:
# First call — goes to MCP server
cli-use fs list_directory --path /tmp
# Second call — instant, from disk cache
cli-use fs list_directory --path /tmp

Cache stored in `~/.cli-use/cache/`. Zero config.
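A TTL'd disk cache of roughly this shape is enough to explain the behavior: hash the full call into a key, store the output with a timestamp, and treat anything older than five minutes as a miss. A sketch only; the actual on-disk format is not documented here:

```python
import hashlib
import json
import time
from pathlib import Path

TTL_SECONDS = 300  # 5-minute TTL, matching the documented default

def cache_key(alias: str, tool: str, args: dict) -> str:
    """Stable key: identical calls hash to the same file name."""
    raw = json.dumps([alias, tool, args], sort_keys=True)
    return hashlib.sha256(raw.encode()).hexdigest()

def cache_get(cache_dir: Path, key: str):
    """Return the cached output, or None on a miss or expiry."""
    path = cache_dir / key
    if not path.exists():
        return None
    entry = json.loads(path.read_text())
    if time.time() - entry["ts"] > TTL_SECONDS:
        return None  # expired: caller re-runs the tool
    return entry["out"]

def cache_put(cache_dir: Path, key: str, out: str) -> None:
    """Write the output alongside its timestamp."""
    cache_dir.mkdir(parents=True, exist_ok=True)
    (cache_dir / key).write_text(json.dumps({"ts": time.time(), "out": out}))
```

Sorting the argument keys before hashing makes `--path /tmp` hit the same entry regardless of flag order.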
Run a JSON pipeline across multiple aliases, with {{out:N}} substitution between steps:
$ cat ops.json
[
{"alias": "fs", "tool": "read_file", "arguments": {"path": "report.md"}},
{"alias": "gh", "tool": "create_issue", "arguments": {"title": "Bug", "body": "{{out:0}}"}}
]
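Before running it, here is what `{{out:N}}` does, as a minimal pure-Python sketch (hypothetical, not cli-use's actual code): each step's output is captured, and later steps can splice it into their arguments by index.

```python
import re

def substitute(value, outputs):
    """Replace {{out:N}} placeholders with the captured output of step N."""
    if isinstance(value, str):
        return re.sub(r"\{\{out:(\d+)\}\}", lambda m: outputs[int(m.group(1))], value)
    if isinstance(value, dict):
        return {k: substitute(v, outputs) for k, v in value.items()}
    if isinstance(value, list):
        return [substitute(v, outputs) for v in value]
    return value

def run_batch(ops, call):
    """Run steps in order; each step sees the outputs of all earlier steps."""
    outputs = []
    for op in ops:
        args = substitute(op["arguments"], outputs)
        outputs.append(call(op["alias"], op["tool"], args))
    return outputs
```

In the ops.json above, step 1's issue body receives the file contents read by step 0.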
$ cli-use batch ops.json
$ cli-use batch ops.json --format json # structured output
$ cli-use batch ops.json --continue-on-error

Generate an OpenAPI 3.0 spec from any cached alias:
$ cli-use openapi fs
$ cli-use openapi fs gh --out api_spec.json

Generate bash completion scripts dynamically from cached tool schemas:
$ source <(cli-use completions --shell bash)
$ cli-use fs <TAB> # autocompletes tool names
$ cli-use fs read_file <TAB>     # autocompletes --flags

Lower-level commands: convert a server to a standalone script, call a tool one-off, or list a server's tools without installing an alias:

cli-use convert "npx -y @modelcontextprotocol/server-filesystem /tmp" \
    --out ./fs-cli.py --emit-skill --alias fs
cli-use run "python mock_mcp.py" greet --arguments '{"name":"YC"}'
cli-use mcp-list "python mock_mcp.py" --format json

- v0.1 — one-shot CLI, built-in registry, SKILL.md / AGENTS.md emission.
- v0.2 — daemon mode (keep MCP hot, <10 ms per call).
- v0.3 — Call caching, batch/pipe mode across aliases, OpenAPI export, shell autocomplete.
- v0.4 — coming soon.
Alpha. Python 3.10+. Zero runtime deps.
MIT © 2026 cli-use contributors.
