Open Source · Rust

AI agents that work together.
Defined in config. Connected by mesh.

One TOML file = a complete autonomous agent with LLM, tools, scheduling, memory, and mesh networking. Share providers without sharing API keys. Run on device. No cloud, no Python, no credential leaks.

Also available as a Rust library, CLI, VS Code extension, and iOS/Android FFI.

Install in one command

curl -sSf https://query.mt/install.sh | sh

Apple Silicon macOS: clear the quarantine flag after downloading: xattr -dr com.apple.quarantine qmtcode
Rust developers: add to Cargo.toml: querymt = { version = "0.2", features = ["extism_host"] }

Five pillars that make querymt unique

Every feature maps to one of these pillars. Together they form a framework no one else offers.

Three steps from config to running agent

1

Write a TOML file

[agent]
provider = "anthropic"
model = "..."
tools = ["shell", "knowledge_ingest"]

[[mcp]]
name = "github"

[[middleware]]
type = "limits"
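The minimal config above can grow to cover the scheduling, memory, and mesh features mentioned earlier. The fragment below is an illustrative sketch only: the section and key names are assumptions, not querymt's documented schema.

```toml
# Illustrative only: section and key names below are assumptions,
# not querymt's documented schema. Check the project docs for the
# real field names.

[schedule]
cron = "0 9 * * *"        # e.g. run a task every morning at 09:00

[memory]
path = "./knowledge.db"   # local knowledge store, kept on device

[mesh]
listen = "0.0.0.0:7777"   # address other agents use to reach this one
```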
2

Run one command

qmtcode --config agent.toml --dashboard
Agent bootstraps
Loads MCP servers
Creates knowledge store
Starts scheduler
Joins mesh
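The bootstrap sequence above can be sketched as ordered, fail-fast phases. The function names below are hypothetical stand-ins, not querymt's internal API; the point is only that each phase must succeed before the next begins.

```rust
// Hypothetical sketch of the fail-fast bootstrap order described above.
// These functions are illustrative stand-ins, not querymt's internal API.

fn load_mcp_servers() -> Result<(), String> {
    println!("mcp: loaded");
    Ok(())
}

fn create_knowledge_store() -> Result<(), String> {
    println!("knowledge: ready");
    Ok(())
}

fn start_scheduler() -> Result<(), String> {
    println!("scheduler: started");
    Ok(())
}

fn join_mesh() -> Result<(), String> {
    println!("mesh: joined");
    Ok(())
}

fn bootstrap() -> Result<(), String> {
    // Each phase must succeed before the next one starts;
    // `?` aborts the whole bootstrap on the first error.
    load_mcp_servers()?;
    create_knowledge_store()?;
    start_scheduler()?;
    join_mesh()?;
    Ok(())
}

fn main() {
    match bootstrap() {
        Ok(()) => println!("agent: running"),
        Err(e) => eprintln!("bootstrap failed: {e}"),
    }
}
```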
3

Use the dashboard

chat > 💬 Chat with it
cron > 📅 Schedule it
tail > 👁 Watch it run
mode > 🔄 Switch modes

17+ providers, one API

OCI-pulled WASM or native plugins. Sandboxed by default. GPU auto-detection for local inference.

$ qmt providers list
17 adapters
provider-registry status: plugin-ready
[01] plugin: OpenAI ready
[02] plugin: Anthropic ready
[03] plugin: Google ready
[04] plugin: Groq ready
[05] plugin: xAI ready
[06] plugin: Ollama ready
[07] plugin: llama.cpp ready
[08] plugin: Mistral ready
[09] plugin: DeepSeek ready
[10] plugin: Alibaba ready
[11] plugin: Moonshot ready
[12] plugin: Kimi-Code ready
[13] plugin: OpenRouter ready
[14] plugin: Codex ready
[15] plugin: Z.AI ready
[16] plugin: Izwi ready
[17] plugin: MRS ready

Where do you fit in?