Every developer knows the small friction that compounds into lost momentum. You need a priority fee sample for an address touching Raydium, so you tab over to the dashboard and click through menus. Your AI agent invents gRPC field names from outdated Yellowstone protos because it hasn't seen Jetstream's schema. These aren't hard problems individually—they're just exhausting in aggregate. OrbitFlare addressed this with three new entry points announced today: a Rust CLI (v0.3), an MCP server exposing 53 discrete tools, and AI agent skills that drop OrbitFlare specifics directly into your editor's context window.
The orbitflare CLI
orbitflare is a single binary installable via cargo install orbitflare. It exposes the full OrbitFlare surface area—RPC queries, WebSocket subscriptions, gRPC streaming, project scaffolding, auth management, plan browsing, on-chain top-ups in USDC, and a TUI dashboard—all from the terminal. Credentials land in your OS keychain automatically, so you don't thread API keys through every command. The rpc subcommand handles one-shot reads: slot height, wallet balances, token lists, transaction lookups, and history with configurable limits. The ws subcommand mirrors Solana's four pubsub subscription types—slot updates, account changes, program logs filtered by mentions, and signature confirmations—all with --json output for piping into downstream tools. Underneath both transports sits the published orbitflare-sdk crate, which brings multi-endpoint failover, exponential retry with jitter, ping/pong liveness checks, automatic reconnection, and URL sanitization that strips API keys from error messages and tracing spans. For heavier workloads, jet and grpc subcommands stream events via Jetstream and Yellowstone gRPC respectively, both accepting YAML configs with ${ENV_VAR} interpolation. The template --install solana-copy-trader command scaffolds example projects wired end-to-end to OrbitFlare's stack. Account management commands (plan list, pay check-balance, pay topup) cover the dashboard tasks developers otherwise handle in a browser.
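A typical session might look like the sketch below. The subcommand names (rpc, ws, jet, plan, pay, template) and the --json and --install flags come from the announcement; the specific arguments and the --config flag are illustrative assumptions, not documented syntax:

```shell
# Install the single binary
cargo install orbitflare

# One-shot RPC reads and a JSON-emitting pubsub stream
# (argument shapes below are assumptions)
orbitflare rpc slot
orbitflare ws logs --json | jq .

# Stream via Jetstream using a YAML config with ${ENV_VAR} interpolation
orbitflare jet --config jetstream.yml

# Dashboard tasks, from the terminal
orbitflare plan list
orbitflare pay check-balance

# Scaffold an end-to-end example project
orbitflare template --install solana-copy-trader
```

Because credentials sit in the OS keychain, none of these commands needs an API key threaded through flags or environment variables.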
AI Agent Skills
@orbitflare/skills targets the hallucination problem head-on. When you're coding in Claude Code, Cursor, or Codex, the model confidently generates Solana-specific code that's wrong roughly a third of the time—missing dataSlice on getProgramAccounts calls, confused proto field names between Jetstream and Yellowstone, invented CLI flags that don't exist. One npx @orbitflare/skills@latest command drops a SKILL.md file and a references/ folder into your agent's skills directory. The contents cover transport selection guidance, the two-key authentication model (query param for chain reads, header for Customer API), getTransactionsForAddress vs. the inefficient getSignaturesForAddress plus fan-out pattern, and exact YAML shapes for orbitflare jet and orbitflare grpc. After restarting your agent session, asking to "stream Pump.fun trades using OrbitFlare Jetstream" produces a correctly configured jetstream.yml with proper account_include fields—the first guess becomes the right one.
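For the Pump.fun example, the generated config would be shaped something like the sketch below. Only account_include and ${ENV_VAR} interpolation are named in the announcement; every other field name here is an assumption, and the installed references/ folder is the authoritative schema:

```yaml
# Illustrative sketch only — field names other than account_include are assumed
endpoint: ${ORBITFLARE_JET_URL}
api_key: ${ORBITFLARE_API_KEY}
transactions:
  pumpfun_trades:
    account_include:
      - <PUMP_FUN_PROGRAM_ID>   # filter to transactions touching this program
```

The point of the skill isn't that this exact file is hard to write—it's that an ungrounded model reliably gets the field names wrong on the first try.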
MCP Server Integration
@orbitflare/mcp exposes 53 OrbitFlare capabilities as discrete MCP tools callable from any MCP host: Claude Desktop, Claude Code, Cursor, Windsurf, Codex CLI, Gemini CLI, or VS Code. Configuration is a standard JSON block pointing to npx @orbitflare/mcp@latest, with environment variables for the API key and network selection. The tool set maps cleanly to single-capability questions. "What's a competitive priority fee right now for an account that touches Raydium?" dispatches to getRecentPrioritizationFees and summarizes P50/P75/P90/P99 percentiles into chat. "Build me a Yellowstone gRPC config that streams Jupiter swap transactions from Frankfurt" calls subscribeTransactions with the appropriate filters. "Quote 1 SOL to USDC with 0.3% slippage" invokes getSwapQuote through Jupiter Metis routing, auto-resolving mint addresses and returning route details plus price impact. All 53 tools are independently callable without an internal action router, and setApiKey swaps either license key at runtime if you prefer not to store credentials in host configs. Documentation pages also surface as MCP resources under orbitflare://docs/* for grounded, quotable answers.
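The JSON block follows the mcpServers shape most MCP hosts share. A sketch, assuming environment variable names (ORBITFLARE_API_KEY, ORBITFLARE_NETWORK) that the announcement describes only in general terms:

```json
{
  "mcpServers": {
    "orbitflare": {
      "command": "npx",
      "args": ["-y", "@orbitflare/mcp@latest"],
      "env": {
        "ORBITFLARE_API_KEY": "<your-key>",
        "ORBITFLARE_NETWORK": "mainnet"
      }
    }
  }
}
```

If you'd rather not store a key in host config at all, leave the env block out and call setApiKey at runtime instead.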
Key Takeaways
- The CLI is the fastest path for shell-based workflows—install via cargo install orbitflare and you're querying in seconds
- Skills eliminate AI hallucination on OrbitFlare specifics by injecting official docs directly into agent context
- MCP server works across Claude Desktop, Cursor, Windsurf, Codex CLI, Gemini CLI, and VS Code with the same JSON configuration shape
- All three entry points share the same SDK underneath: multi-endpoint failover, automatic reconnection, and key sanitization are universal
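That last point is worth a closer look. "Exponential retry with jitter" is a specific technique: each retry doubles a base delay up to a cap, then randomizes within that window so many clients reconnecting at once don't retry in lockstep. A minimal sketch of the technique itself, not the orbitflare-sdk API (the function name and parameters here are ours):

```rust
use std::time::Duration;

/// Exponential backoff with "full jitter": the delay is drawn uniformly
/// from [0, min(cap, base * 2^attempt)]. `rand01` is a caller-supplied
/// random value in [0, 1], kept as a parameter so the sketch is
/// dependency-free and deterministic to test.
fn backoff_with_jitter(attempt: u32, base_ms: u64, cap_ms: u64, rand01: f64) -> Duration {
    let exp = base_ms.saturating_mul(1u64 << attempt.min(16)); // cap the shift to avoid overflow
    let ceiling = exp.min(cap_ms);
    Duration::from_millis((ceiling as f64 * rand01) as u64)
}

fn main() {
    // With rand01 pinned to 1.0 the schedule is plain exponential: 100, 200, 400, 800 ms
    for attempt in 0..4 {
        println!("attempt {attempt}: {:?}", backoff_with_jitter(attempt, 100, 5_000, 1.0));
    }
}
```

Because all three entry points sit on the same SDK, behavior like this (and the failover and key-sanitization logic) doesn't drift between the CLI, the MCP server, and skill-guided code.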
The Bottom Line
OrbitFlare's three-pronged approach acknowledges a real truth about developer tooling in 2026: we live in terminals, IDEs with AI assistants, and chat interfaces simultaneously. Funneling every task through a browser dashboard instead created friction that pulled developers out of their flow state and into tabs and curl commands. The CLI is immediately useful for anyone already living at the command line, but the MCP server and skills are the more interesting long-term bet—they're how you make OrbitFlare "just work" without ever having to leave your editor or explain which proto you're actually using.