Machine-Readable by Default
Every data command emits JSON. Pipe output directly into your agent workflows, CI pipelines, or dashboards.
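For example, piping a user command through jq (a sketch only; the exact JSON field names are an assumption, not documented here):
vesai user <useremail> | jq -r '.stories[].title'   # hand the extracted story titles to an agent or CI step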
Run the global quickstart once, initialize each repo with vesai init, then use replay evidence to generate user stories, group stories, and research answers your agents can act on. Every command auto-syncs the CLI to latest main before running.
From machine setup to project intelligence artifacts in four steps.
1. Configure global runtime + render memory budget
2. Create project-scoped .vesai config and workspace
3. Daemon backfills and continuously analyzes new sessions
4. Use user, group, and research commands for decisions
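A sketch of step 4 (only vesai user <useremail> appears in the command listing further down; the group and research argument shapes below are hypothetical):
vesai user <useremail>          # user stories backed by replay evidence
vesai group <group-name>        # hypothetical argument: stories for a user group
vesai research "<question>"     # hypothetical argument: research answer to a product question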
Real commands, real output. Every data command returns structured JSON by default.
Same commands, same outputs whether you're a human or a coding agent.
Session, user, group, and research artifacts persist as git-friendly markdown in .vesai/workspace/ for long-lived agent context.
Global machine config stays in ~/.vesai while project credentials and artifacts stay in each repo's .vesai directory.
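An illustrative layout (the subdirectory names are hypothetical; only ~/.vesai and .vesai/workspace/ come from the description above):
~/.vesai/                          # global machine config (runtime, render memory budget)
<repo>/.vesai/                     # project-scoped credentials and config
<repo>/.vesai/workspace/           # git-friendly markdown artifacts
<repo>/.vesai/workspace/users/     # hypothetical: per-user story markdown
<repo>/.vesai/workspace/groups/    # hypothetical: group story markdown
<repo>/.vesai/workspace/research/  # hypothetical: research answer markdown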
A comprehensive skill file teaches Claude Code, Codex, and other coding agents how to run quickstart/init plus user, group, and research workflows out of the box.
Session recordings & product analytics
Vertex AI, Gemini, Cloud Storage
Browser-based replay rendering
Video & frame processing
Authenticate gcloud:
gcloud auth login
gcloud auth application-default login
gcloud config set project <project-id>
Every command auto-syncs VES AI to latest main before execution.
curl -fsSL https://ves.ai/install | bash
vesai quickstart --max-render-memory-mb 8192
vesai init --lookback-days 180
vesai user <useremail>