Set your AI provider API key. Anthropic is the default:
```shell
export ANTHROPIC_API_KEY="sk-ant-..."
```
Refrain supports seven AI providers: Anthropic (the default), OpenAI, Google, Azure, Bedrock, Vertex, and OpenAI-compatible endpoints. See Set up your AI provider for full setup instructions and environment variables.
Before generating a runbook, you can optionally create:

`context.md` — supplementary information about the target app (login URL, navigation hints, special UI patterns, etc.):
```markdown
# App context
- Login page: https://app.example.com/login
- After login, the dashboard is at /dashboard
- The "Export CSV" button is in the top-right toolbar
```
`secrets.json` — credentials and sensitive values, when needed. All values are treated as sensitive:
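A minimal sketch of what `secrets.json` might contain. The key names here are hypothetical placeholders; use whichever names your runbook steps reference:

```json
{
  "username": "qa-user@example.com",
  "password": "example-password"
}
```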
Tell the AI what you want to accomplish, and it will explore your web app and build a runbook.
```shell
npx @refrainai/cli generate -- \
  --url "https://app.example.com/login" \
  --goal "Log in and navigate to the dashboard" \
  --output ./login-flow.yaml \
  --context ./context.md \
  --secrets ./secrets.json \
  --headless false \
  --model "claude-sonnet-4-6" \
  --model-provider anthropic
```
Use `--headless false` to watch the AI explore your app in real time. Remove it for headless execution.

You can set a default model and provider via environment variables instead of passing `--model` and `--model-provider` on every command. See default model configuration.
The `execute` command resolves selectors deterministically — no AI tokens are consumed on reruns (except when fallback features are triggered). Whether you run it once or a hundred times, the token cost stays at zero.
1. Open the browser and navigate to the starting URL
2. For each step, resolve the selector to find the target element
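The deterministic replay loop above can be sketched conceptually as follows. This is an illustration, not Refrain's actual implementation; the `Step` type, `resolve` helper, and the dict-based DOM stand-in are all hypothetical:

```python
# Conceptual sketch of deterministic runbook replay.
# Hypothetical types -- not Refrain's actual implementation.
from dataclasses import dataclass

@dataclass
class Step:
    action: str    # e.g. "fill", "click"
    selector: str  # selector recorded in the runbook YAML

def resolve(dom: dict, selector: str):
    """Deterministic lookup: the selector either matches or it doesn't.
    No AI call happens here, which is why replay consumes zero tokens."""
    return dom.get(selector)

def execute(runbook: list, dom: dict) -> list:
    performed = []
    for step in runbook:
        element = resolve(dom, step.selector)
        if element is None:
            raise LookupError(f"selector not found: {step.selector}")
        performed.append(f"{step.action} -> {element}")
    return performed

# Toy "DOM" mapping selectors to matched elements
dom = {"#email": "input#email", "#submit": "button#submit"}
runbook = [Step("fill", "#email"), Step("click", "#submit")]
print(execute(runbook, dom))  # ['fill -> input#email', 'click -> button#submit']
```

Because every step is a plain lookup, a rerun either succeeds identically or fails loudly at the first unresolvable selector — which is exactly when the fallback features mentioned above would kick in.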
Self-heal mode enables aggressive retry strategies, AI-powered repair suggestions, and a diagnostic report. Review the suggestions, then apply them with the `fix-runbook` command:
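Conceptually, a self-heal pass falls back from strict resolution to more aggressive strategies and records a repair suggestion for each selector it recovers. The sketch below is hypothetical — the real AI-powered repair is mocked here as a case-insensitive substring match, and the report format is invented for illustration:

```python
# Conceptual sketch of a self-heal pass (hypothetical API; Refrain's
# actual retry strategies and report format may differ).

def resolve_strict(dom: dict, selector: str):
    return dom.get(selector)

def resolve_fuzzy(dom: dict, selector: str):
    # Aggressive retry: case-insensitive substring match stands in for
    # the real AI-powered repair logic.
    for key, element in dom.items():
        if selector.lower() in key.lower():
            return element
    return None

def self_heal(dom: dict, selectors: list) -> list:
    report = []  # diagnostic report of repair suggestions
    for sel in selectors:
        if resolve_strict(dom, sel) is None:
            healed = resolve_fuzzy(dom, sel)
            if healed is not None:
                report.append({"selector": sel, "suggestion": healed})
    return report

# A selector whose casing changed in the app breaks strict resolution
# but is recovered by the fallback strategy.
dom = {"#LoginButton": "button#LoginButton"}
print(self_heal(dom, ["#loginbutton"]))
```

The diagnostic report is the key output: rather than silently patching the runbook, the suggestions are surfaced for review before being applied.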