# Quickstart
## LLM Provider

Configure which models to use and their API keys through any of:

- Environment variables
- A `.env` file
- The command line

The engine detects the provider from the model name's prefix (e.g. `openai:` or `google_genai:`) and uses the appropriate API key.
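As a sketch, the three configuration routes above might look like this (the variable names, CLI subcommand, and flag are assumptions, not taken from this page):

```shell
# 1. Environment variable (key names follow the provider SDKs' conventions):
export OPENAI_API_KEY="sk-..."

# 2. A .env file in the working directory, using the same keys:
cat > .env <<'EOF'
OPENAI_API_KEY=sk-...
GEMINI_API_KEY=...
EOF

# 3. Command line (subcommand and flag names here are hypothetical):
blast serve --llm-model "openai:gpt-4o-mini"
```

Whichever route you use, the `openai:` prefix in the model name is what selects the provider; the matching API key is then looked up from the configured source.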
## Secrets
## Browser

The `local_browser_path` setting accepts these values:

- `"none"` (default) - Use the system-installed browser
- `"auto"` - Auto-detect the browser location
- `"/path/to/browser"` - Path to a specific browser binary
## Caching
## Logging

Control logging behavior and verbosity.

When `logs_dir` is set (default: `"blast-logs/"`):

- All logs go to files based on their levels
- Only engine metrics are shown in the terminal
- Log file paths are shown in endpoint messages

When `logs_dir` is null:

- All logs go to the terminal based on their levels
- Engine metrics are not shown

The available log levels are:

- `"debug"` - Detailed debugging information
- `"info"` - General information
- `"warning"` - Warning messages
- `"error"` - Error messages only
- `"critical"` - Critical errors only
## Next Steps
- Configure Constraints for resource management
- Learn about automatic Parallelism
- Explore Caching