Browser Use API
Cloud browser automation via the Browser Use API. Use when you need AI-driven web browsing, scraping, form filling, or multi-step web tasks without local browser control. Triggers on "browser use", "cloud browser", "scrape website", "automate web task", or when a local browser isn't available/suitable.
Browser Use
Cloud-based AI browser automation. Send a task in plain English, get structured results back.
Quick Start

```
# Submit task
curl -s -X POST https://api.browser-use.com/api/v2/tasks \
  -H "X-Browser-Use-API-Key: $BROWSER_USE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"task": "Go to example.com and extract the main heading"}'
```
```
# Poll for result (replace TASK_ID)
curl -s "https://api.browser-use.com/api/v2/tasks/TASK_ID" \
  -H "X-Browser-Use-API-Key: $BROWSER_USE_API_KEY"
```
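The submit-then-poll flow above can be wrapped in a small loop. A minimal sketch: `fetch_status` is a hypothetical stand-in, stubbed here with canned responses so the control flow is visible without a live task; in practice it would be the status curl call above piped through `jq -r '.status'`.

```shell
#!/bin/sh
# Stub standing in for:
#   curl -s "https://api.browser-use.com/api/v2/tasks/$TASK_ID" \
#     -H "X-Browser-Use-API-Key: $BROWSER_USE_API_KEY" | jq -r '.status'
# Canned sequence: pending -> started -> finished
fetch_status() {
  case "$POLL_COUNT" in
    0) echo "pending" ;;
    1) echo "started" ;;
    *) echo "finished" ;;
  esac
}

POLL_COUNT=0
status="pending"
# Poll until the task reaches a terminal state
while [ "$status" != "finished" ] && [ "$status" != "failed" ]; do
  status=$(fetch_status)
  POLL_COUNT=$((POLL_COUNT + 1))
  # sleep 2   # back off between calls when polling the real API
done
echo "final status: $status"
```

Against the real API you would also bail out after a timeout rather than looping forever.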
Helper Script
Use scripts/browser-use.sh for simpler execution:
```
# Run task and wait for result
./scripts/browser-use.sh "Go to hacker news and get the top 3 stories"
```
```
# Just submit (don't wait)
./scripts/browser-use.sh --no-wait "Search Google for AI news"
```
API Reference

Create Task

POST https://api.browser-use.com/api/v2/tasks
Body:
{ "task": "PlAIn English description of what to do", "llm": "gemini-3-flash-preview" // optional, default is fast 模型 }
Response:
{ "id": "task-uuid", "会话Id": "会话-uuid" }
Get Task Status

GET https://api.browser-use.com/api/v2/tasks/{taskId}
Response fields:
- status: pending | started | finished | failed
- output: result text when finished
- steps: array of actions taken (with screenshots)
- cost: cost in dollars (e.g., "0.02")
- isSuccess: boolean result

Stop Task

POST https://api.browser-use.com/api/v2/tasks/{taskId}/stop
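Reading the status response with `jq` can be sketched like this (canned example built only from the documented fields; a real script would get `resp` from the status curl call):

```shell
#!/bin/sh
# Canned finished-task response using the documented fields
resp='{"status": "finished", "output": "Main heading: Example Domain", "cost": "0.02", "isSuccess": true}'

status=$(printf '%s' "$resp" | jq -r '.status')
if [ "$status" = "finished" ]; then
  # Only read output once the task is in a terminal state
  output=$(printf '%s' "$resp" | jq -r '.output')
  echo "$output"
fi
```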
Pricing
~$0.01-0.05 per task depending on complexity. Check balance:
```
curl -s https://api.browser-use.com/api/v2/credits \
  -H "X-Browser-Use-API-Key: $BROWSER_USE_API_KEY"
```
When to Use

- Complex multi-step web workflows
- Sites that block simple scraping
- Form filling and submissions
- When you need screenshots of steps
- When local browser control isn't available

When NOT to Use

- Simple page fetches (use web_fetch instead)
- When you have local browser access (use browser tools)
- Rapid/high-volume scraping (use Code Use or local scraping)