
Xiaoya Auto Doing — 小雅自动执行

v1.0.0

An automatic task-execution tool for Xiaoya (小雅).

by @dongxingao·MIT-0
License: MIT-0
Last updated: 2026/3/22
Security scans:
  • VirusTotal: Harmless (report available)
  • OpenClaw: Suspicious (medium confidence)
The skill mostly does what it says (automates login and page capture for whut.ai-augmented.com), but there are packaging and declaration inconsistencies (undeclared CLI dependency, missing wrapper file reference, and unclear credential declarations) that merit caution before installation.
Assessment
This skill appears to implement browser-driven login and page capture for whut.ai-augmented.com, but there are a few red flags to consider before installing or using it: - The script calls the external 'agent-browser' CLI, but the skill metadata does not list that binary as required. Verify you have a trusted agent-browser binary installed and understand its permissions. - SKILL.md mentions a 'scripts/whut-open' wrapper that is not present in the provided file manifest. Confirm which file you...
Detailed analysis
Purpose and capabilities
The Python script's behavior matches the stated purpose (open pages, dismiss popups, fill credentials, capture page text). However the skill metadata declares no required binaries while the script calls an external CLI 'agent-browser' — that required binary is not listed. The SKILL.md also references a convenience wrapper 'scripts/whut-open' that is not present in the file manifest. These packaging/declaration mismatches are inconsistent with the stated purpose.
Instruction scope
Runtime instructions and the script limit their actions to driving agent-browser, reading credentials from environment or a local secret file, and writing a JSON page dump. The script does not send captured data to remote endpoints itself. This scope is consistent with automating browser login and capture, but it will collect full page text (potentially including sensitive info) and save it to latest_page_dump.json — users should expect capture/exfiltration of page content to local disk.
Installation mechanism
There is no install spec (instruction-only plus one script file). Nothing is downloaded or extracted during install. This is lower risk, but the script requires the agent-browser CLI at runtime which the package does not declare.
Credential requirements
The script legitimately needs WHUT_USERNAME/WHUT_PASSWORD or a path to a local secret file. Those credential sources are described in SKILL.md and used by the code, but the skill metadata does not declare any required environment variables. Also the skill suggests storing secrets in a file under the skill folder (./local/whut_ai_secret.json) which is a poor practice — packaging or placing credentials inside a skill folder increases risk of accidental disclosure.
Persistence and permissions
The skill is not always-enabled and does not request elevated platform privileges. It writes runtime output and may create/read a local secret file within its own folder, but it does not modify other skills or global configuration.
Security comes in layers; review the code before running it.

License

MIT-0

Free to use, modify, and redistribute; no attribution required.

Runtime dependencies

No special dependencies

Versions

latest · v1.0.0 · 2026/3/21

Initial release of auto-xiaoya-doing skill.

  • Automates login and page capture for WHUT AI Augmented sites using agent-browser.
  • Supports credential input via environment variables or local secret files.
  • Provides scripts for opening target pages, bypassing popups, and capturing page questions/text.
  • Captured page data is saved to latest_page_dump.json for further analysis.
  • Includes documentation and workflow guidelines for effective use and automation.

Author's notes: The first version probably has plenty of bugs. Some models have ethical restrictions and refuse to help; I don't yet know how to work around that. The current version requires you to send it the full URL of the assignment page; it then logs in through agent-browser and completes the task, which is a bit cumbersome. A fully automated skill is in the works.


Install command

Official: npx clawhub@latest install ya
Mirror (CN): npx clawhub@latest install ya --registry https://cn.clawhub-mirror.com

Skill documentation

Use this skill to open WHUT AI Augmented pages in an authenticated browser session and dump the current page text for downstream analysis.

Workflow

  • Ensure credentials are available before running the script.
  • Run scripts/whut-open "URL" for a target page, or run without arguments to open the site root.
  • Read references/workflow.md for operating conventions if follow-up browser automation is needed.
  • Read latest_page_dump.json after execution to inspect captured page text and extracted questions.
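
The last step above, inspecting latest_page_dump.json, can be sketched in Python. The skill does not document the dump's JSON schema, so the field names used here (url, page_text, questions) are illustrative assumptions:

```python
import json
from pathlib import Path

def inspect_dump(path="latest_page_dump.json"):
    """Load the capture file and summarize what was captured.

    Assumes a hypothetical schema with "url", "page_text", and
    "questions" keys; adjust to whatever the real dump contains.
    """
    data = json.loads(Path(path).read_text(encoding="utf-8"))
    return {
        "url": data.get("url"),
        "chars": len(data.get("page_text", "")),
        "questions": data.get("questions", []),
    }
```

Because the dump may contain full page text, including sensitive content, treat the file as local runtime output and delete it once analysis is done.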

Credential sources

The login script supports these credential sources, checked in this order:

  • WHUT_USERNAME and WHUT_PASSWORD environment variables
  • WHUT_SECRET_PATH environment variable pointing to a JSON file with username and password
  • ./local/whut_ai_secret.json inside the skill folder

Do not package real credentials with the skill.
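
The documented three-source lookup order can be sketched as follows. This is a reconstruction of the described behavior, not the skill's actual auto_login.py code, and the secret-file keys ("username", "password") are assumed from the description above:

```python
import json
import os
from pathlib import Path

def resolve_credentials(skill_dir="."):
    """Return (username, password) using the documented lookup order."""
    # 1. Direct environment variables
    user, pwd = os.getenv("WHUT_USERNAME"), os.getenv("WHUT_PASSWORD")
    if user and pwd:
        return user, pwd
    # 2. WHUT_SECRET_PATH pointing at a JSON secret file
    secret_path = os.getenv("WHUT_SECRET_PATH")
    # 3. Fallback: ./local/whut_ai_secret.json inside the skill folder
    if not secret_path:
        secret_path = Path(skill_dir) / "local" / "whut_ai_secret.json"
    data = json.loads(Path(secret_path).read_text(encoding="utf-8"))
    return data["username"], data["password"]
```

Note that option 3 keeps secrets inside the skill folder, which the assessment above flags as poor practice; prefer the environment-variable sources.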

Bundled files

  • scripts/auto_login.py: main automation logic
  • scripts/whut-open: convenience wrapper
  • references/workflow.md: usage conventions and follow-up operating notes

Notes

  • Keep secrets in local/whut_ai_secret.json or environment variables.
  • Treat latest_page_dump.json as runtime output, not as reference content to distribute.
  • If refs become stale during browser automation, take a fresh snapshot before continuing.
Data source: ClawHub · Chinese localization: 龙虾技能库