
Pywayne Llm Chat Bot — Pywayne工具

v0.1.0

[AI辅助] LLM chat interface using OpenAI-compatible APIs with streaming support and session management. Use when working with pywayne.llm.chat_bot module for creating...

by @wangyendt·MIT-0
License
MIT-0
Last updated
2026/2/26
Security scan
VirusTotal
Harmless
OpenClaw
Safe
high confidence
The SKILL.md is internally consistent with a chat client that talks to OpenAI-compatible endpoints; it does not request unexpected credentials or install code, but you should verify any API base_url you point it at and be cautious about system prompts.
Assessment advice
This SKILL.md reads like legitimate documentation for a client library that connects to OpenAI-compatible endpoints. Before using it: 1) only provide API keys to base_url endpoints you control or trust; verify the upstream package (pywayne.llm.chat_bot) comes from a reputable source because the skill has no homepage or source link; 2) treat dynamic/system prompts as sensitive — don't accept system prompts from untrusted users or remote services, since they can alter model behavior; 3) because th...
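The first recommendation above (only hand API keys to endpoints you control or trust) can be checked mechanically before a key ever leaves the process. A minimal sketch, where the allowlist contents and helper name are illustrative placeholders, not anything the skill defines:

```python
from urllib.parse import urlparse

# Illustrative allowlist: replace with the hosts you actually trust.
TRUSTED_HOSTS = {"api.deepseek.com", "api.openai.com", "localhost"}

def is_trusted_base_url(base_url: str) -> bool:
    """Accept only allowlisted hosts, and require https except for localhost."""
    parsed = urlparse(base_url)
    if parsed.hostname is None or parsed.hostname not in TRUSTED_HOSTS:
        return False
    return parsed.scheme == "https" or parsed.hostname == "localhost"
```

Run such a check on any base_url taken from configuration or user input before instantiating a client with it.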
Detailed analysis
Purpose and capabilities
Name/description (LLM chat interface) matches the instructions: examples show creating LLMChat/ChatManager with base_url, api_key, model, streaming and session management. There are no unrelated required binaries or env vars in metadata.
Instruction scope
Instructions are limited to using the pywayne.llm.chat_bot API and manipulating session history and system prompts; they do not instruct reading local files or unrelated credentials. Minor caveat: the documentation includes examples that set/update system prompts (e.g., "You are now a Python expert"), which can be used to steer model behavior — treat system prompts carefully, especially if sourced from untrusted input.
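The caveat above can be made concrete: if system prompts can arrive from user input or a remote service, gate them against a small reviewed allowlist before applying them. A minimal sketch, where the allowlist contents and helper name are illustrative and not part of pywayne:

```python
# Hypothetical guard: only prompts from a fixed, reviewed set are ever applied.
ALLOWED_SYSTEM_PROMPTS = {
    "You are a helpful assistant",
    "You are now a Python expert, provide code examples",
}

def safe_system_prompt(candidate: str,
                       fallback: str = "You are a helpful assistant") -> str:
    """Return the candidate prompt only if allowlisted; otherwise the fallback."""
    return candidate if candidate in ALLOWED_SYSTEM_PROMPTS else fallback

# chat.update_system_prompt(safe_system_prompt(untrusted_prompt))
```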
Installation mechanism
No install spec and no code files (instruction-only). Nothing will be written to disk by an install step in the skill package itself.
Credential requirements
The skill metadata lists no required environment variables or primary credential, which is consistent with an instruction-only doc. The examples do expect an api_key and base_url to be provided when instantiating classes — this is normal, but the skill does not itself request or declare storage/access for those secrets, so you must supply them at runtime and ensure they go only to trusted endpoints.
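One way to honor that is to keep api_key and base_url out of source code entirely and read them from the environment at runtime. A minimal sketch; the variable names PYWAYNE_BASE_URL and PYWAYNE_API_KEY are arbitrary choices for illustration, not anything the skill declares:

```python
import os

def load_llm_credentials() -> tuple[str, str]:
    """Read base_url and api_key from the environment, failing loudly if missing."""
    base_url = os.environ.get("PYWAYNE_BASE_URL")
    api_key = os.environ.get("PYWAYNE_API_KEY")
    if not base_url or not api_key:
        raise RuntimeError("Set PYWAYNE_BASE_URL and PYWAYNE_API_KEY before starting")
    return base_url, api_key

# base_url, api_key = load_llm_credentials()
# chat = LLMChat(base_url=base_url, api_key=api_key, model="deepseek-chat")
```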
Persistence and permissions
always is false and default invocation settings apply. The skill does not request persistent/privileged platform presence.
Security is layered; review the code before running it.

License

MIT-0

Free to use, modify, and redistribute, with no attribution required.

Runtime dependencies

No special dependencies

Versions

latest · v0.1.0 · 2026/2/16

Initial release of pywayne-llm-chat-bot, a flexible LLM chat interface with session and history management.

- Provides a synchronous chat interface using OpenAI-compatible APIs, with support for streaming responses.
- Supports both single-turn and multi-turn conversations with history tracking.
- Offers customizable configuration via LLMConfig, including dynamic system prompt updates.
- Includes ChatManager for managing multiple independent chat sessions with configurable timeouts.
- Compatible with OpenAI endpoints and local servers (e.g., Ollama).


Install command

Official: npx clawhub@latest install chat-bot
Mirror: npx clawhub@latest install chat-bot --registry https://cn.clawhub-mirror.com

Skill documentation

This module provides a synchronous LLM chat interface compatible with OpenAI APIs (including local servers like Ollama).

Quick Start

from pywayne.llm.chat_bot import LLMChat

# Create chat instance
chat = LLMChat(
    base_url="https://api.example.com/v1",
    api_key="your_api_key",
    model="deepseek-chat"
)

# Single-turn conversation (non-streaming)
response = chat.ask("Hello, LLM!", stream=False)
print(response)

# Streaming response
for token in chat.ask("Explain recursion", stream=True):
    print(token, end='', flush=True)

Multi-turn Conversation

# Use chat() for history tracking
for token in chat.chat("What is a class in Python?"):
    print(token, end='', flush=True)

# Continuation - remembers previous context
for token in chat.chat("How do I define a constructor?"):
    print(token, end='', flush=True)

# View history
for msg in chat.history:
    print(f"{msg['role']}: {msg['content']}")

# Clear history
chat.clear_history()

Configuration

The LLMConfig class

from pywayne.llm.chat_bot import LLMConfig

config = LLMConfig(
    base_url="https://api.example.com/v1",
    api_key="your_api_key",
    model="deepseek-chat",
    temperature=0.7,
    max_tokens=8192,
    top_p=1.0,
    frequency_penalty=0.0,
    presence_penalty=0.0,
    system_prompt="You are a helpful assistant"
)

chat = LLMChat(**config.to_dict())

Dynamic System Prompt Updates

chat.update_system_prompt("You are now a Python expert, provide code examples")

Managing Multiple Sessions

from pywayne.llm.chat_bot import ChatManager

manager = ChatManager(
    base_url="https://api.example.com/v1",
    api_key="your_api_key",
    model="deepseek-chat",
    timeout=300  # Session timeout in seconds
)

# Get or create chat instance (maintains per-session history)
chat1 = manager.get_chat("user1")
chat2 = manager.get_chat("user2")

# Sessions are independent
chat1.chat("Hello from user1")
chat2.chat("Hello from user2")

# Remove a session
manager.remove_chat("user1")

Custom Configuration per Session

custom_config = LLMConfig(
    base_url=base_url,
    api_key=api_key,
    model="deepseek-chat",
    temperature=0.9,
    system_prompt="You are a creative writer"
)

chat3 = manager.get_chat("user3", config=custom_config)

API Reference

LLMChat

ask(prompt, stream=False): Single-turn conversation without history
chat(prompt, stream=True): Multi-turn conversation with history tracking
update_system_prompt(prompt): Update system prompt in-place
clear_history(): Clear conversation history (keeps system prompt)
history (property): Get copy of current conversation history

ChatManager

get_chat(chat_id, stream=True, config=None): Get or create chat instance by ID
remove_chat(chat_id): Remove chat session
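The per-session bookkeeping ChatManager implies (create on first get, drop after an idle timeout) can be sketched independently of pywayne. The class below is an illustration of that pattern under assumed semantics, not the library's actual implementation:

```python
import time

class SessionStore:
    """Toy get-or-create store with idle-timeout eviction, mirroring the
    documented ChatManager surface (get_chat / remove_chat / timeout)."""

    def __init__(self, factory, timeout: float = 300.0, clock=time.monotonic):
        self._factory = factory   # builds a new session object per id
        self._timeout = timeout   # idle seconds before a session is dropped
        self._clock = clock       # injectable for testing
        self._sessions: dict[str, tuple[object, float]] = {}

    def get(self, session_id: str):
        now = self._clock()
        entry = self._sessions.get(session_id)
        if entry is not None and now - entry[1] <= self._timeout:
            session = entry[0]
        else:
            session = self._factory()  # missing or expired: create fresh
        self._sessions[session_id] = (session, now)  # refresh last-use time
        return session

    def remove(self, session_id: str) -> None:
        self._sessions.pop(session_id, None)
```

ChatManager presumably also threads the shared base_url, api_key, and model into each new LLMChat; the sketch leaves that to the factory callable.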

Parameters

base_url (required): API base URL (e.g., https://api.deepseek.com/v1)
api_key (required): API authentication key
model (default "deepseek-chat"): Model name
temperature (default 0.7): Controls randomness (0-2)
max_tokens (default 2048/8192): Maximum output tokens
top_p (default 1.0): Nucleus sampling (0-1)
frequency_penalty (default 0.0): Reduces repetition (-2 to 2)
presence_penalty (default 0.0): Encourages new topics (-2 to 2)
system_prompt (default "你是一个严谨的助手"): System message
timeout (default inf): Session timeout in seconds (ChatManager only)
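The documented ranges can be enforced before a config is constructed. The validator below simply restates the table; the bounds come from it, but the helper itself is not part of pywayne:

```python
def validate_sampling_params(temperature: float = 0.7, top_p: float = 1.0,
                             frequency_penalty: float = 0.0,
                             presence_penalty: float = 0.0) -> None:
    """Raise ValueError if any parameter is outside its documented range."""
    bounds = {
        "temperature": (temperature, 0.0, 2.0),
        "top_p": (top_p, 0.0, 1.0),
        "frequency_penalty": (frequency_penalty, -2.0, 2.0),
        "presence_penalty": (presence_penalty, -2.0, 2.0),
    }
    for name, (value, lo, hi) in bounds.items():
        if not lo <= value <= hi:
            raise ValueError(f"{name}={value} outside [{lo}, {hi}]")
```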
Data source: ClawHub · Chinese localization: 龙虾技能库