ProxyCast provides full compatibility with the OpenAI Chat Completions API.
```http
POST /v1/chat/completions
Content-Type: application/json
Authorization: Bearer your-api-key
```

```json
{
  "model": "claude-sonnet-4-20250514",
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
  ],
  "temperature": 0.7,
  "max_tokens": 1024,
  "stream": false
}
```
| Parameter | Type | Required | Description |
|---|---|---|---|
| model | string | ✅ | Model name |
| messages | array | ✅ | List of messages |
| temperature | number | ❌ | Sampling temperature (0-2) |
| max_tokens | integer | ❌ | Maximum output tokens |
| stream | boolean | ❌ | Whether to stream the response |
| top_p | number | ❌ | Nucleus sampling parameter |
| presence_penalty | number | ❌ | Presence penalty |
| frequency_penalty | number | ❌ | Frequency penalty |
| stop | array | ❌ | Stop sequences |
| tools | array | ❌ | Tool definitions |
| tool_choice | string/object | ❌ | Tool selection strategy |
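As a sketch of how the optional parameters combine, the Python snippet below sets a few of them on a single request. It assumes ProxyCast is listening on `http://127.0.0.1:9090` with key `your-api-key`, as in the SDK examples later on this page; the prompt and parameter values are illustrative.

```python
import openai

# Assumed endpoint and key, matching the SDK examples later on this page.
client = openai.OpenAI(
    base_url="http://127.0.0.1:9090/v1",
    api_key="your-api-key",
)

# Optional sampling parameters from the table above; the values are illustrative.
response = client.chat.completions.create(
    model="claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Summarize what a reverse proxy does."}],
    temperature=0.7,
    max_tokens=256,
    top_p=0.9,
    stop=["\n\n"],
)
print(response.choices[0].message.content)
```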
Each element of `messages` is an object with a `role` and `content`:

```json
{
  "role": "user",
  "content": "Hello!"
}
```

Supported roles: `system`, `user`, `assistant`, `tool`
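To illustrate how these roles fit together in a multi-turn exchange, here is a minimal sketch using the Python client (endpoint, key, and prompts are placeholders; the `tool` role appears in the tool-calling example further down):

```python
import openai

client = openai.OpenAI(base_url="http://127.0.0.1:9090/v1", api_key="your-api-key")

# system sets behaviour, user asks, assistant replies; keep the history in order.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Name a primary color."},
]
reply = client.chat.completions.create(
    model="claude-sonnet-4-20250514", messages=messages
)

# Append the assistant's turn before asking a follow-up question.
messages.append({"role": "assistant", "content": reply.choices[0].message.content})
messages.append({"role": "user", "content": "Now translate it into French."})
followup = client.chat.completions.create(
    model="claude-sonnet-4-20250514", messages=messages
)
print(followup.choices[0].message.content)
```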
Example response:

```json
{
  "id": "chatcmpl-xxx",
  "object": "chat.completion",
  "created": 1234567890,
  "model": "claude-sonnet-4-20250514",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 10,
    "completion_tokens": 20,
    "total_tokens": 30
  }
}
```
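With the Python SDK these fields map directly onto response attributes; a minimal sketch, using the same assumed endpoint and key as above:

```python
import openai

client = openai.OpenAI(base_url="http://127.0.0.1:9090/v1", api_key="your-api-key")

response = client.chat.completions.create(
    model="claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello!"}],
)
choice = response.choices[0]
print(choice.message.content)       # assistant reply text
print(choice.finish_reason)         # e.g. "stop"
print(response.usage.total_tokens)  # prompt_tokens + completion_tokens
```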
Set `stream: true` to enable streaming responses:

```bash
curl http://127.0.0.1:9090/v1/chat/completions \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'
```
Response format (SSE):

```
data: {"id":"chatcmpl-xxx","choices":[{"delta":{"content":"Hello"}}]}
data: {"id":"chatcmpl-xxx","choices":[{"delta":{"content":"!"}}]}
data: [DONE]
```
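The same stream can be consumed through the OpenAI Python SDK, which parses the SSE frames into chunk objects; a minimal sketch with the same assumed endpoint and key:

```python
import openai

client = openai.OpenAI(base_url="http://127.0.0.1:9090/v1", api_key="your-api-key")

# Each chunk's delta carries an incremental piece of the reply,
# mirroring the `data:` frames shown above.
stream = client.chat.completions.create(
    model="claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta
    if delta.content:
        print(delta.content, end="", flush=True)
print()
```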
List available models:

```http
GET /v1/models
Authorization: Bearer your-api-key
```

```json
{
  "object": "list",
  "data": [
    {
      "id": "claude-sonnet-4-20250514",
      "object": "model",
      "created": 1234567890,
      "owned_by": "anthropic"
    },
    {
      "id": "gemini-2.0-flash",
      "object": "model",
      "created": 1234567890,
      "owned_by": "google"
    }
  ]
}
```
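From the Python SDK this endpoint is exposed as `client.models.list()`; a short sketch, same assumed endpoint and key as above:

```python
import openai

client = openai.OpenAI(base_url="http://127.0.0.1:9090/v1", api_key="your-api-key")

# Each id returned here can be passed as `model` in /v1/chat/completions requests.
for model in client.models.list():
    print(model.id, model.owned_by)
```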
Tool calling follows the OpenAI tools format. Example request with a tool definition:

```json
{
  "model": "claude-sonnet-4-20250514",
  "messages": [{"role": "user", "content": "What's the weather in Tokyo?"}],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get weather information",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {"type": "string"}
          },
          "required": ["location"]
        }
      }
    }
  ]
}
```
Example response containing a tool call:

```json
{
  "choices": [
    {
      "message": {
        "role": "assistant",
        "tool_calls": [
          {
            "id": "call_xxx",
            "type": "function",
            "function": {
              "name": "get_weather",
              "arguments": "{\"location\":\"Tokyo\"}"
            }
          }
        ]
      }
    }
  ]
}
```
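A full round trip then returns the tool result with the `tool` role and asks the model for a final answer. The sketch below reuses the `get_weather` definition from the request above; the local weather lookup is a stand-in, and the endpoint and key are the same assumptions as earlier:

```python
import json
import openai

client = openai.OpenAI(base_url="http://127.0.0.1:9090/v1", api_key="your-api-key")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get weather information",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}]

messages = [{"role": "user", "content": "What's the weather in Tokyo?"}]
first = client.chat.completions.create(
    model="claude-sonnet-4-20250514", messages=messages, tools=tools
)
call = first.choices[0].message.tool_calls[0]
args = json.loads(call.function.arguments)  # {"location": "Tokyo"}

# Stand-in for a real weather lookup; replace with your own implementation.
result = {"location": args["location"], "forecast": "sunny"}

# Echo the assistant turn, attach the tool result, then request the final answer.
messages.append(first.choices[0].message)
messages.append({"role": "tool", "tool_call_id": call.id, "content": json.dumps(result)})
final = client.chat.completions.create(
    model="claude-sonnet-4-20250514", messages=messages, tools=tools
)
print(final.choices[0].message.content)
```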
The official OpenAI SDKs work as-is by pointing `base_url` at ProxyCast.

Python:

```python
import openai

client = openai.OpenAI(
    base_url="http://127.0.0.1:9090/v1",
    api_key="your-api-key"
)

response = client.chat.completions.create(
    model="claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)
```
Node.js:

```javascript
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'http://127.0.0.1:9090/v1',
  apiKey: 'your-api-key'
});

const response = await client.chat.completions.create({
  model: 'claude-sonnet-4-20250514',
  messages: [{ role: 'user', content: 'Hello!' }]
});
console.log(response.choices[0].message.content);
```