OpenRouter is a unified AI model access platform that lets users reach multiple top-tier AI models, including Horizon Alpha, through a single interface.
```python
import requests

API_KEY = "your_openrouter_api_key"
MODEL = "openai/horizon-alpha"

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "HTTP-Referer": "https://your-site.com",
        "X-Title": "Your App Name"
    },
    json={
        "model": MODEL,
        "messages": [
            {"role": "user", "content": "Hello, Horizon Alpha!"}
        ]
    }
)
print(response.json())
```
```python
# Temperature range: 0.1-2.0
# 0.1 = more conservative, more consistent
# 0.7 = balanced (recommended)
# 2.0 = more creative, more random
json={
    "model": MODEL,
    "messages": messages,
    "temperature": 0.7,
    "max_tokens": 1000
}
```
```python
# Enable streaming in the request payload
# (also pass stream=True to requests.post so the HTTP body is streamed)
json={
    "model": MODEL,
    "messages": messages,
    "stream": True  # Python's True, not JSON's lowercase true
}

# Process the streamed response line by line
for chunk in response.iter_lines():
    if chunk:
        print(chunk.decode('utf-8'))
```
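The raw lines printed above arrive as server-sent events in the OpenAI-style format (`data: {json}` per line, ending with `data: [DONE]`). A minimal sketch of extracting just the text deltas, assuming that format, could look like:

```python
import json
from typing import Optional

def extract_delta(raw_line: bytes) -> Optional[str]:
    """Parse one SSE line from a streaming chat response and
    return the text delta, or None for non-content lines."""
    line = raw_line.decode("utf-8").strip()
    if not line.startswith("data: "):
        return None  # skip keep-alive or empty lines
    payload = line[len("data: "):]
    if payload == "[DONE]":
        return None  # end-of-stream marker
    chunk = json.loads(payload)
    return chunk["choices"][0]["delta"].get("content")

# Example with a hand-written sample line:
sample = b'data: {"choices": [{"delta": {"content": "Hi"}}]}'
print(extract_delta(sample))  # Hi
```

In the loop above, each `chunk` from `response.iter_lines()` would be fed through `extract_delta` and non-`None` results appended to the output.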
```javascript
const response = await fetch(
  'https://openrouter.ai/api/v1/chat/completions',
  {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${API_KEY}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      model: 'openai/horizon-alpha',
      messages: [
        {role: 'user', content: 'Hello!'}
      ]
    })
  }
);
```
```bash
curl -X POST https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer sk-or-xxxxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/horizon-alpha",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
```javascript
const OpenAI = require('openai');

const openai = new OpenAI({
  apiKey: 'sk-or-xxxxxx',
  baseURL: 'https://openrouter.ai/api/v1'
});

const completion = await openai.chat.completions.create({
  model: 'openai/horizon-alpha',
  messages: [{role: 'user', content: 'Hello!'}]
});
```
| Billing item | Price | Notes |
|---|---|---|
| Input tokens | $0.002/1K | Billed per input token |
| Output tokens | $0.006/1K | Billed per output token |
| Batch processing | 15% discount | Discount applied to batched requests |
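A rough per-request cost can be estimated from the rates in the table above. This is only a sketch: actual billing depends on how the model tokenizes your text and on current pricing.

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  batch: bool = False) -> float:
    """Estimate request cost in USD from the rate table above."""
    cost = input_tokens / 1000 * 0.002 + output_tokens / 1000 * 0.006
    if batch:
        cost *= 0.85  # apply the 15% batch-processing discount
    return cost

# A request with 1,500 input tokens and 500 output tokens:
print(round(estimate_cost(1500, 500), 6))              # standard rate
print(round(estimate_cost(1500, 500, batch=True), 6))  # batched rate
```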
- Choose the Horizon Alpha variant that matches each task's complexity, balancing performance against cost.
- Merge many small requests into batch jobs to earn the 15% discount.
- Cache responses for repeated content to cut API calls and reduce spend.
- Review usage reports regularly to optimize usage patterns and stay within budget.
- Call outside peak hours to benefit from off-peak pricing.
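The caching tip above can be sketched with a simple in-memory cache keyed on a hash of the message list. This is a minimal illustration; `call_api` here is a hypothetical stand-in for the real request function, and a production cache would also need eviction and expiry.

```python
import hashlib
import json

_cache = {}

def cached_call(messages: list, call_api) -> dict:
    """Return a cached response for identical message lists,
    invoking the API (via call_api) only on a cache miss."""
    key = hashlib.sha256(
        json.dumps(messages, sort_keys=True).encode()
    ).hexdigest()
    if key not in _cache:
        _cache[key] = call_api(messages)
    return _cache[key]

# Usage: repeated identical prompts hit the cache, not the API
calls = []
def fake_api(msgs):
    calls.append(1)  # count real API invocations
    return {"reply": "ok"}

msgs = [{"role": "user", "content": "Hello"}]
cached_call(msgs, fake_api)
cached_call(msgs, fake_api)
print(len(calls))  # 1 -> only one real API call was made
```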
```python
import requests
import time
from typing import Optional

class HorizonAlphaClient:
    def __init__(self, api_key: str):
        self.api_key = api_key
        self.base_url = "https://openrouter.ai/api/v1"

    def call_model(self, messages: list, max_retries: int = 3) -> Optional[dict]:
        """Call the Horizon Alpha model with error handling and retries."""
        for attempt in range(max_retries):
            try:
                response = requests.post(
                    f"{self.base_url}/chat/completions",
                    headers={
                        "Authorization": f"Bearer {self.api_key}",
                        "Content-Type": "application/json"
                    },
                    json={
                        "model": "openai/horizon-alpha",
                        "messages": messages,
                        "temperature": 0.7
                    },
                    timeout=30
                )
                response.raise_for_status()
                return response.json()
            except requests.exceptions.RequestException as e:
                print(f"Request failed (attempt {attempt + 1}/{max_retries}): {e}")
                if attempt < max_retries - 1:
                    wait_time = 2 ** attempt  # exponential backoff
                    print(f"Retrying in {wait_time} seconds...")
                    time.sleep(wait_time)
                else:
                    print("Maximum retries reached; request failed")
                    return None
        return None

# Usage example
client = HorizonAlphaClient("your-api-key")
messages = [
    {"role": "user", "content": "Hello, Horizon Alpha!"}
]
result = client.call_model(messages)
print(result)
```
OpenRouter provides a unified access interface and more flexible pricing, making Horizon Alpha easier to adopt.
OpenRouter applies enterprise-grade encryption and privacy safeguards to keep user data secure.
API calls are supported from Python, JavaScript, cURL, and other languages and tools.
Sign up on OpenRouter today to experience what Horizon Alpha can do.