#3493 openai.InternalServerError: Relay failed to localhost:11434


500 error: internal API server error. Relay failed to localhost:11434: Traceback (most recent call last):
File "videotrans\util\TestSrtTrans.py", line 19, in run
File "videotrans\translator\__init__.py", line 960, in run
File "videotrans\translator\_base.py", line 105, in run
File "tenacity\__init__.py", line 480, in call
File "videotrans\translator\_localllm.py", line 55, in _item_task
File "openai\_utils\_utils.py", line 286, in wrapper
File "openai\resources\chat\completions\completions.py", line 1192, in create
File "openai\_base_client.py", line 1297, in post
File "openai\_base_client.py", line 1070, in request
openai.InternalServerError: Relay failed to localhost:11434


system:Windows-10-10.0.19045-SP0
version:v3.97
frozen:True
language:zh
root_dir:H:/win-pyvideotrans-v3.97

1 Reply

This is caused by the local Ollama LLM service not being started correctly or not being reachable.

  1. Start the Ollama service: make sure Ollama is running locally. Typically, open a terminal (e.g. CMD or PowerShell), run the command ollama serve, and keep that window open.
  2. Verify the service address: in the pyVideoTrans translation-channel settings, confirm that the "API endpoint" is http://localhost:11434/v1
  3. Check the model status: in another terminal window, run ollama list and confirm that the model you intend to use (e.g. qwen:latest) has been downloaded and appears in the list.
  4. Retest: on the pyVideoTrans translation-channel settings page, click the "Test" button to check whether it can connect to the Ollama service.
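The check in steps 1 and 2 can also be done from a script. The sketch below is a hypothetical helper (not part of pyVideoTrans) that probes Ollama's /api/tags endpoint, which lists installed models on a healthy server; the default port 11434 matches the error message above.

```python
# Minimal reachability check for a local Ollama server (hypothetical helper).
# GET /api/tags returns {"models": [...]} when the service is up.
import json
import urllib.error
import urllib.request


def ollama_reachable(base_url: str = "http://localhost:11434") -> bool:
    """Return True if the Ollama HTTP API answers on base_url."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=3) as resp:
            data = json.load(resp)
            # A healthy server responds with a JSON object holding a model list.
            return isinstance(data.get("models"), list)
    except (urllib.error.URLError, OSError, ValueError):
        # Connection refused / timeout / bad JSON all mean "not usable".
        return False


if __name__ == "__main__":
    if ollama_reachable():
        print("Ollama is up; point pyVideoTrans at http://localhost:11434/v1")
    else:
        print("Ollama is not reachable; run `ollama serve` first")
```

If this prints that Ollama is not reachable, the "Relay failed to localhost:11434" error is expected until the service is started.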

Please refer to the relevant documentation:
