#4393 TaskCfgSTS(is_cuda=False, uuid='39bed11d8d', cache_folder=None, target_dir='


Error in the subtitle translation stage [Compatible AI/Local model] Connection error.
Traceback (most recent call last):
File "httpx\_transports\default.py", line 101, in map_httpcore_exceptions
File "httpx\_transports\default.py", line 250, in handle_request
File "httpcore\_sync\connection_pool.py", line 216, in handle_request
File "httpcore\_sync\connection_pool.py", line 196, in handle_request
File "httpcore\_sync\http_proxy.py", line 207, in handle_request
File "httpcore\_sync\connection.py", line 99, in handle_request
File "httpcore\_sync\connection.py", line 76, in handle_request
File "httpcore\_sync\connection.py", line 122, in _connect
File "httpcore\_backends\sync.py", line 205, in connect_tcp
File "contextlib.py", line 153, in __exit__
File "httpcore\_exceptions.py", line 14, in map_exceptions
httpcore.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "openai\_base_client.py", line 1005, in request
File "httpx\_client.py", line 914, in send
File "httpx\_client.py", line 942, in _send_handling_auth
File "httpx\_client.py", line 979, in _send_handling_redirects
File "httpx\_client.py", line 1014, in _send_single_request
File "httpx\_transports\default.py", line 249, in handle_request
File "contextlib.py", line 153, in __exit__
File "httpx\_transports\default.py", line 118, in map_httpcore_exceptions
httpx.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "videotrans\task\job.py", line 173, in run
File "videotrans\task\_translate_srt.py", line 44, in trans
File "videotrans\translator\__init__.py", line 1000, in run
File "videotrans\translator\_base.py", line 100, in run
File "tenacity\__init__.py", line 480, in __call__
File "videotrans\translator\_localllm.py", line 49, in _item_task
File "openai\_utils\_utils.py", line 286, in wrapper
File "openai\resources\chat\completions\completions.py", line 1211, in create
File "openai\_base_client.py", line 1297, in post
File "openai\_base_client.py", line 1037, in request
openai.APIConnectionError: Connection error.
TaskCfgSTS(is_cuda=False, uuid='39bed11d8d', cache_folder=None, target_dir='

7 Replies

This failure occurs when connecting to the configured local LLM API service. The usual causes are an incorrect API address, or the corresponding local service not being started.

Solution

  1. Open the software menu → Tools/Options → Translation Settings → Local LLM Settings, and check that the API address entered is a correct, reachable address (e.g. http://127.0.0.1:11434/v1); confirm the port number matches the actual service.
  2. Make sure your local LLM service (e.g. Ollama, vLLM) is actually running and that the address can be reached from a browser or with curl.
  3. Check whether the local firewall or security software is blocking the application's network access; try disabling it temporarily and test again.
  4. If you use a network proxy, confirm the proxy settings do not interfere with local addresses (127.0.0.1 / localhost); if necessary, leave the "Network proxy address" on the main screen empty or exclude local addresses.
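The address/port check above can be sketched as a quick Python probe. The URL below is only an example (Ollama's default port 11434); substitute whatever you entered in the Local LLM settings:

```python
# Quick reachability probe for the address configured in "Local LLM Settings".
# The URL is an example (Ollama's default); replace it with your own.
import socket
from urllib.parse import urlparse

def probe(api_base: str, timeout: float = 3.0) -> bool:
    """Return True if something is listening at the host:port of api_base."""
    parsed = urlparse(api_base)
    host = parsed.hostname or "127.0.0.1"
    port = parsed.port or (443 if parsed.scheme == "https" else 80)
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # ConnectionRefusedError ([WinError 10061] on Windows) lands here:
        # nothing is listening on that port.
        return False

if __name__ == "__main__":
    api_base = "http://127.0.0.1:11434/v1"  # example address
    print("reachable" if probe(api_base) else "refused - is the service running?")
```

If this prints "refused", no curl or settings change in the software will help until the local service is actually started on that port.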

Please refer to the relevant documentation.

Still not working, version 3.99

The error is still "connection actively refused by the target machine", meaning the software still cannot reach the local LLM address you entered. Please troubleshoot further with the following steps:

  1. Confirm the service is really running and the address is exactly right
    Open a command-line terminal and run curl http://127.0.0.1:11434/v1/models (this assumes Ollama on port 11434; substitute your actual address for other services).

    • If it returns an error, the service is not running or the port is wrong; restart your local LLM service and double-check the port number.
    • If it responds normally (e.g. lists the available models), continue to the next step.
  2. Check the API address format in the Local LLM settings
    In the software menu Tools/Options → Translation Settings → Local LLM Settings, make sure the address contains only protocol + IP + port + /v1 (e.g. http://127.0.0.1:11434/v1), does not end with a slash, and has no extra path such as /chat/completions.
    If the address is already correct, try alternative spellings such as http://127.0.0.1:11434/v1/ (trailing slash) or http://localhost:11434/v1 and test again.
  3. Confirm the model name is correct
    In the same settings screen, the translation model field must contain a model name that actually exists on the local service (e.g. qwen:latest or llama3). You can run curl http://127.0.0.1:11434/v1/models in a terminal to list the available models, then copy the name exactly.
  4. Check whether an API Key is required
    Even for a local LLM, some services (such as vLLM) may require an arbitrary string in the request header. In Local LLM Settings, fill the Api Key field with any string (e.g. local); do not leave it empty.
  5. Disable any conflicting proxy
    Check whether the "Network proxy address" on the main screen is filled in. A configured proxy can cause even local 127.0.0.1 requests to be forwarded to the proxy server and refused. Clear that proxy address completely and retry.
  6. Confirm the V3.99 installation is complete
    In the software directory, open the logs/ folder, open today's newest .log file, and search for localllm to see whether there is a more detailed error. If so, copy roughly the last 30 lines of the log to help narrow this down further.

If the same error persists after all of the steps above, the local service has a deeper problem; try rebooting the machine, checking the firewall, or temporarily switching to another translation channel (such as the free Edge-TTS or the OpenAI API) as a workaround.
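Steps 1 and 3 above (query the /v1/models endpoint and copy the exact model name) can also be done with a short Python script instead of curl. The base URL is an example and should match your own settings:

```python
# List the exact model names the local OpenAI-compatible service exposes,
# so the name can be copied verbatim into the translation-model field.
# The base URL is an example (Ollama's default); substitute your own.
import json
import urllib.request

def list_models(api_base: str, timeout: float = 5.0) -> list[str]:
    """Return the model ids served at {api_base}/models."""
    with urllib.request.urlopen(f"{api_base.rstrip('/')}/models", timeout=timeout) as resp:
        payload = json.load(resp)
    # OpenAI-compatible servers return {"object": "list", "data": [{"id": ...}, ...]}
    return [item["id"] for item in payload.get("data", [])]

if __name__ == "__main__":
    try:
        for name in list_models("http://127.0.0.1:11434/v1"):
            print(name)  # copy one of these exactly into the model-name field
    except OSError as exc:
        print(f"cannot reach the service: {exc}")
```

An OSError here (connection refused) reproduces the same failure the software reports, which confirms the problem is the service or the address, not the software.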

Please refer to the relevant documentation.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "openai\_base_client.py", line 1005, in request
File "httpx\_client.py", line 914, in send
File "httpx\_client.py", line 942, in _send_handling_auth
File "httpx\_client.py", line 979, in _send_handling_redirects
File "httpx\_client.py", line 1014, in _send_single_request
File "httpx\_transports\default.py", line 249, in handle_request
File "contextlib.py", line 153, in __exit__
File "httpx\_transports\default.py", line 118, in map_httpcore_exceptions
httpx.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "videotrans\task\job.py", line 173, in run
File "videotrans\task\_translate_srt.py", line 44, in trans
File "videotrans\translator\__init__.py", line 1000, in run
File "videotrans\translator\_base.py", line 100, in run
File "tenacity\__init__.py", line 480, in __call__
File "videotrans\translator\_localllm.py", line 49, in _item_task
File "openai\_utils\_utils.py", line 286, in wrapper
File "openai\resources\chat\completions\completions.py", line 1211, in create
File "openai\_base_client.py", line 1297, in post
File "openai\_base_client.py", line 1037, in request
openai.APIConnectionError: Connection error.
source_code='English',target_code='Simplified Chinese'
Entering the subtitle translation stage
self.trans_thread=50,self.aisendsrt=True,self.translate_type=8
[INFO] Starting call to 'videotrans.translator._localllm.LocalLLM._item_task', this is the 1st time calling it.
[INFO] Finished call to 'videotrans.translator._localllm.LocalLLM._item_task' after 7.500(s), this was the 1st time calling it.
[INFO] Starting call to 'videotrans.translator._localllm.LocalLLM._item_task', this is the 2nd time calling it.
[INFO] Finished call to 'videotrans.translator._localllm.LocalLLM._item_task' after 19.781(s), this was the 2nd time calling it.
[INFO] Starting call to 'videotrans.translator._localllm.LocalLLM._item_task', this is the 3rd time calling it.

The error is still [WinError 10061] No connection could be made because the target machine actively refused it. The log shows the translation channel is translate_type=8 (local LLM) and that all 3 attempts failed, meaning the software was never able to establish a TCP connection to the local LLM API service. Please work through the following checks one by one:

  1. Verify the local service is really running

    • Open a browser or the command line and visit the API address you entered in the software (e.g. http://127.0.0.1:11434/v1/models).
    • If it cannot be reached, the service is not running; start your local service (Ollama, vLLM, etc.).
    • If it returns an error message (rather than a refused connection), check that the port number is correct and the service is configured properly.
  2. Check the API address format entered in the software

    • Go to Menu → Tools/Options → Translation Settings → Local LLM Settings.
    • Confirm the API address contains only the protocol, IP and port, ending in /v1 (e.g. http://127.0.0.1:11434/v1).
    • Do not append any extra path (such as /chat/completions).
    • As a test, try changing 127.0.0.1 to localhost, or adding a trailing slash (http://127.0.0.1:11434/v1/).
  3. Confirm the model name exists and is correct

    • In the same settings screen, the translation model field must contain a model name actually deployed on the local service (e.g. qwen:latest).
    • Run curl http://127.0.0.1:11434/v1/models on the command line to list the available models, and copy the model name exactly.
  4. Check whether the API Key is empty

    • Some local services need no authentication, but our requests carry an Authorization: Bearer header; leaving the key empty may cause some services to refuse the request.
    • In Local LLM Settings, fill the Api Key field with any string (e.g. local or none); do not leave it empty.
  5. Disable the network proxy

    • If the "Network proxy address" on the main screen is filled in (e.g. http://127.0.0.1:xxxx), requests that should go direct may be forwarded to the proxy port and refused.
    • Clear that proxy address completely and try again.
  6. Check the firewall / security software

    • Temporarily disable the Windows firewall or third-party security software to confirm whether it is blocking the network requests of Python / the application.
    • If connections work after disabling it, add sp.exe or the Python process to the firewall whitelist.
  7. Confirm the software version integrity

  8. Provide detailed logs
    If the steps above still do not resolve it, copy the last 30 lines of the log file (in the logs/ folder, today's newest .log file) for further analysis.

Note: [WinError 10061] means no service is listening on the target port; the root cause is either "the service is not running" or "the address/port is wrong". Make sure your local LLM API service is really running and listening on the expected port.
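The note above can be verified directly: connecting to a port with no listener raises ConnectionRefusedError, which Windows reports as [WinError 10061], while a firewall that silently drops packets shows up as a timeout instead. A minimal sketch (host and port are example values):

```python
# Distinguish "nothing listening" (WinError 10061, the error in this thread)
# from a firewall timeout. httpx/openai wrap the refused case as
# ConnectError / APIConnectionError, exactly as seen in the tracebacks.
import socket

def classify_connect(host: str, port: int, timeout: float = 3.0) -> str:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "listening"
    except ConnectionRefusedError:
        return "refused: no service on this port - start it or fix the port"
    except socket.timeout:
        return "timeout: firewall dropping packets, or wrong host"
    except OSError as exc:
        return f"other error: {exc}"

if __name__ == "__main__":
    # Example values - use the host/port from your Local LLM settings.
    print(classify_connect("127.0.0.1", 11434))
```

A "refused" result here is conclusive: the OS itself rejected the TCP handshake, so the fix is on the service side, not inside the translation software.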

Please refer to the relevant documentation.

[DEBUG] [Subtitle translation] channel 8, 7shi/llama-translate:8b-q4_K_M: total time: 32s
[ERROR] Connection error.
Traceback (most recent call last):
File "httpx\_transports\default.py", line 101, in map_httpcore_exceptions
File "httpx\_transports\default.py", line 250, in handle_request
File "httpcore\_sync\connection_pool.py", line 216, in handle_request
File "httpcore\_sync\connection_pool.py", line 196, in handle_request
File "httpcore\_sync\http_proxy.py", line 207, in handle_request
File "httpcore\_sync\connection.py", line 99, in handle_request
File "httpcore\_sync\connection.py", line 76, in handle_request
File "httpcore\_sync\connection.py", line 122, in _connect
File "httpcore\_backends\sync.py", line 205, in connect_tcp
File "contextlib.py", line 153, in __exit__
File "httpcore\_exceptions.py", line 14, in map_exceptions
httpcore.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "openai\_base_client.py", line 1005, in request
File "httpx\_client.py", line 914, in send
File "httpx\_client.py", line 942, in _send_handling_auth
File "httpx\_client.py", line 979, in _send_handling_redirects
File "httpx\_client.py", line 1014, in _send_single_request
File "httpx\_transports\default.py", line 249, in handle_request
File "contextlib.py", line 153, in __exit__
File "httpx\_transports\default.py", line 118, in map_httpcore_exceptions
httpx.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "videotrans\task\job.py", line 173, in run
File "videotrans\task\_translate_srt.py", line 44, in trans
File "videotrans\translator\__init__.py", line 1000, in run
File "videotrans\translator\_base.py", line 100, in run
File "tenacity\__init__.py", line 480, in __call__
File "videotrans\translator\_localllm.py", line 49, in _item_task
File "openai\_utils\_utils.py", line 286, in wrapper
File "openai\resources\chat\completions\completions.py", line 1211, in create
File "openai\_base_client.py", line 1297, in post
File "openai\_base_client.py", line 1037, in request
openai.APIConnectionError: Connection error.
[ERROR] Connection error.
Traceback (most recent call last):
File "httpx\_transports\default.py", line 101, in map_httpcore_exceptions
File "httpx\_transports\default.py", line 250, in handle_request
File "httpcore\_sync\connection_pool.py", line 216, in handle_request
File "httpcore\_sync\connection_pool.py", line 196, in handle_request
File "httpcore\_sync\http_proxy.py", line 207, in handle_request
File "httpcore\_sync\connection.py", line 99, in handle_request
File "httpcore\_sync\connection.py", line 76, in handle_request
File "httpcore\_sync\connection.py", line 122, in _connect
File "httpcore\_backends\sync.py", line 205, in connect_tcp
File "contextlib.py", line 153, in __exit__
File "httpcore\_exceptions.py", line 14, in map_exceptions
httpcore.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "openai\_base_client.py", line 1005, in request
File "httpx\_client.py", line 914, in send
File "httpx\_client.py", line 942, in _send_handling_auth
File "httpx\_client.py", line 979, in _send_handling_redirects
File "httpx\_client.py", line 1014, in _send_single_request
File "httpx\_transports\default.py", line 249, in handle_request
File "contextlib.py", line 153, in __exit__
File "httpx\_transports\default.py", line 118, in map_httpcore_exceptions
httpx.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "videotrans\task\job.py", line 173, in run
File "videotrans\task\_translate_srt.py", line 44, in trans
File "videotrans\translator\__init__.py", line 1000, in run
File "videotrans\translator\_base.py", line 100, in run
File "tenacity\__init__.py", line 480, in __call__
File "videotrans\translator\_localllm.py", line 49, in _item_task
File "openai\_utils\_utils.py", line 286, in wrapper
File "openai\resources\chat\completions\completions.py", line 1211, in create
File "openai\_base_client.py", line 1297, in post
File "openai\_base_client.py", line 1037, in request
openai.APIConnectionError: Connection error.

The error [WinError 10061] No connection could be made because the target machine actively refused it indicates the software cannot establish a connection to the local LLM API service; the root cause is that the service is not running or the address is misconfigured.

Please troubleshoot in the following order:

  1. Verify the service is running
    Run curl http://127.0.0.1:11434/v1/models on the command line (substitute your actual port). If it reports a connection failure, the service is not running; start your local LLM service (Ollama, vLLM).
  2. Check the API address format in the software
    Open Menu → Tools/Options → Translation Settings → Local LLM Settings and make sure the API address contains only the protocol, IP and port, ending in /v1, e.g. http://127.0.0.1:11434/v1. Do not append extra paths such as /chat/completions.
  3. Confirm the model name is correct
    In the same screen, the model name must match the local service exactly. Run curl http://127.0.0.1:11434/v1/models to list the available models and copy the model name (e.g. qwen:latest).
  4. Clear the network proxy
    The "Network proxy address" on the main screen must be left empty, otherwise local requests will be wrongly forwarded.
  5. Disable the firewall / security software
    Temporarily disable the Windows firewall or third-party security software to check whether it is blocking the connection. Once it works, add sp.exe to the trusted list.
  6. Fill in an API Key
    Even if the local service needs no authentication, enter any string (e.g. local) in the "Api Key" field; leaving it empty may cause refusals.
  7. Check for a port conflict
    If everything above is correct, try moving the service to another port (e.g. 11435), restart it, and enter the matching address in the software.

If it still fails after completing all of the steps above, collect the last 30 lines of the log file (today's newest .log file in the logs/ folder) so the error details can be analyzed further.
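As a final sanity check outside the software, the request the translator sends can be approximated with the standard library alone: one chat-completion POST with an Authorization header (base URL, key and model name below are examples; replace them with your own values):

```python
# Minimal chat-completion POST mirroring what the software sends, including
# the Authorization header (step 6: some servers reject empty keys).
# Base URL, key and model name are examples - substitute your own.
import json
import urllib.request

def chat_once(api_base: str, api_key: str, model: str, text: str, timeout: float = 60.0) -> str:
    """POST one chat completion and return the assistant reply text."""
    req = urllib.request.Request(
        f"{api_base.rstrip('/')}/chat/completions",  # client appends this path; the settings field must not include it
        data=json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": text}],
        }).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # any non-empty string for most local servers
        },
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

if __name__ == "__main__":
    try:
        print(chat_once("http://127.0.0.1:11434/v1", "local", "qwen:latest", "Say OK"))
    except OSError as exc:
        print(f"request failed: {exc}")  # a refusal here means the service is down
```

If this script succeeds but the software still fails, the difference is almost certainly the proxy or firewall settings covered in steps 4 and 5.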

Please refer to the relevant documentation.
