When Port 9 Ate My Transcript
A small debugging story about proxies, Codex CLI, and why localhost isn’t always your friend.
There’s a particular kind of bug that feels like someone has been quietly messing with your network stack behind your back.
You run a script from cmd.exe.
It works.
You run the exact same script from Codex CLI.
It stops with:
[WinError 10061] No connection could be made because the target machine actively refused it
FFmpeg happily splits the audio.
The Whisper API call dies immediately.
No 401.
No timeout.
Just an instant refusal.
The Setup
This is part of my ToucsProjectManager toolchain — a Python script that:
- Uses FFmpeg to split a video into MP3 segments.
- Calls llm whisper-api to transcribe each segment.
- Writes a clean transcript file next to the video.
From a normal Windows cmd shell it worked perfectly.
From Codex CLI? Instant failure.
So we did what engineers do: print environment variables and stare at them.
And that’s when we found the culprit:
HTTP_PROXY=http://127.0.0.1:9
HTTPS_PROXY=http://127.0.0.1:9
ALL_PROXY=http://127.0.0.1:9
NO_PROXY=localhost,127.0.0.1,::1
Why This Was So Weird
Port 9.
If you don’t know port 9: it’s the Discard protocol, basically the networking equivalent of /dev/null.
There is no proxy there.
There will never be a proxy there.
Every outbound HTTPS request was being forced through 127.0.0.1:9, which immediately refuses connections. That’s why the failure was instantaneous.
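You can reproduce the symptom with nothing but the standard library. The real failure happened inside llm whisper-api, but the mechanism is the same, because urllib (like most HTTP clients) honors HTTPS_PROXY; this is just a sketch to show the shape of the error:

    import os
    import urllib.error
    import urllib.request

    # Simulate the environment the wrapper injected.
    os.environ["HTTPS_PROXY"] = "http://127.0.0.1:9"

    try:
        # urllib picks the proxy up from the environment and tries to
        # CONNECT to 127.0.0.1:9, where nothing is listening.
        urllib.request.urlopen("https://api.openai.com/v1/models", timeout=10)
    except urllib.error.URLError as exc:
        # On Windows this surfaces as [WinError 10061].
        print("Refused before the request ever left the machine:", exc.reason)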
Why It Only Happened in Codex CLI
This was the key insight: the proxy settings weren’t coming from my machine’s configuration at all. Codex CLI’s execution environment sets them, and every subprocess it launches inherits them; a plain cmd.exe shell never sees them.
So the same script behaved differently depending on who launched it.
This is one of those subtle environment-leak issues that only shows up when you start composing tools:
- Python script
- Subprocess call to llm
- OpenAI API
- Proxy environment variables
- Codex execution wrapper
One tiny variable upstream can poison the entire networking stack downstream.
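Here is a minimal illustration of that inheritance chain. Nothing Codex-specific, just how subprocess behaves by default:

    import os
    import subprocess
    import sys

    # Whatever the parent process has in its environment...
    os.environ["HTTPS_PROXY"] = "http://127.0.0.1:9"

    # ...every child gets for free, because subprocess passes the full
    # environment along unless you hand it an explicit env= mapping.
    result = subprocess.run(
        [sys.executable, "-c", "import os; print(os.environ.get('HTTPS_PROXY'))"],
        capture_output=True,
        text=True,
    )
    print(result.stdout.strip())  # -> http://127.0.0.1:9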
The Fix (Now Properly Engineered)
Instead of fighting the proxy variables set by Codex CLI, the new script is resilient by design.
The key change: the transcription subprocess now builds a controlled environment before calling llm whisper-api.
Here’s the core idea:
import os


def build_llm_env(clear_proxy: bool) -> dict[str, str]:
    """Copy the current environment, optionally stripping proxy variables."""
    env = os.environ.copy()
    if clear_proxy:
        for key in [
            "HTTP_PROXY",
            "HTTPS_PROXY",
            "ALL_PROXY",
            "http_proxy",
            "https_proxy",
            "all_proxy",
        ]:
            env.pop(key, None)
    return env
Each transcription worker launches llm whisper-api with that explicitly constructed environment:
env = build_llm_env(clear_proxy)
result = subprocess.run(
    cmd,
    check=True,
    capture_output=True,
    text=True,
    env=env,
)
Opt-in behavior: --keep-proxy preserves the original proxy configuration.
That means:
- It works from cmd.exe.
- It works from Codex CLI.
- It works inside wrappers, IDE terminals, and automation runners.
- No manual environment adjustments required.
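For completeness, the flag plumbing is nothing exotic. It looks roughly like this (the exact option wiring in the real script may differ slightly):

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--keep-proxy",
        action="store_true",
        help="Preserve HTTP(S)_PROXY / ALL_PROXY from the parent environment.",
    )
    args = parser.parse_args()

    # Proxy variables are stripped by default; --keep-proxy opts back in.
    # build_llm_env is the helper shown earlier.
    env = build_llm_env(clear_proxy=not args.keep_proxy)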
Bonus Stability Improvements
Since we were already inside the networking layer, we hardened the rest of the pipeline too:
- Parallel workers (default: 4) for faster transcription.
- Exponential backoff retries for transient API hiccups (sketched just after this list).
- Explicit key override via --llm-key.
- Network diagnostics mode via --diagnose-network.
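The retry logic is the standard exponential-backoff pattern. A stripped-down sketch of what each worker does around the llm call, assuming nothing about the real script's exact delays or attempt counts:

    import subprocess
    import time

    def run_with_retries(cmd: list[str], env: dict[str, str], attempts: int = 4) -> str:
        """Run the transcription command, sleeping 1s, 2s, 4s between failures."""
        for attempt in range(attempts):
            try:
                result = subprocess.run(
                    cmd, check=True, capture_output=True, text=True, env=env
                )
                return result.stdout
            except subprocess.CalledProcessError:
                if attempt == attempts - 1:
                    raise
                time.sleep(2 ** attempt)  # 1, 2, 4 seconds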
The --diagnose-network mode performs DNS resolution, TCP connect, and TLS handshake checks against the OpenAI endpoint before doing any transcription work. If something is wrong at the socket level, you’ll know immediately:
python transcribe_video.py --diagnose-network
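Under the hood those checks are plain socket and ssl calls, something along these lines (a sketch of the idea, not the script's exact implementation or output format):

    import socket
    import ssl

    HOST, PORT = "api.openai.com", 443

    # 1. DNS resolution
    addr = socket.getaddrinfo(HOST, PORT)[0][4][0]
    print(f"DNS ok: {HOST} -> {addr}")

    # 2. TCP connect
    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        print("TCP connect ok")

        # 3. TLS handshake
        ctx = ssl.create_default_context()
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            print(f"TLS ok: {tls.version()}")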
So instead of mysterious WinError 10061 explosions, we now have:
- Controlled environment
- Deterministic networking behavior
- Clear diagnostics
- Optional proxy preservation
Lessons Learned
- Connection refused is usually a proxy or routing issue, not an API/key issue.
- Localhost isn’t always your localhost when wrappers/sandboxes get involved.
- Subprocesses inherit environment — if you don’t control it, you don’t control behavior.
- Defensive scripting wins (sanitize network env, keep an escape hatch).
Why This Matters
This wasn’t just about Whisper.
It’s about how modern dev stacks behave when you chain LLM tools, run through wrappers, and rely on environment-driven configuration. Tiny hidden defaults can cascade into failures that look mysterious — but are actually deterministic.
