docs: add WSL2 networking guide for local model servers#5616

Merged
teknium1 merged 1 commit into main from hermes/hermes-838c5b76
Apr 6, 2026

Conversation

Contributor

@teknium1 teknium1 commented Apr 6, 2026

Summary

Adds a dedicated WSL2 Networking section to the AI Providers docs page, covering how to connect from Hermes (running in WSL2) to model servers running on the Windows host.

This is a common pain point — WSL2's default NAT networking means localhost inside WSL2 refers to the Linux VM, not the Windows host. Users get "connection refused" when pointing Hermes at http://localhost:11434 for Ollama on Windows.
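In NAT mode, the Windows host is reachable from WSL2 at the default gateway. A minimal sketch of the host-IP lookup (port 11434 is Ollama's default from the PR description; everything else assumes a standard WSL2 NAT setup):

```shell
# Inside WSL2 (NAT mode): the default gateway IP is the Windows host.
# Sketch only -- assumes the usual "default via <ip> dev eth0" route format.
WIN_HOST=$(ip route show default 2>/dev/null | awk '{print $3; exit}')
echo "Windows host IP: ${WIN_HOST:-unknown}"
# Verify reachability (the server must bind 0.0.0.0, not 127.0.0.1):
# curl "http://${WIN_HOST}:11434/api/tags"
```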

What's covered

  • Mirrored networking mode (Windows 11 22H2+) — the recommended fix, makes localhost work bidirectionally
  • NAT mode fallback — finding the Windows host IP via ip route, dynamic helper script, mDNS approach
  • Server bind address table — per-server instructions for Ollama, LM Studio, llama-server, vLLM, and SGLang (most default to binding 127.0.0.1, which is unreachable from the WSL2 NAT subnet)
  • Detailed Ollama Windows service config for setting OLLAMA_HOST=0.0.0.0
  • Windows Firewall rules for allowing WSL2 traffic
  • Quick verification steps with curl
  • Troubleshooting cross-reference — added a "Connection refused from WSL2" entry in the Troubleshooting section that links back
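The mirrored-mode fix from the first bullet comes down to a two-line config file (path and key names per Microsoft's WSL documentation; requires Windows 11 22H2 or newer):

```ini
; %UserProfile%\.wslconfig on the Windows host
[wsl2]
networkingMode=mirrored
```

After saving, run `wsl --shutdown` from Windows and restart the distro; `localhost` then resolves the same way on both sides.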

Placement

New ### WSL2 Networking (Windows Users) section between LM Studio and Troubleshooting Local Models — positioned as a setup consideration alongside the server sections, not buried in troubleshooting.

Test plan

  • Verified no MDX parsing errors (only pre-existing skills.json build error)
  • Content reviewed against official Microsoft WSL2 networking docs
  • All admonition syntax (:::tip, :::note, :::caution) follows existing patterns
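For reference, the Docusaurus-style admonition pattern the new section follows looks like this (the tip text here is illustrative, not a quote from the PR):

```mdx
:::tip
On Windows 11 22H2+, enable mirrored networking so `localhost` works in both directions.
:::
```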

Windows users running Hermes in WSL2 with model servers on the Windows
host hit 'connection refused' because WSL2's NAT networking means
localhost points to the VM, not Windows.

Covers:
- Mirrored networking mode (Win 11 22H2+) — makes localhost work
- NAT mode fallback using the host IP via ip route
- Per-server bind address table (Ollama, LM Studio, llama-server,
  vLLM, SGLang)
- Detailed Ollama Windows service config for OLLAMA_HOST
- Windows Firewall rules for WSL2 connections
- Quick verification steps
- Cross-reference from Troubleshooting section
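The Ollama service config and firewall rule listed above can be sketched as Windows-side commands (run in an elevated PowerShell; the rule display name is illustrative, and 11434 is Ollama's default port):

```powershell
# Make the Windows Ollama service listen on all interfaces
# (OLLAMA_HOST is Ollama's documented bind-address variable).
setx OLLAMA_HOST "0.0.0.0"

# Allow inbound connections from the WSL2 NAT subnet.
New-NetFirewallRule -DisplayName "Ollama (WSL2)" -Direction Inbound `
  -Protocol TCP -LocalPort 11434 -Action Allow

# Restart the Ollama app/service, then verify from inside WSL2:
#   curl "http://<windows-host-ip>:11434/api/tags"
```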

github-actions bot commented Apr 6, 2026

⚠️ Supply Chain Risk Detected

This PR contains patterns commonly associated with supply chain attacks. This does not mean the PR is malicious — but these patterns require careful human review before merging.

⚠️ WARNING: Install hook files modified

These files can execute code during package installation or interpreter startup.

Files:

hermes_cli/setup.py

Automated scan triggered by supply-chain-audit. If this is a false positive, a maintainer can approve after manual review.

@teknium1 teknium1 merged commit 537a2b8 into main Apr 6, 2026
3 of 5 checks passed
dbmizrahi pushed a commit to dbmizrahi/hermes-agent that referenced this pull request Apr 10, 2026