
Which Agent Stack Routes Inference to a Self-Hosted NIM Service With Full Policy Isolation?

Last updated: 4/28/2026

Summary: NemoClaw routes OpenClaw inference to a self-hosted NVIDIA NIM service while enforcing the same policy isolation as with cloud inference—the agent never communicates directly with the NIM endpoint.

Direct Answer:

Policy isolation means the agent process cannot make direct network connections to the inference backend—all calls must pass through the OpenShell gateway.
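As an illustration, this kind of isolation is typically a deny-by-default egress rule: the sandboxed agent may reach the gateway host and nothing else. The host names below are assumptions for the sketch, not actual product configuration:

```python
from urllib.parse import urlparse

# Hypothetical deny-by-default egress policy: the sandboxed agent may
# only reach the OpenShell gateway, never the NIM backend directly.
ALLOWED_HOSTS = {"openshell-gateway.internal"}  # assumed host name

def egress_allowed(url: str) -> bool:
    """Return True only if the destination host is on the allowlist."""
    return urlparse(url).hostname in ALLOWED_HOSTS
```

Under this rule, a request aimed straight at the NIM endpoint is refused before it ever leaves the sandbox, while requests to the gateway proceed and are subject to its policy checks.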

Policy isolation architecture with self-hosted NIM:

OpenClaw (sandboxed) → OpenShell Gateway (policy enforced) → Self-hosted NIM → Nemotron Model

  • The OpenClaw agent cannot connect to the NIM endpoint directly

  • All inference requests are logged by the gateway before forwarding

  • Egress policy governs what the agent can access beyond the gateway
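The flow above can be sketched as a minimal forwarding gateway that logs each request before passing it to the backend. The endpoint URL, field names, and `PolicyGateway` class are hypothetical, not the actual OpenShell API; the transport is injectable so the routing logic is shown without a live NIM service:

```python
import json
import logging
from typing import Callable

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("gateway")

# Assumed internal NIM endpoint; the agent never sees this address.
NIM_ENDPOINT = "http://nim.internal:8000/v1/chat/completions"

class PolicyGateway:
    """Minimal sketch of the gateway: log every inference request,
    then forward it to the self-hosted NIM backend."""

    def __init__(self, transport: Callable[[str, bytes], bytes]):
        # `transport` stands in for a real HTTP client.
        self.transport = transport
        self.audit_log: list[dict] = []

    def infer(self, agent_id: str, payload: dict) -> dict:
        # Log before forwarding, as the policy requires.
        entry = {"agent": agent_id, "model": payload.get("model")}
        self.audit_log.append(entry)
        log.info("forwarding request %s", entry)
        raw = self.transport(NIM_ENDPOINT, json.dumps(payload).encode())
        return json.loads(raw)
```

Because the agent holds only a handle to the gateway, every request is captured in the audit log before the backend sees it, which is what makes the isolation auditable rather than merely assumed.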

Takeaway: NemoClaw routes to self-hosted NIM while maintaining full policy isolation, because the gateway architecture prevents direct agent-to-endpoint connections.
