Proxy

Overview

The Prism proxy enforces your policies at the network level for any HTTP or HTTPS traffic, with no SDK changes required. It works with any language or framework.

Installation

Install Prism using the quickstart script:

curl -fsSL https://fencio.dev/install.sh | bash

Start Services

Start the Prism Gateway and proxy:

prism start

Verify both services are running. You should see the Prism Gateway on :47000 and the proxy on :47100:

prism status

Create an Agent

Register a new agent and save the agent_id and api_key from the output:

prism agents create "my-agent"

Trust the Certificate

The proxy intercepts HTTPS traffic using a locally generated CA certificate. Add it to your system trust store so clients don't reject it.

Easiest — use the CLI

prism cert install

Or install manually:

macOS

sudo security add-trusted-cert -d -r trustRoot \
  -k /Library/Keychains/System.keychain \
  ~/.config/prism/data/certs/fencio-root-ca.pem

Linux

sudo cp ~/.config/prism/data/certs/fencio-root-ca.pem /usr/local/share/ca-certificates/fencio-root-ca.crt
sudo update-ca-certificates
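
Note that Python HTTP clients often ship their own CA bundle (via certifi) and ignore the system trust store, so trusting the certificate at the OS level may not be enough. A minimal sketch, assuming the default certificate path from the install step above: point Python clients at the Prism CA through standard environment variables (requests honors REQUESTS_CA_BUNDLE; clients built on the stdlib ssl module honor SSL_CERT_FILE).

```python
import os

# Assumed default location of the Prism root CA (see the install step above)
ca_path = os.path.expanduser("~/.config/prism/data/certs/fencio-root-ca.pem")

# requests reads REQUESTS_CA_BUNDLE; stdlib-ssl-based clients read SSL_CERT_FILE
os.environ["REQUESTS_CA_BUNDLE"] = ca_path
os.environ["SSL_CERT_FILE"] = ca_path
```

Set these before the first request is made, since some clients cache their SSL context.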

Python SDK

Install the Python SDK:

pip install fencio-proxy-client

Initialize the client

Use the agent_id and api_key from prism agents create:

from fencio_proxy_client import FencioProxyClient

client = FencioProxyClient(
    agent_id="your_agent_id",
    api_key="your_api_key",
)
client.start()

Once started, all HTTP traffic from requests, httpx, and urllib3 is automatically routed through the proxy.

Layer decorators

Wrap your functions with a decorator to classify each request by layer. The proxy uses this to apply the correct policy.

@llm — LLM API calls

from fencio_proxy_client import llm
import requests

@llm(model='gpt-4', provider='openai')
def call_llm(prompt):
    response = requests.post(
        'https://api.openai.com/v1/chat/completions',
        json={'model': 'gpt-4', 'messages': [{'role': 'user', 'content': prompt}]}
    )
    return response.json()

@tool — external tool or API calls

from fencio_proxy_client import tool
import requests

@tool(tool_name='web_search')
def search_web(query):
    response = requests.get('https://api.example.com/search', params={'q': query})
    # passing params instead of an f-string URL-encodes the query safely
    return response.json()

@database — database operations

from fencio_proxy_client import database

@database(db_type='postgresql', operation='query')
def get_user_data(user_id):
    # Your database call here
    pass

Route Traffic Through the Proxy

Set these environment variables before running your agent. Most HTTP clients pick them up automatically.

HTTP_PROXY=http://localhost:47100
HTTPS_PROXY=http://localhost:47100
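
The same settings can be applied from inside Python before any HTTP client is created; requests and httpx read these variables from the environment by default. A minimal sketch, assuming the default port from prism start:

```python
import os

# Route both plain HTTP and TLS traffic through the local Prism proxy.
# Must run before the first request is issued.
os.environ["HTTP_PROXY"] = "http://localhost:47100"
os.environ["HTTPS_PROXY"] = "http://localhost:47100"
```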

Required Headers

Include these headers on each request routed through the proxy:

| Header | Required | Description |
| --- | --- | --- |
| X-Fencio-Agent-ID | Yes | Your agent's ID |
| X-Fencio-API-Key | Yes (if API key auth is enabled) | Your agent's API key |
| X-Fencio-Layer | No | llm for LLM calls, tool_call for tool calls (default: tool_call) |
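
Putting the two pieces together, here is a hedged sketch of a manually routed request using only the standard library (the target URL and credential values are placeholders, not part of Prism):

```python
import urllib.request

# Send all traffic through the local Prism proxy
proxy = urllib.request.ProxyHandler({
    "http": "http://localhost:47100",
    "https": "http://localhost:47100",
})
opener = urllib.request.build_opener(proxy)

headers = {
    "X-Fencio-Agent-ID": "your_agent_id",  # required
    "X-Fencio-API-Key": "your_api_key",    # required if API key auth is enabled
    "X-Fencio-Layer": "llm",               # optional; defaults to tool_call
}
req = urllib.request.Request("https://api.example.com/data", headers=headers)
# opener.open(req)  # uncomment to actually send the request through the proxy
```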

What Happens

- **Request allowed** — the proxy forwards it to the destination.
- **Request denied** — the proxy returns 403 Forbidden immediately.
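
In Python, a denial can be handled like any other HTTP error. A small illustrative helper (the function name is an assumption, not part of the SDK); note that an upstream service can also return 403, but the proxy's denial comes back immediately without the destination being contacted:

```python
def classify_proxy_result(status_code: int) -> str:
    """Map a proxied response status to the two outcomes above."""
    if status_code == 403:
        # Denied by policy: the proxy rejected the request itself.
        return "denied"
    # Any other status was produced by the upstream service.
    return "forwarded"

print(classify_proxy_result(403))  # denied
print(classify_proxy_result(200))  # forwarded
```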

CLI Commands

prism proxy              # setup guide and usage info
prism agents create "my-agent"   # register a new agent
prism agents list        # view registered agents
prism status             # check proxy and Prism health
prism logs proxy         # stream proxy logs

Using with LangChain or LangGraph?