Integration · language

Cypherz for Python

Python is where most ML and data pipelines live. Cypherz's Python SDK is a drop-in replacement for the OpenAI and Anthropic Python clients: your model code stays unchanged, and your prompts are tokenized before they leave the process.

  • 01

    OpenAI Python SDK compatible

    Change `base_url`, keep your existing code.

  • 02

    Async + sync

    Both `OpenAI()` and `AsyncOpenAI()` paths work transparently.

  • 03

    Notebook friendly

    Import in Jupyter, run on hosted notebooks, ship to Streamlit.

Use the OpenAI Python SDK against Cypherz

import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["CYPHERZ_KEY"],
    base_url="https://api.cypherz.app/v1/proxy/openai/v1",
)

resp = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "Email john@acme.com a quote."},
    ],
)

Common questions

Frequently asked.

Is Cypherz officially supported by OpenAI?

Cypherz is built as a transparent proxy compatible with OpenAI's public API. We don't require their endorsement, and they don't gate us: your existing OpenAI API key works through Cypherz.

Will my latency get worse?

Tokenization adds 5–30 ms per request, depending on payload size. The LLM call itself dominates (hundreds of milliseconds), so the user-perceived difference is negligible. Run Cypherz in the same region as your AI provider to minimize network overhead.

Does streaming work?

Yes — Cypherz proxies streaming responses and restores tokens chunk-by-chunk on the way back. Works with SSE and standard chunked transfer.
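Chunk-by-chunk restoration is fiddly because a placeholder can be split across two streamed chunks. Here is a toy sketch of the buffering idea; the `<tok:...>` placeholder format and the `vault` lookup dict are invented for illustration (Cypherz's actual wire format is internal):

```python
from typing import Iterator

# Hypothetical placeholder markers; Cypherz's real wire format is internal.
OPEN, CLOSE = "<tok:", ">"


def restore_stream(chunks: Iterator[str], vault: dict) -> Iterator[str]:
    """Yield text with placeholders restored, buffering across chunk splits."""
    buf = ""
    for chunk in chunks:
        buf += chunk
        while True:
            start = buf.find(OPEN)
            if start == -1:
                # No placeholder in sight: flush everything except a possible
                # partial OPEN marker at the tail of the buffer.
                keep = 0
                for i in range(1, len(OPEN)):
                    if buf.endswith(OPEN[:i]):
                        keep = i
                yield buf[: len(buf) - keep]
                buf = buf[len(buf) - keep:]
                break
            end = buf.find(CLOSE, start)
            if end == -1:
                # Placeholder opened but not yet closed: emit the prefix,
                # keep the partial placeholder for the next chunk.
                yield buf[:start]
                buf = buf[start:]
                break
            token_id = buf[start + len(OPEN): end]
            # Unknown ids are dropped here; a real proxy would surface them.
            yield buf[:start] + vault.get(token_id, "")
            buf = buf[end + len(CLOSE):]
    if buf:
        yield buf


# A placeholder split mid-stream is still restored correctly:
vault = {"a1": "john@acme.com"}
chunks = ["Email <to", "k:a1> a", " quote."]
print("".join(restore_stream(chunks, vault)))
# -> Email john@acme.com a quote.
```

The key design point is that the restorer never emits text that could be the prefix of a placeholder; it holds it back until the next chunk resolves the ambiguity, so the downstream consumer only ever sees fully detokenized text.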

Can I use my own OpenAI key?

Yes — paste your key when you create a project; Cypherz encrypts it under the per-project vault key. You can also use managed mode where Cypherz provisions and bills the upstream key.

Get started

Add Cypherz to your Python integration in 60 seconds.

Sign up, create a project, and copy your API key. Your first request is tokenized in under 60 seconds.