Integration · ai provider
Cypherz for OpenAI
Replace your `openai` import with `@cypherz/sdk/openai` and your code keeps working. Cypherz tokenizes emails, phone numbers, IDs, credit cards, and any custom pattern before the request leaves your server, then de-tokenizes the response inside your trust boundary. Works with `chat.completions`, `embeddings`, `responses`, and any custom endpoint you proxy through.
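A minimal sketch of what the tokenize and detokenize round trip looks like, assuming a simple regex-based email detector. This is illustrative only: the token format, function names, and matching logic here are assumptions, not the Cypherz internals.

```typescript
// Hypothetical sketch of the tokenize -> detokenize round trip.
// PII is swapped for opaque placeholders before the request leaves
// your server; the mapping never leaves your trust boundary.
const EMAIL_RE = /[\w.+-]+@[\w-]+\.[\w.]+/g;

function tokenize(text: string): { safe: string; vault: Map<string, string> } {
  const vault = new Map<string, string>();
  let i = 0;
  const safe = text.replace(EMAIL_RE, (email) => {
    const token = `<EMAIL_${(i++).toString(16).padStart(12, "0")}>`;
    vault.set(token, email); // original value stays server-side
    return token;
  });
  return { safe, vault };
}

function detokenize(text: string, vault: Map<string, string>): string {
  let out = text;
  for (const [token, original] of vault) out = out.split(token).join(original);
  return out;
}

const { safe, vault } = tokenize("Email john@acme.com a quote.");
// safe is what the model sees: "Email <EMAIL_000000000000> a quote."
// detokenize(safe, vault) restores the original string.
```

The same shape extends to phone numbers, IDs, and custom patterns: each detector contributes placeholders to the vault, and the response is restored by reversing the map.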
01
Drop-in client
Change one import line. Method signatures and response shapes are identical.
02
All endpoints
Chat completions, embeddings, responses, files, fine-tuning — anything OpenAI exposes, Cypherz can proxy.
03
BYO or managed key
Use your own `sk-…` key encrypted in the vault, or let Cypherz hold it so your developers never touch one.
04
Audit every call
Every tokenize / detokenize / proxy hit is logged with structured metadata. SOC2-ready audit trail.
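As an illustration of what "structured metadata" per call could contain: the field names and schema below are assumptions for the sketch, not the actual Cypherz log format.

```typescript
// Illustrative shape of one audit record per tokenize/detokenize/proxy hit.
// Every field here is an assumed example, not the real Cypherz schema.
interface AuditRecord {
  timestamp: string;                   // ISO-8601
  action: "tokenize" | "detokenize" | "proxy";
  projectId: string;
  endpoint: string;                    // e.g. "/v1/chat/completions"
  tokenCounts: Record<string, number>; // placeholders handled, by PII type
  latencyMs: number;
}

const example: AuditRecord = {
  timestamp: new Date().toISOString(),
  action: "tokenize",
  projectId: "proj_demo",
  endpoint: "/v1/chat/completions",
  tokenCounts: { EMAIL: 1 },
  latencyMs: 12,
};
```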
Drop-in replacement for the OpenAI SDK
import { OpenAI } from "@cypherz/sdk/openai";
const client = new OpenAI({ cypherzKey: process.env.CYPHERZ_KEY });
const resp = await client.chat.completions.create({
model: "gpt-4o",
messages: [
{ role: "user", content: "Email john@acme.com a quote." },
],
});
// → The model saw: "Email <EMAIL_a1b2c3d4e5f6> a quote."
// → You received: "Email john@acme.com a quote."
Common questions
Frequently asked.
Is Cypherz officially supported by OpenAI?
Cypherz is built as a transparent proxy compatible with OpenAI's public API. We don't require their endorsement and they don't gate us — your existing API key works through Cypherz.
Will my latency get worse?
Tokenization adds 5–30 ms per request, depending on payload size. The LLM call itself dominates at hundreds of milliseconds, so the user-perceived difference is negligible. Run Cypherz in the same region as your AI provider to minimize the extra network hop.
Does streaming work?
Yes — Cypherz proxies streaming responses and restores tokens chunk-by-chunk on the way back. Works with SSE and standard chunked transfer.
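Restoring a placeholder that straddles a chunk boundary requires buffering. The sketch below shows the idea under assumed logic (it is not the Cypherz internals, and a real implementation would also flush any leftover buffer when the stream ends):

```typescript
// Sketch: restore placeholders chunk-by-chunk. A token like
// <EMAIL_a1b2c3d4e5f6> can be split across two SSE chunks, so any
// trailing unclosed "<..." is held back until the next chunk arrives.
function makeStreamDetokenizer(vault: Map<string, string>) {
  let buf = "";
  return (chunk: string): string => {
    buf += chunk;
    const open = buf.lastIndexOf("<");
    const close = buf.lastIndexOf(">");
    let emit: string;
    if (open > close) {
      // Buffer ends with a possibly-partial token: keep the tail.
      emit = buf.slice(0, open);
      buf = buf.slice(open);
    } else {
      emit = buf;
      buf = "";
    }
    for (const [token, original] of vault) emit = emit.split(token).join(original);
    return emit;
  };
}

const vault = new Map([["<EMAIL_a1b2c3d4e5f6>", "john@acme.com"]]);
const feed = makeStreamDetokenizer(vault);
// A token split across chunks is still restored:
// feed("Email <EMAIL_a1b2") + feed("c3d4e5f6> a quote.")
```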
Can I use my own OpenAI key?
Yes — paste your key when you create a project; Cypherz encrypts it under the per-project vault key. You can also use managed mode where Cypherz provisions and bills the upstream key.
Get started
Add Cypherz to your OpenAI integration in 60 seconds.
Sign up, create a project, copy your API key. The first request is tokenized in under sixty seconds.
More integrations
Use Cypherz with Anthropic Claude
Tokenize PII before it reaches Claude. Cypherz proxies the Anthropic API transpa…
Use Cypherz with Google Gemini
Send sanitized payloads to Gemini Pro and Gemini Flash. Cypherz proxies the Goog…
Use Cypherz with LangChain
Plug Cypherz into LangChain so every chain, agent, and tool call gets PII-protec…
Use Cypherz with Vercel AI SDK
Wrap the Vercel AI SDK's `streamText` and `generateText` with Cypherz so your Ne…
Use Cypherz with Python
Use Cypherz from any Python AI workload. Works with the OpenAI Python SDK, Anthr…
Use Cypherz with TypeScript
First-class TypeScript SDK with a drop-in OpenAI wrapper. Ships on npm, works in…