SDKs

OpenAI SDK

NeuralGate is fully compatible with the OpenAI SDK. Swap in your NeuralGate API key and base URL; the rest of your existing code works without modification.

Installation

pip:

    pip install openai

npm:

    npm install openai

yarn:

    yarn add openai

Python

from openai import OpenAI

client = OpenAI(
    api_key="ngk_your_key",                              # Your NeuralGate key
    base_url="https://api.computeshare.servequake.com/v1"  # ← NeuralGate endpoint
)

# Everything else is identical to OpenAI's API
response = client.chat.completions.create(
    model="auto",   # or a specific model ID
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is quantum computing?"}
    ]
)
print(response.choices[0].message.content)

TypeScript / Node.js

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "ngk_your_key",
  baseURL: "https://api.computeshare.servequake.com/v1",
});

const response = await client.chat.completions.create({
  model: "auto",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "What is quantum computing?" },
  ],
});

console.log(response.choices[0].message.content);

Streaming

from openai import OpenAI

client = OpenAI(
    api_key="ngk_your_key",
    base_url="https://api.computeshare.servequake.com/v1"
)

stream = client.chat.completions.create(
    model="auto",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True
)

for chunk in stream:
    content = chunk.choices[0].delta.content or ""
    print(content, end="", flush=True)

Environment variable

Set these once and the OpenAI SDK picks them up automatically:

export OPENAI_API_KEY="ngk_your_key"
export OPENAI_BASE_URL="https://api.computeshare.servequake.com/v1"
from openai import OpenAI

# No configuration needed — reads from environment
client = OpenAI()
response = client.chat.completions.create(
    model="auto",
    messages=[{"role": "user", "content": "Hello"}]
)

Using NeuralGate routing params

Pass NeuralGate-specific routing controls via extra_body:

response = client.chat.completions.create(
    model="auto",
    messages=[{"role": "user", "content": "Sensitive query"}],
    extra_body={
        "privacy_mode": True,      # Never use cloud fallback
        "tier": "fastest",         # Pick lowest-latency hoster
        "trusted_only": True,      # Only trust_score > 0.8
    }
)
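In the OpenAI Python SDK, `extra_body` keys are merged into the top level of the JSON request body rather than nested under an `extra_body` field. A sketch of the payload the request above would produce (field names taken from the example; the exact wire format is NeuralGate's to define):

```python
# Approximate JSON body sent for the request above: the SDK merges
# extra_body entries alongside the standard OpenAI fields.
payload = {
    "model": "auto",
    "messages": [{"role": "user", "content": "Sensitive query"}],
    "privacy_mode": True,   # NeuralGate routing controls sit at the top level
    "tier": "fastest",
    "trusted_only": True,
}
```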

Compatibility

Feature                          Status
chat.completions.create          Supported
Streaming (stream=True)          Supported
System / user / assistant roles  Supported
max_tokens, temperature          Supported
models.list                      Supported
Function calling / tools         Coming soon
Vision / image inputs            Coming soon
Embeddings                       Coming soon
Audio                            Not planned