OpenAI Integration
This guide describes how to integrate OpenAI with BricksAI, giving you fine-grained monitoring and access control over your OpenAI calls.
Step 1. Add your OpenAI API key
Go to the Settings page. Under "LLM providers", click "Add provider", select "OpenAI", fill in the required information, then click "Add".
Step 2. Create a proxy secret key
To create a proxy secret key, go to the Secret keys page, click "Create a new secret key", fill in the required information, then click "Create".
Step 3. Make a call to OpenAI via BricksAI
Here is a list of all currently supported OpenAI API endpoints.
Below are sample code snippets for calling OpenAI with BricksAI:
- Python SDK
- Python Langchain
- Node SDK
- Node LangChain
- curl
Python SDK:

```python
from openai import OpenAI

client = OpenAI(
    api_key="your-bricks-secret-key",
    base_url="https://api.trybricks.ai/api/providers/openai/v1",
)

# Call OpenAI as normal...
```
Python LangChain:

```shell
# Just update your environment variable!
export OPENAI_API_BASE="https://api.trybricks.ai/api/providers/openai/v1"
```
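The same configuration can be applied from Python before any LangChain client is constructed. This is a minimal sketch; setting the proxy secret key through `OPENAI_API_KEY` is an assumption based on how the key is passed in the other snippets, so adjust if your code supplies the key explicitly.

```python
import os

# Equivalent of the shell export above; must run before any
# LangChain/OpenAI client is created so the library picks it up.
os.environ["OPENAI_API_BASE"] = "https://api.trybricks.ai/api/providers/openai/v1"

# Assumption: the BricksAI proxy secret key replaces the usual
# OpenAI key read from OPENAI_API_KEY.
os.environ["OPENAI_API_KEY"] = "your-bricks-secret-key"
```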
Node SDK:

```javascript
import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: "your-bricks-secret-key",
  baseURL: "https://api.trybricks.ai/api/providers/openai/v1",
});

// Call OpenAI as normal...
```
Node LangChain:

```javascript
import { OpenAI } from "@langchain/openai";

const model = new OpenAI({
  openAIApiKey: "your-bricks-secret-key",
  configuration: {
    baseURL: "https://api.trybricks.ai/api/providers/openai/v1",
  },
});

// Call OpenAI as normal...
```
curl:

```shell
curl --request POST \
  --url https://api.trybricks.ai/api/providers/openai/v1/chat/completions \
  --header 'Authorization: Bearer your-bricks-secret-key' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "hi"
      }
    ]
  }'
```
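For reference, the curl request above can also be reproduced with Python's standard library alone. This sketch builds the request but does not send it, since sending requires a valid proxy secret key (the key below is the same placeholder used throughout this guide):

```python
import json
import urllib.request

url = "https://api.trybricks.ai/api/providers/openai/v1/chat/completions"

# Same JSON body as in the curl example above.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "hi"},
    ],
}

request = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer your-bricks-secret-key",  # placeholder key
        "Content-Type": "application/json",
    },
    method="POST",
)

# Sending the request requires a valid proxy secret key:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```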