Getting started

What is BricksAI?

BricksAI is a platform for monitoring and securing generative AI applications. We take care of the boring parts so you can focus on building your product instead of yet another GenAI wrapper.

Start securing and monitoring your LLM requests

Step 1: Sign up for an account

Get started by signing up for a BricksAI account. Then create a new organization, or ask a colleague to invite you to an existing one.

Step 2: Connect your LLM providers

To connect your LLM providers, go to the Settings page. Under "LLM providers", click "Add provider", fill in the required information, then click "Add".

Step 3: Create a proxy secret key

To create a proxy secret key, go to the Secret keys page. Click "Create a new secret key", fill in the required information, then click "Create".

Step 4: Make a call to an LLM provider via BricksAI

To make a call to an LLM provider using BricksAI, make two changes in your code:

  1. Send your request to https://api.trybricks.ai/ instead of your provider's base URL. This document lists all endpoints that we currently support. For example, to call the OpenAI chat completions endpoint, send your request to https://api.trybricks.ai/api/providers/openai/v1/chat/completions.

  2. In your request headers, replace your provider's API key with your BricksAI proxy secret key, as shown in the sketch below.
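
For instance, a raw HTTP call to the OpenAI chat completions endpoint through BricksAI could look like the following sketch. It assumes the proxy accepts the secret key in the same Authorization header that OpenAI itself uses; the secret key and model name are placeholders.

import requests

# Change 1: point the request at the BricksAI proxy instead of api.openai.com.
url = "https://api.trybricks.ai/api/providers/openai/v1/chat/completions"

# Change 2 (assumption): the BricksAI proxy secret key goes in the same
# Authorization header that OpenAI normally expects.
headers = {
    "Authorization": "Bearer your-bricks-secret-key",
    "Content-Type": "application/json",
}

payload = {
    "model": "gpt-4o-mini",  # placeholder model name
    "messages": [{"role": "user", "content": "Hello!"}],
}

response = requests.post(url, headers=headers, json=payload)
print(response.json())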

Below is a sample code snippet for calling OpenAI with BricksAI using the official Python SDK:

from openai import OpenAI

client = OpenAI(
    api_key="your-bricks-secret-key",
    base_url="https://api.trybricks.ai/api/providers/openai/v1",
)

# Call OpenAI as normal...
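
For example, an ordinary chat completion request then goes through the BricksAI proxy. Here is a minimal continuation of the snippet above, where the model name and prompt are placeholders:

# Continuing from the client configured above.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Hello from BricksAI!"}],
)
print(response.choices[0].message.content)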

Done!

When you head back to your dashboard page, you should see metrics for the request you just made.