
Rodex AI Documentation

OpenAI-Compatible API with Engineering Excellence Built-In

Authentication
Simple token-based authentication

All API requests require the following authorization header:

Authorization: Bearer Rodex

When using the OpenAI SDK, set api_key="Rodex"
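
For instance, a raw HTTP call needs nothing beyond that header. The following is a minimal sketch using Python's requests library; the endpoint URL and the "rodex" model ID are the ones documented further down:

import requests

# The bearer token "Rodex" is the only credential required
headers = {
    "Authorization": "Bearer Rodex",
    "Content-Type": "application/json",
}

payload = {
    "model": "rodex",
    "messages": [{"role": "user", "content": "Say hello"}],
}

resp = requests.post(
    "https://api-rodex-cli.vercel.app/api/v1/chat/completions",
    headers=headers,
    json=payload,
)
print(resp.json())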

Rodex Features
Built for developers who demand excellence

Coding-Focused AI

Specialized in software engineering, best practices, and clean code

Smart Model Selection

Use model: "rodex" to auto-select the fastest available model (see the sketch after this feature list)

Custom Instructions

Add the custom_instructions parameter to personalize responses
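
Both features can be combined in a single call. The sketch below assumes the Python OpenAI SDK and passes custom_instructions through the SDK's extra_body option so that the Rodex-specific field lands in the JSON request body alongside the standard parameters:

import openai

client = openai.OpenAI(
    api_key="Rodex",
    base_url="https://api-rodex-cli.vercel.app/api/v1"
)

response = client.chat.completions.create(
    model="rodex",  # auto-selects the fastest available model
    messages=[
        {"role": "user", "content": "Refactor this function for readability"}
    ],
    # Rodex-specific field, forwarded verbatim in the request body
    extra_body={"custom_instructions": "Prefer small, well-named functions"},
)
print(response.choices[0].message.content)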

API Endpoints
OpenAI-compatible endpoints with Rodex enhancements

Chat Completions

POST https://api-rodex-cli.vercel.app/api/v1/chat/completions

List Models

GET https://api-rodex-cli.vercel.app/api/v1/models
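
To check which model IDs are currently available, the models endpoint can be listed with the same OpenAI SDK client used in the examples below. A minimal sketch; the printed IDs should match the groups in the next section:

import openai

client = openai.OpenAI(
    api_key="Rodex",
    base_url="https://api-rodex-cli.vercel.app/api/v1"
)

# Calls GET /api/v1/models and prints each available model ID
for model in client.models.list():
    print(model.id)
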
Supported Models
All model IDs carry the "rodex-" prefix; the bare "rodex" ID is the auto-select alias

Auto-Select (Fastest)

rodex

Automatically picks the fastest responding model

Groq (Ultra Fast)

rodex-llama-3.3-70b-versatile
rodex-gemma2-9b-it
rodex-mixtral-8x7b-32768

XAI Grok

rodex-grok-beta
rodex-grok-vision-beta

Google Gemini

rodex-gemini-2.0-flash-exp
rodex-gemini-1.5-pro
rodex-gemini-1.5-flash

OpenRouter

rodex-anthropic/claude-3.5-sonnet
rodex-openai/gpt-4-turbo

Example Usage
Using Rodex with various clients and languages

Basic cURL Request

curl https://api-rodex-cli.vercel.app/api/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer Rodex" \
  -d '{
    "model": "rodex-gemini-2.0-flash-exp",
    "messages": [
      {"role": "user", "content": "Write a Python function to sort a list"}
    ]
  }'

With Custom Instructions

curl https://api-rodex-cli.vercel.app/api/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer Rodex" \
  -d '{
    "model": "rodex",
    "messages": [
      {"role": "user", "content": "Build a REST API"}
    ],
    "custom_instructions": "Use TypeScript and focus on type safety"
  }'

Python (OpenAI SDK)

import openai

client = openai.OpenAI(
    api_key="Rodex",
    base_url="https://api-rodex-cli.vercel.app/api/v1"
)

response = client.chat.completions.create(
    model="rodex-llama-3.3-70b-versatile",
    messages=[
        {"role": "user", "content": "Explain async/await in JavaScript"}
    ]
)
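
# The response follows the standard OpenAI schema
print(response.choices[0].message.content)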

Node.js (OpenAI SDK)

import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: 'Rodex',
  baseURL: 'https://api-rodex-cli.vercel.app/api/v1'
});

const response = await client.chat.completions.create({
  model: 'rodex-grok-beta',
  messages: [
    { role: 'user', content: 'Write a React component' }
  ],
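  // custom_instructions is a Rodex-specific field; TypeScript may flag it as an
  // unknown parameter, but it is still sent in the request body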
  custom_instructions: 'Use TypeScript and hooks'
});
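
// The response follows the standard OpenAI schema
console.log(response.choices[0].message.content);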