Documentation Index

Fetch the complete documentation index at: https://docs.orq.ai/llms.txt

Use this file to discover all available pages before exploring further.

Overview

Droid is the Factory.ai CLI. A single ~/.factory/config.json file lets sessions mix Claude, GPT, Gemini, and any other AI Router-supported model, all routed through Orq.ai for unified tracing, cost tracking, and access controls.

Prerequisites

  • Droid CLI installed
  • Active Factory.ai account (free tier, no paid subscription required)
  • Active Orq.ai account
  • Orq.ai API key

Install Droid CLI

curl -fsSL https://app.factory.ai/cli | sh
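After the installer finishes, a quick way to confirm the binary is reachable (assuming the installer places an executable named droid on your PATH):

```shell
# Check whether the droid executable is on PATH
if command -v droid >/dev/null 2>&1; then
  echo "droid found at $(command -v droid)"
else
  echo "droid not on PATH; re-run the installer or open a new shell"
fi
```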

Setup

1. Authenticate with Factory

droid login
A login is required, but the free tier fully supports BYOK (bring your own key), so no paid subscription is needed.
2. Clear conflicting environment variables

If ANTHROPIC_AUTH_TOKEN, ANTHROPIC_BASE_URL, or ANTHROPIC_API_KEY are set in your shell, they override config.json and requests bypass Orq.ai. Unset them before running Droid:
unset ANTHROPIC_AUTH_TOKEN ANTHROPIC_BASE_URL ANTHROPIC_API_KEY
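To see which of these variables are actually set in the current shell before unsetting them (bash syntax, using indirect expansion):

```shell
# List any conflicting Anthropic variables set in this shell (bash)
for v in ANTHROPIC_AUTH_TOKEN ANTHROPIC_BASE_URL ANTHROPIC_API_KEY; do
  if [ -n "${!v}" ]; then
    echo "$v is set and will override config.json"
  fi
done
```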
3. Edit ~/.factory/config.json

Add a custom_models array with one entry per model. The example below registers Claude, GPT-4o, and Gemini, all routed through Orq.ai:
{
  "custom_models": [
    {
      "model_display_name": "Claude Sonnet 4.5 via Orq",
      "model": "anthropic/claude-sonnet-4-5",
      "base_url": "https://api.orq.ai/v3/anthropic",
      "api_key": "<ORQ_API_KEY>",
      "provider": "anthropic",
      "max_tokens": 64000
    },
    {
      "model_display_name": "GPT-4o via Orq",
      "model": "openai/gpt-4o",
      "base_url": "https://api.orq.ai/v3/router",
      "api_key": "<ORQ_API_KEY>",
      "provider": "generic-chat-completion-api",
      "max_tokens": 16000
    },
    {
      "model_display_name": "Gemini 2.0 Flash via Orq",
      "model": "google/gemini-2.0-flash-001",
      "base_url": "https://api.orq.ai/v3/router",
      "api_key": "<ORQ_API_KEY>",
      "provider": "generic-chat-completion-api",
      "max_tokens": 8000
    }
  ]
}
Replace <ORQ_API_KEY> with your key from Workspace Settings → API Keys.
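Before launching Droid, a quick sanity check on the file: confirm it parses as JSON and catch a leftover placeholder. This sketch assumes python3 is available (only its stdlib json.tool is used):

```shell
CONFIG="$HOME/.factory/config.json"

# Fail loudly if the file is not valid JSON
if python3 -m json.tool "$CONFIG" >/dev/null 2>&1; then
  echo "config parses as valid JSON"
else
  echo "config missing or malformed: $CONFIG"
fi

# Warn if the placeholder key was never replaced
if grep -q '<ORQ_API_KEY>' "$CONFIG" 2>/dev/null; then
  echo "warning: <ORQ_API_KEY> placeholder still present"
fi
```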
4. Start Droid and select a model

droid
Inside the session, use /model to switch between registered custom models.

Configuration Reference

Provider field

  • "anthropic": Uses the Anthropic SDK, which appends /v1/messages to the base URL automatically. Point base_url to https://api.orq.ai/v3/anthropic.
  • "generic-chat-completion-api": Uses the OpenAI-compatible Chat Completions format. Point base_url to https://api.orq.ai/v3/router.
  • "openai": Hardcoded to api.openai.com and ignores base_url. Do not use this value for Orq-routed models.

Base URL by provider type

  • "anthropic": https://api.orq.ai/v3/anthropic
  • "generic-chat-completion-api": https://api.orq.ai/v3/router

Do not use "provider": "openai" for Orq.ai-routed models. It is hardcoded to api.openai.com and ignores base_url, so requests bypass Orq.ai entirely.
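For scripting config generation, the mapping above can be captured in a small helper function (an illustrative sketch, not part of Droid or Orq.ai):

```shell
# Return the correct Orq.ai base_url for a given Droid provider value
base_url_for() {
  case "$1" in
    anthropic)                   echo "https://api.orq.ai/v3/anthropic" ;;
    generic-chat-completion-api) echo "https://api.orq.ai/v3/router" ;;
    *) echo "no Orq.ai base_url for provider: $1" >&2; return 1 ;;
  esac
}

base_url_for anthropic                    # https://api.orq.ai/v3/anthropic
base_url_for generic-chat-completion-api  # https://api.orq.ai/v3/router
```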

Troubleshooting

  • The "provider": "openai" value is hardcoded to api.openai.com and ignores base_url. Change provider to "generic-chat-completion-api" for all Orq-routed OpenAI-compatible models.
  • Check for conflicting environment variables. If ANTHROPIC_AUTH_TOKEN, ANTHROPIC_BASE_URL, or ANTHROPIC_API_KEY are set, they take precedence over config.json. Run unset ANTHROPIC_AUTH_TOKEN ANTHROPIC_BASE_URL ANTHROPIC_API_KEY and restart Droid.
  • Run droid login to authenticate with Factory.ai before using any model. Without a valid session the CLI will not start.
  • Confirm the api_key value in config.json is a valid Orq.ai API key (not an OpenAI or Anthropic key) and that base_url points to the correct Orq endpoint for the chosen provider type.