Langchain

Prerequisite

To start using Langchain with Orq.ai, you need an API key from your Orq.ai account.

📘

To set up your API key, see API keys & Endpoints.
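The example further down reads the key from an environment variable named ORQ_API_KEY. As a minimal sketch, assuming you want to set that variable from Python rather than in your shell:

import os

# Assumes the Orq API key is exposed as ORQ_API_KEY (the name used in the
# proxy example below). Replace the placeholder with your actual key, or
# export it in your shell instead: export ORQ_API_KEY="your-orq-api-key"
os.environ["ORQ_API_KEY"] = "your-orq-api-key"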


Using Orq.ai as Proxy

When using the Langchain SDK, set the base URL to the Orq.ai Proxy to route your calls through our API without changing any other part of your code.

With the Orq.ai Proxy, you get platform traces and cost and usage monitoring, while keeping full compatibility with the Langchain SDK and a unified API across all models.

base_url: https://api.orq.ai/v2/proxy

api_key: your Orq API key

import os

from langchain.chat_models import init_chat_model

# Route requests through the Orq.ai Proxy by overriding the base URL
# and authenticating with your Orq API key.
model = init_chat_model(
    model="openai/gpt-4o-mini",
    model_provider="openai",
    api_key=os.getenv("ORQ_API_KEY"),
    base_url="https://api.orq.ai/v2/proxy",
)

print(model.invoke("You are a helpful assistant"))
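The proxied model behaves like any other Langchain chat model. Below is a sketch, assuming the same proxy configuration as above, that passes explicit system and human messages instead of a single string; the question text is only an illustration.

import os

from langchain.chat_models import init_chat_model
from langchain_core.messages import HumanMessage, SystemMessage

# Same proxy configuration as in the example above.
model = init_chat_model(
    model="openai/gpt-4o-mini",
    model_provider="openai",
    api_key=os.getenv("ORQ_API_KEY"),
    base_url="https://api.orq.ai/v2/proxy",
)

# invoke() also accepts a list of messages, so system and user turns
# can be separated explicitly.
response = model.invoke(
    [
        SystemMessage(content="You are a helpful assistant"),
        HumanMessage(content="What does the Orq.ai Proxy do?"),
    ]
)
print(response.content)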