Prompt Snippet

Prompt Snippets are saved pieces of text that you can reuse within your prompts. They are useful for text that should appear in multiple prompts: edit the snippet once and every prompt that uses it is updated at the same time.
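
The sketch below is a minimal Python illustration of that reuse pattern, assuming a hypothetical snippet named tone_guidelines and a generic {placeholder} syntax; it is not Orq.ai's snippet reference format or SDK, just the mental model behind snippets.

```python
# Conceptual sketch only: the snippet name and {placeholder} syntax are
# hypothetical and do not represent Orq.ai's actual snippet syntax or SDK.

# One shared snippet, maintained in a single place.
snippets = {
    "tone_guidelines": "Answer politely and keep responses under 100 words.",
}

# Two prompt templates that both reference the same snippet.
prompt_templates = {
    "support_prompt": "You are a support assistant. {tone_guidelines}\nUser: {question}",
    "sales_prompt": "You are a sales assistant. {tone_guidelines}\nUser: {question}",
}

def render(template_key: str, question: str) -> str:
    """Resolve the snippet placeholder and the user input in a template."""
    return prompt_templates[template_key].format(question=question, **snippets)

# Editing snippets["tone_guidelines"] once changes every rendered prompt that uses it.
print(render("support_prompt", "How do I reset my password?"))
print(render("sales_prompt", "Which plans do you offer?"))
```

In Orq.ai, the same idea applies at the platform level: the snippet is managed as its own entity and referenced from your prompts, so a single edit propagates to every prompt that includes it.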

To get started, see:

  • Creating a Prompt Snippet
  • Using a Prompt Snippet in a Prompt
