added
Llama 3 on Perplexity
7 months ago by Cormick Marskamp
Start using the best open-source model, hosted on Perplexity, in Orq.
You could already use the Llama 3 models hosted on Anyscale and Groq. However, being able to use Llama 3 on Perplexity opens up new possibilities and use cases.
Because the model can access the internet, it can:
- Generate up-to-date responses
- Retrieve dynamic data, such as the latest news
- Understand real-world context better
The example below shows that only Llama 3 on Perplexity can return the current temperature in Amsterdam.
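For reference, here is a minimal sketch of what such a request looks like when calling the online Llama 3 model through Perplexity's OpenAI-compatible API directly (inside Orq you would simply select the model in your deployment instead). The model id and environment variable name are assumptions based on Perplexity's naming at the time, so check their docs for the current values.

```python
# Minimal sketch (not Orq's SDK): querying the online Llama 3 model via
# Perplexity's OpenAI-compatible endpoint. Model id and env var are assumptions.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["PERPLEXITY_API_KEY"],      # assumed env var name
    base_url="https://api.perplexity.ai",          # Perplexity's OpenAI-compatible API
)

response = client.chat.completions.create(
    model="llama-3-sonar-large-32k-online",        # assumed online Llama 3 model id
    messages=[
        {"role": "user", "content": "What is the current temperature in Amsterdam?"},
    ],
)

print(response.choices[0].message.content)
```

Because the model is online, it can fetch the live temperature rather than answering from stale training data, which is exactly what the Amsterdam example illustrates.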