Hyperlinking
With the new Hyperlinking feature, you can take your use case from one module to another. Switching between the Playground, Experiments, and Deployments lets you iterate quickly across the whole platform. Whether you want to take your Playground setup to Experiments or your Deployment back to the Playground, it's all possible.
Use Groq as your LLM provider
You can now use Groq as your LLM provider. It currently hosts the following models: llama2-70b-chat, mixtral-8x7b-32768, and gemma-7b-it.
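For reference, here is a minimal sketch of calling one of these Groq-hosted models directly through the Groq Python SDK; it assumes you have a Groq API key set in your environment (inside orq.ai you simply select the model in the model garden).

```python
# Minimal sketch: querying mixtral-8x7b-32768 via the Groq Python SDK.
# Assumes `pip install groq` and GROQ_API_KEY set in the environment.
from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

completion = client.chat.completions.create(
    model="mixtral-8x7b-32768",
    messages=[{"role": "user", "content": "Explain retrieval-augmented generation in one sentence."}],
)
print(completion.choices[0].message.content)
```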
Make your own custom LLM Evaluator
You can now create your own custom LLM evaluator. This allows you to go beyond the standard evaluators like BLEU and Valid JSON.
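As a rough illustration of the idea (not orq.ai's actual evaluator API), an LLM-based evaluator is essentially a judge prompt wrapped in a scoring function. The sketch below assumes the OpenAI Python SDK and a hypothetical 0–10 grading scale.

```python
# Illustrative sketch of an LLM-as-judge evaluator (hypothetical helper, not
# the orq.ai evaluator API). Assumes `pip install openai` and OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def llm_evaluator(candidate: str, reference: str) -> float:
    """Ask a judge model to score a candidate answer against a reference (0-10)."""
    prompt = (
        "Score the candidate answer against the reference on a scale of 0 to 10. "
        "Reply with the number only.\n\n"
        f"Reference: {reference}\n"
        f"Candidate: {candidate}"
    )
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return float(response.choices[0].message.content.strip())
```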
Use LLM as a Reference in Experiments
With this new feature, you can use the output of a large language model like GPT-4 as the reference for another model like Gemma-7b or Mistral Large (see image).
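Conceptually, this means an evaluator scores the smaller model's output against a GPT-4 completion instead of a hand-written ground truth. Below is a minimal sketch of that idea using a BLEU score; the example strings are made up, and in orq.ai the pairing is configured in the Experiments UI rather than in code.

```python
# Conceptual sketch: score a Gemma-7b output against a GPT-4 completion used
# as the reference. Example strings are illustrative; assumes `pip install nltk`.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

gpt4_reference = "The Eiffel Tower is an iron lattice tower located in Paris, France."
gemma_output = "The Eiffel Tower is an iron tower that stands in Paris, France."

score = sentence_bleu(
    [gpt4_reference.split()],   # reference(s), tokenized
    gemma_output.split(),       # candidate, tokenized
    smoothing_function=SmoothingFunction().method1,
)
print(f"BLEU against the GPT-4 reference: {score:.2f}")
```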
New Providers & Models: Mistral Large, Perplexity, Gemma 7B, and more
Check out the newly added models on orq.ai. You can find them in the model garden.
Import from Resources in Experiments
Instead of manually adding your data sets or uploading them through a CSV file in Experiments, you can now import them from the files you stored in Resources. This lets you store and access your files quickly and efficiently.
Resource management
Save your data sets, variables, and evaluators in the newly added Resources tab.
Evaluators are available on our platform
We have added Evaluators to our platform. With a wide range of industry-standard metrics and other relevant evaluators, you can check whether or not the output of your LLM is accurate, reliable, and contextually relevant.
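To make the idea concrete, one of the simplest standard evaluators, Valid JSON, boils down to a check like the following sketch (a generic illustration, not the platform's internal implementation):

```python
import json

def valid_json_evaluator(output: str) -> bool:
    """Return True if the model output parses as valid JSON."""
    try:
        json.loads(output)
        return True
    except json.JSONDecodeError:
        return False

# Example: a well-formed JSON response passes, a truncated one fails.
print(valid_json_evaluator('{"answer": 42}'))   # True
print(valid_json_evaluator('{"answer": 42'))    # False
```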
Python and Node SDK improvements
Our latest SDK update brings a series of enhancements that boost the performance and capabilities of our platform.
Set up your own API key with the new AnyScale integration
You could already select models such as Llama from Meta and Mixtral from Mistral in the model garden. With this release, you can now connect your own API key for AnyScale, so you can use your own account and rate limits without relying on a shared key. Soon you'll also be able to use your own private models and fine-tuning on AnyScale.