Python and Node SDK improvements
In our latest SDK update, we're thrilled to share a series of enhancements that significantly boost the performance and capabilities of our platform.
Support for messages in the invoke and stream methods
Now you can provide a new messages property in the invoke and invoke with stream methods of both SDKs to combine your own messages with the prompt configured in Orquesta.
// `client` is an initialized Orquesta SDK client (setup omitted)
const deployment = await client.deployments.invoke({
key: 'customer_service',
messages: [
{
role: 'user',
message:
'A customer is asking about the latest software update features. Generate a detailed and informative response highlighting the key new features and improvements in the latest update.',
},
],
context: { environments: 'production', country: 'NLD' },
inputs: { firstname: 'John', city: 'New York' },
metadata: { customer_id: 'Qwtqwty90281' },
});
console.log(deployment?.choices[0].message.content);
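The same messages property can also be passed to the streaming variant. Below is a minimal sketch, assuming the Node SDK exposes the streaming call as invokeWithStream and returns an async iterable of partial results; the exact method name and chunk shape may differ in your SDK version.

const stream = client.deployments.invokeWithStream({
  key: 'customer_service',
  messages: [
    {
      role: 'user',
      message:
        'Summarize the key new features in the latest software update for a customer.',
    },
  ],
  context: { environments: 'production', country: 'NLD' },
  inputs: { firstname: 'John', city: 'New York' },
});

for await (const chunk of stream) {
  // Each chunk carries the completion generated so far (assumed shape)
  console.log(chunk?.choices?.[0]?.message?.content);
}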
The introduction of the new property significantly enhances various aspects of interaction:
- Enhanced Contextual Clarity: The use of chat history empowers the model to preserve the context throughout a conversation. This feature ensures that each response is not only coherent but also directly relevant, as the model can draw upon past dialogues to fully grasp the nuances of the current inquiry or discussion topic.
- Streamlined Conversation Flow: Chat history is instrumental in maintaining a consistent and logical flow in conversations. This prevents the occurrence of repetitive or conflicting responses, mirroring the natural progression of human dialogues and maintaining conversational integrity.
- Tailored User Interactions: With access to previous interactions, chat history allows the model to customize its responses according to individual user preferences and historical queries. This level of personalization significantly boosts user engagement and satisfaction, leading to more effective and enjoyable communication experiences.
Support for messages and choices when adding metrics
In Orquesta, after every request, you can add custom metrics. On top of the set of metrics we already support, we have added two new properties: messages and choices. The messages property logs the communication exchange between the user and the model, capturing the entire conversation history for context and analysis. The choices property records the different response options that the model considers or generates before presenting the final output, providing insight into the model's decision-making process.
// `deployment` is the object returned by a previous `invoke` call
deployment.addMetrics({
messages: [
{
role: 'user',
message:
'A customer is asking about the latest software update features. Generate a detailed and informative response highlighting the key new features and improvements in the latest update.',
},
],
choices: [
{
index: 0,
finish_reason: 'stop',
message: {
role: 'assistant',
content:
"Dear customer: Thank you for your interest in our latest software update! We're excited to share with you the new features and improvements we've rolled out. Here's what you can look forward to in this update",
},
},
],
});
Support for OpenAI system fingerprint
To help track OpenAI environment or model changes, OpenAI now exposes a system_fingerprint parameter. If this value changes between requests, outputs may differ for identical inputs because the underlying OpenAI environment has changed.
In the new version of the SDK, if you are using an OpenAI model with the invoke method, the system_fingerprint is exposed in the deployment properties.
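For example, after invoking a deployment that resolves to an OpenAI model, the fingerprint can be read from the returned object. A minimal sketch, assuming the field is surfaced as system_fingerprint on the deployment returned by invoke:

const deployment = await client.deployments.invoke({
  key: 'customer_service',
  context: { environments: 'production', country: 'NLD' },
});

// Log the fingerprint so output changes can be correlated with OpenAI backend changes
// (assumes the property is exposed as `system_fingerprint`)
console.log(deployment?.system_fingerprint);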