What Would You Like to See with the Gateway?
Open-source models like Llama and Mistral expect instruction and completion inputs in a chat-template format for optimal text completions. For example:

```
<|begin_of_text|>\\n<|start_header_id|>user<|end_header_id|>\\nCountry: United States\\nCapital: <|eot_id|><|start_header_id|>assistant<|end_header_id|>\\nWashington DC<|eot_id|><|start_header_id|>user<|end_header_id|>\\nWhat is up my good friend?<|eot_id|><|start_header_id|>assistant<|end_header_id|>\\n
```

See the following for reference:
https://www.llama.com/docs/model-cards-and-prompt-formats/llama3_1
https://medium.com/@marketing_novita.ai/how-to-use-mistral-chat-template-e0b2a973f031
Portkey's current implementation does not apply these templates; instead, it simply prepends role labels:
```ts
transform: (params: Params) => {
  let prompt: string = '';
  if (!!params.messages) {
    let messages: Message[] = params.messages;
    messages.forEach((msg, index) => {
      if (index === 0 && msg.role === 'system') {
        prompt += `system: ${msg.content}\n`;
      } else if (msg.role == 'user') {
        prompt += `user: ${msg.content}\n`;
      } else if (msg.role == 'assistant') {
        prompt += `assistant: ${msg.content}\n`;
      } else {
        prompt += `${msg.role}: ${msg.content}\n`;
      }
    });
    prompt += 'Assistant:';
  }
  return prompt;
},
```
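A transform that emits the Llama 3.1 special tokens instead could look roughly like the sketch below. This is only an illustration of the requested behavior: the `Message`/`Params` interfaces and the `llama3Transform` name are simplified stand-ins, not Portkey's actual types or API.

```typescript
// Minimal sketch of a Llama 3.1 chat-template transform.
// Message/Params here are simplified stand-ins for Portkey's real types.
interface Message {
  role: string;
  content: string;
}

interface Params {
  messages?: Message[];
}

// Per the Llama 3.1 prompt format, each turn is wrapped in header
// tokens and terminated with <|eot_id|>.
const header = (role: string): string =>
  `<|start_header_id|>${role}<|end_header_id|>\n`;

const llama3Transform = (params: Params): string => {
  let prompt = '<|begin_of_text|>';
  for (const msg of params.messages ?? []) {
    // One turn: role header, content, end-of-turn token.
    prompt += `${header(msg.role)}${msg.content}<|eot_id|>`;
  }
  // Open an assistant header so the model generates the next reply.
  prompt += header('assistant');
  return prompt;
};
```

Mistral uses different delimiters (`[INST] … [/INST]`), so in practice the gateway would need to pick the template per model family rather than hard-code one format.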