Replies: 3 comments
-
Extraction prompts: https://github.com/shcherbak-ai/contextgem/tree/main/contextgem/internal/prompts

You can override the default system message and prompt templates, but this is not recommended, as they are optimized for extraction. Any custom prompt must contain the same conditions, variables, and output structures as the default one; if it doesn't, functionality will be affected. To override a default prompt template, you can use the llm._update_default_prompt() method, and to override the default system message, you can pass your own when creating an LLM. If you need full control over your own prompts, consider a framework like LangChain, where custom prompts are required. ContextGem handles prompts under the hood, for better efficiency and ease of use.
-
Thank you for your answer. I would also like to ask: will using SaT for sentence segmentation affect extraction performance if the document I am processing is not in English?
-
SaT models have strong performance across 85 languages, so using SaT should not affect extraction performance for non-English texts.
-
May I ask what prompt template this framework uses, and whether I can use my own custom prompt template?