Models #16
wiseman-timelord started this conversation in General
The model to use is Vicuna, as it has the best objective information score, slightly better even than GPT4-X-Alpaca. Vicuna is now at v1.1, and I can verify that "ggml-vicuna-13b-1.1-q4_1.bin" from "https://huggingface.co/eachadea/ggml-vicuna-13b-1.1" is compatible.
Some versions of GPT4-X-Alpaca also work; specifically, I know that "gpt4-x-alpaca-13b-native-ggml-model-q4_0" from "https://huggingface.co/Bradarr/gpt4-x-alpaca-13b-native-ggml-model-q4_0" works, but I didn't spend long testing it, for the reasons stated above.
From a short review, the v1.1 Vicuna seems better than v1.0 for AutoLLM.
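For anyone wanting to try the recommended model outside the app first, here is a minimal sketch using llama-cpp-python to load the GGML file and query it with a Vicuna v1.1-style prompt. This is an assumption on my part — AutoLLM may wrap llama.cpp differently internally — and the exact system prompt wording is illustrative, not taken from AutoLLM itself:

```python
# Sketch (assumption): driving ggml-vicuna-13b-1.1-q4_1.bin directly via
# llama-cpp-python, which wraps llama.cpp and can load GGML-format models.

def vicuna_v11_prompt(user_message: str) -> str:
    """Build a single-turn prompt in the Vicuna v1.1 style
    (USER:/ASSISTANT: turns after a system preamble)."""
    system = (
        "A chat between a curious user and an artificial intelligence "
        "assistant. The assistant gives helpful, detailed, and polite "
        "answers to the user's questions."
    )
    return f"{system} USER: {user_message} ASSISTANT:"


if __name__ == "__main__":
    # Requires: pip install llama-cpp-python, plus the model file from
    # https://huggingface.co/eachadea/ggml-vicuna-13b-1.1
    from llama_cpp import Llama

    llm = Llama(model_path="ggml-vicuna-13b-1.1-q4_1.bin")
    out = llm(vicuna_v11_prompt("What is 4-bit quantization?"), max_tokens=128)
    print(out["choices"][0]["text"])
```

Note that newer llama.cpp builds have since moved to the GGUF format, so a version of llama-cpp-python contemporary with these GGML files may be needed.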