@krecicki can you link the Hugging Face page for the model you're using? I think there may actually be a bug in the zephyr chat format.
I have gone through https://github.com/abetlen/llama-cpp-python/blob/main/llama_cpp/llama_chat_format.py#L399
For tiny-llama-1.1b-chat-v1.0, the openbuddy format works well. However, in this video https://www.youtube.com/watch?v=UIuUj3yb640 he uses the format `<|system|>\nYou are a helpful assistant.\n<|user|>\nwrite a text about upx.\n<|assistant|>` and it performs far better.
How can I make a new format like this work for tiny-llama? I took openbuddy and adapted it into a version for tiny-llama, but after updating the llama_chat_format.py file I get an "Invalid chat handler" error. I can't figure out what to do here to get it to use my custom code.
So far I have this update; am I missing anything here?
Strangely enough, OpenBuddy works better than any of the others. I even made this custom one, and it doesn't work as well. What is going on here?
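For reference, here is a minimal sketch of how a zephyr-style TinyLlama prompt could be built. The tag layout (`<|system|>` / `<|user|>` / `<|assistant|>`) is taken from the format quoted above; the stop/EOS handling and the `register_chat_format` / `ChatFormatterResponse` names are assumptions based on the linked llama_chat_format.py, not a confirmed fix:

```python
from typing import Dict, List


def format_tinyllama_zephyr(messages: List[Dict[str, str]]) -> str:
    """Build a zephyr-style prompt for TinyLlama chat models.

    Sketch only: follows the tag layout shown in the video; exact
    EOS/stop-token handling is an assumption.
    """
    prompt = ""
    for msg in messages:
        role = msg["role"]
        if role in ("system", "user", "assistant"):
            prompt += f"<|{role}|>\n{msg['content']}\n"
    # Leave a trailing assistant tag open so the model generates from here.
    prompt += "<|assistant|>"
    return prompt


# To plug this into llama-cpp-python, it would presumably be wrapped with
# the decorator from llama_chat_format.py (names assumed from the linked
# file -- verify against your installed version):
#
#   from llama_cpp.llama_chat_format import (
#       register_chat_format, ChatFormatterResponse,
#   )
#
#   @register_chat_format("tinyllama")
#   def format_tinyllama(messages, **kwargs):
#       return ChatFormatterResponse(prompt=format_tinyllama_zephyr(messages))
#
# and then the model loaded with Llama(..., chat_format="tinyllama").
```

Registering through the decorator (rather than editing llama_chat_format.py in place) may also sidestep the "Invalid chat handler" error, since the handler lookup only sees formats that were registered under a name.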