Description
As recently as a few days ago, Command-R (and presumably Command-R+) could be converted with convert-hf-to-gguf.py. I double-checked, and conversion completes successfully in b2751. However, the recent changes to accommodate Llama 3 have broken Command-R compatibility. Trying to convert today with b2777, I get:
```
raise NotImplementedError("BPE pre-tokenizer was not recognized - update get_vocab_base_pre()")
NotImplementedError: BPE pre-tokenizer was not recognized - update get_vocab_base_pre()
```
I know that Llama 3 required a new tokenizer provided by Meta to facilitate proper conversion. Do we need something new from Cohere, or is this something that can be fixed internally?
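For context on why the error appears, here is a minimal sketch of the detection scheme the error message suggests: the converter encodes a fixed check string with the model's Hugging Face tokenizer, hashes the token IDs, and looks the hash up in a table of known pre-tokenizers. This is an illustrative assumption about the mechanism, not the actual convert-hf-to-gguf.py code; the hash value, table, and `detect_pre_tokenizer` helper below are all hypothetical.

```python
import hashlib

# Hypothetical table mapping a hash of the tokenizer's output on a
# check string to a known pre-tokenizer name. In this scheme, a model
# whose tokenizer produces an unlisted hash (e.g. Command-R after the
# Llama 3 changes) would fail detection until an entry is added.
KNOWN_PRE_TOKENIZERS = {
    # "<sha256 of encoded check string>": "llama-bpe",
}

def detect_pre_tokenizer(tokenizer, chktxt: str) -> str:
    """Encode chktxt, hash the resulting token IDs, and look up the
    pre-tokenizer name; raise if the hash is not recognized."""
    chktok = tokenizer.encode(chktxt)
    chkhsh = hashlib.sha256(str(chktok).encode()).hexdigest()
    if chkhsh not in KNOWN_PRE_TOKENIZERS:
        raise NotImplementedError(
            "BPE pre-tokenizer was not recognized - update get_vocab_base_pre()"
        )
    return KNOWN_PRE_TOKENIZERS[chkhsh]
```

Under this scheme, a fix could be internal: converting the model once, capturing the new hash, and registering it in the table, provided the underlying pre-tokenizer behavior is already supported.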