Create added_tokens.json
#5
by 8andage · opened
Copied from alpaca-13B and alpaca-native; this resolves the following error when converting the model to ggml:
Exception: Vocab size mismatch (model has 32001, but models/chavinlo_gpt4-x-alpaca/tokenizer.model has 32000)
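For reference, a minimal sketch of what this file likely contains, assuming it mirrors the `added_tokens.json` shipped with chavinlo/alpaca-native (the `[PAD]` token and its id are assumptions based on that repo): a single entry mapping the extra token to id 32000, which brings the tokenizer's effective vocabulary to 32001 and matches the model.

```python
import json

# Assumed content, copied from alpaca-native's added_tokens.json:
# the model was fine-tuned with one extra [PAD] token appended after
# the base LLaMA vocabulary of 32000, giving 32001 entries in total.
added_tokens = {"[PAD]": 32000}

# Write the file next to tokenizer.model so the ggml converter
# can pick it up (path shown matches the error message above).
with open("models/chavinlo_gpt4-x-alpaca/added_tokens.json", "w") as f:
    json.dump(added_tokens, f)
```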
chavinlo changed pull request status to merged