Crashed?

#1
by Dietmar2020 - opened

INFO: ExllamaV2 version: 0.2.3
WARNING: Disabling authentication makes your instance vulnerable. Set the disable_auth flag to False in config.yml if you want to share this instance with others.
INFO: Generation logging is disabled
WARNING: Draft model is disabled because a model name wasn't provided. Please check your config.yml!
WARNING: The given cache size (12000) is not a multiple of 256.
WARNING: Overriding cache_size with an overestimated value of 12032 tokens.
WARNING: The given cache_size (12032) is less than 2 * max_seq_len and may be too small for requests using CFG.
WARNING: Ignore this warning if you do not plan on using CFG.
INFO: Attempting to load a prompt template if present.
INFO: Using template "chatml_with_headers" for chat completions.
INFO: Loading model: /root/tabbyAPI/models/Qwen_2_5_32b_7Byte_Instruct
INFO: Loading with autosplit
Loading model modules ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 0% 0/131 -:--:--
./start.sh: line 24: 1098918 Segmentation fault (core dumped) python3 start.py "$@"
(base) root@PIRXSOFT2023:~/tabbyAPI#
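For context on the cache_size warnings above: the override from 12000 to 12032 is just a round-up to the next multiple of 256, and the "less than 2 * max_seq_len" warning implies max_seq_len is configured higher than 6016 tokens. The snippet below is only an illustrative sketch of that rounding arithmetic, not tabbyAPI's actual code.

def round_up_cache_size(cache_size: int, granularity: int = 256) -> int:
    """Round cache_size up to the nearest multiple of `granularity` (ceiling division)."""
    return -(-cache_size // granularity) * granularity

if __name__ == "__main__":
    requested = 12000
    effective = round_up_cache_size(requested)
    # Prints: requested=12000, effective=12032 (matches the values in the log)
    print(f"requested={requested}, effective={effective}")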
