Model,Accuracy
Qwen2-7B-Instruct,0.7727508202383008
Meta-Llama-3.1-8B-Instruct,0.5246934898981178
Gemma-2-9b-it-sg-ultrachat-sft,0.5615610429977551
Qwen2_5_32B_Instruct,0.8273182524607149
Qwen2_5_7B_Instruct,0.7486617164565705
Qwen2_5_1_5B_Instruct,0.5975651873596961
Qwen2-72B-Instruct,0.8293904334311863
MERALiON-LLaMA-3-8B-Chat,0.48877568640994645
Meta-Llama-3-8B-Instruct,0.4839405974788465
Meta-Llama-3.1-70B-Instruct,0.6814885166637886
Qwen2_5_3B_Instruct,0.6621481609393887
SeaLLMs-v3-7B-Chat,0.7684337765498187
Qwen2_5_72B_Instruct,0.8343982041098256
gemma-2-9b-it,0.5700224486271801
Meta-Llama-3-70B-Instruct,0.6494560524952513
Qwen2_5_14B_Instruct,0.7807805214988776
gemma2-9b-cpt-sea-lionv3-instruct,0.5796062856156105
gemma-2-2b-it,0.4412882058366431
llama3-8b-cpt-sea-lionv2-instruct,0.48929373165256435
Qwen2_5_0_5B_Instruct,0.42056639613192887
GPT4o_0513,0.7414954239336902