Update README.md
README.md
CHANGED
@@ -124,7 +124,7 @@ For deployment, we recommend using vLLM. You can enable the long-context capabil
 
 ## Evaluation
 
-We briefly compare Qwen2-7B-Instruct with
+We briefly compare Qwen2-7B-Instruct with similar-sized instruction-tuned LLMs, including Qwen1.5-7B-Chat. The results are shown below:
 
 | Datasets | Llama-3-8B-Instruct | Yi-1.5-9B-Chat | GLM-4-9B-Chat | Qwen1.5-7B-Chat | Qwen2-7B-Instruct |
 | :--- | :---: | :---: | :---: | :---: | :---: |