Update README.md
README.md
CHANGED
@@ -126,7 +126,7 @@ For deployment, we recommend using vLLM. You can enable the long-context capabil
 
 ## Evaluation
 
-We briefly compare Qwen2-72B-Instruct with
+We briefly compare Qwen2-72B-Instruct with similar-sized instruction-tuned LLMs, including our previous Qwen1.5-72B-Chat. The results are shown as follows:
 
 | Datasets | Llama-3-70B-Instruct | Qwen1.5-72B-Chat | **Qwen2-72B-Instruct** |
 | :--- | :---: | :---: | :---: |
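The hunk context above quotes the README's deployment advice ("For deployment, we recommend using vLLM"). As a minimal sketch, assuming a local vLLM OpenAI-compatible server is already running on port 8000 and the `openai` Python client is installed, a query against the served model could look like the following; the endpoint URL, port, and prompt are illustrative assumptions, not part of the commit.

```python
# Minimal sketch: querying Qwen2-72B-Instruct served through vLLM's
# OpenAI-compatible API server. Assumes the server was started with
# something like `vllm serve Qwen/Qwen2-72B-Instruct` (or the older
# `python -m vllm.entrypoints.openai.api_server --model ...`) and is
# listening on localhost:8000 -- both are assumptions for this example.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local vLLM endpoint
    api_key="EMPTY",                      # vLLM does not validate the key by default
)

response = client.chat.completions.create(
    model="Qwen/Qwen2-72B-Instruct",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Give me a short introduction to large language models."},
    ],
    temperature=0.7,
)
print(response.choices[0].message.content)
```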