update README.md
README.md
@@ -23,8 +23,6 @@ Qwen2.5-Coder is the latest series of Code-Specific Qwen large language models (
 - Significant improvements in **code generation**, **code reasoning** and **code fixing**. Building on the strong Qwen2.5, we scale the training tokens up to 5.5 trillion, including source code, text-code grounding, synthetic data, etc. Qwen2.5-Coder-32B has become the current state-of-the-art open-source code LLM, with coding abilities matching those of GPT-4o.
 - A more comprehensive foundation for real-world applications such as **Code Agents**. It not only enhances coding capabilities but also maintains strengths in mathematics and general competencies.
-- **Long-context Support** up to 128K tokens.
-
 
 **This repo contains the 3B Qwen2.5-Coder model**, which has the following features:
 - Type: Causal Language Models
@@ -62,10 +60,10 @@ If you find our work helpful, feel free to give us a cite.
 ```
 @article{hui2024qwen2,
-
-
-
-
+      title={Qwen2.5-Coder Technical Report},
+      author={Hui, Binyuan and Yang, Jian and Cui, Zeyu and Yang, Jiaxi and Liu, Dayiheng and Zhang, Lei and Liu, Tianyu and Zhang, Jiajun and Yu, Bowen and Dang, Kai and others},
+      journal={arXiv preprint arXiv:2409.12186},
+      year={2024}
 }
 @article{qwen2,
       title={Qwen2 Technical Report},