arielnlee committed
Commit 652558a
1 parent: 583e121

Update README.md

Files changed (1)
  1. README.md +9 -2
README.md CHANGED
@@ -45,7 +45,7 @@ We use state-of-the-art [Language Model Evaluation Harness](https://github.com/E
 
 `garage-bAInd/Platypus2-70B` trained using STEM and logic based dataset [`garage-bAInd/Open-Platypus`](https://huggingface.co/datasets/garage-bAInd/Open-Platypus).
 
-Please see our [paper](https://platypus-llm.github.io/Platypus.pdf) and [project webpage](https://platypus-llm.github.io) for additional information.
+Please see our [paper](https://arxiv.org/abs/2308.07317) and [project webpage](https://platypus-llm.github.io) for additional information.
 
 ### Training Procedure
 
@@ -92,7 +92,14 @@ Llama 2 and fine-tuned variants are a new technology that carries risks with use
 Please see the Responsible Use Guide available at https://ai.meta.com/llama/responsible-use-guide/
 
 ### Citations
-
+```bibtex
+@article{platypus2023,
+  title={Platypus: Quick, Cheap, and Powerful Refinement of LLMs},
+  author={Ariel N. Lee and Cole J. Hunter and Nataniel Ruiz},
+  booktitle={arXiv preprint arxiv:2308.07317},
+  year={2023}
+}
+```
 ```bibtex
 @misc{touvron2023llama,
   title={Llama 2: Open Foundation and Fine-Tuned Chat Models},