add attribution
README.md CHANGED

````diff
@@ -14,6 +14,12 @@ datasets:
 inference: false
 ---
 
+### Attribution
+
+This model is derived from [MosaicML's MPT-7B model](https://huggingface.co/mosaicml/mpt-7b/tree/main), with changes from
+[cekal/mpt-7b-peft-compatible](https://huggingface.co/cekal/mpt-7b-peft-compatible) applied; each licensed under the
+Apache License, version 2.0.
+
 # MPT-7B
 
 MPT-7B is a decoder-style transformer pretrained from scratch on 1T tokens of English text and code.
@@ -225,4 +231,4 @@ Please cite this model using the following format:
 note = {Accessed: 2023-03-28}, % change this date
 urldate = {2023-03-28} % change this date
 }
-```
+```
````