dartpain committed
Commit 527e7f8 · 1 Parent(s): 3c10c11

Update README.md

Files changed (1): README.md +8 -6
README.md CHANGED
@@ -10,11 +10,13 @@ DocsGPT-7B is a decoder-style transformer that is fine-tuned specifically for pr
 
 ## Model Description
 
-Architecture: Decoder-style Transformer
-Language: English
-Training data: Fine-tuned on approximately 1000 high-quality examples of documentation answering workflows.
-Base model: Fine-tuned version of MPT-7B, which is pretrained from scratch on 1T tokens of English text and code.
-License: Apache 2.0
+Architecture: Decoder-style Transformer
+
+Training data: Fine-tuned on approximately 1000 high-quality examples of documentation answering workflows.
+
+Base model: Fine-tuned version of [MPT-7B](https://huggingface.co/mosaicml/mpt-7b), which is pretrained from scratch on 1T tokens of English text and code.
+
+License: Apache 2.0
 
 ## Features
 
@@ -30,7 +32,7 @@ This model is best used with the MosaicML [llm-foundry repository](https://githu
 ```python
 import transformers
 model = transformers.AutoModelForCausalLM.from_pretrained(
-  'mosaicml/mpt-7b',
+  'Arc53/DocsGPT-7B',
   trust_remote_code=True
 )
 ```
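
For context, a minimal usage sketch built on the updated snippet above. It assumes the Arc53/DocsGPT-7B repo ships a compatible tokenizer config (not part of this diff) and uses default greedy decoding; the prompt is purely illustrative.

```python
import transformers

# Load the fine-tuned model; trust_remote_code is required because
# MPT-based checkpoints use custom model code from the repo.
model = transformers.AutoModelForCausalLM.from_pretrained(
    'Arc53/DocsGPT-7B',
    trust_remote_code=True,
)

# Assumption: the repo includes a tokenizer config; if not, the
# MPT-7B base tokenizer would be the fallback.
tokenizer = transformers.AutoTokenizer.from_pretrained('Arc53/DocsGPT-7B')

prompt = "How do I install the package from source?"  # illustrative only
inputs = tokenizer(prompt, return_tensors='pt')

# Keep generation short for a quick smoke test; raise max_new_tokens
# for longer documentation-style answers.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```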