dartpain committed
Commit 8ff621d · 1 Parent(s): 527e7f8

Update README.md

Files changed (1): README.md +1 -3
README.md CHANGED
````diff
@@ -6,7 +6,7 @@ tags:
 - docsgpt
 ---
 
-DocsGPT-7B is a decoder-style transformer that is fine-tuned specifically for providing answers based on documentation given in context. It is an extension of the MosaicPretrainedTransformer (MPT) family, being fine-tuned from the MPT-7B model developed by MosaicML. The model inherits the powerful language understanding capabilities of MPT-7B and has been specialized for the purpose of documentation-oriented question answering.
+DocsGPT-7B is a decoder-style transformer that is fine-tuned specifically for providing answers based on documentation given in context. It is built on top of the MosaicPretrainedTransformer (MPT), being fine-tuned from the MPT-7B model developed by MosaicML. The model inherits the powerful language understanding capabilities of MPT-7B and has been specialized for the purpose of documentation-oriented question answering.
 
 ## Model Description
 
@@ -27,8 +27,6 @@ License: Apache 2.0
 
 ## How to Use
 
-This model is best used with the MosaicML [llm-foundry repository](https://github.com/mosaicml/llm-foundry) for training and finetuning.
-
 ```python
 import transformers
 model = transformers.AutoModelForCausalLM.from_pretrained(
````
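The snippet in the diff is cut off at the hunk boundary, so here is a minimal, self-contained sketch of loading and querying the model with the Hugging Face `transformers` library. The repository id `Arc53/docsgpt-7b` and the prompt format are assumptions for illustration, not taken from the diff; `trust_remote_code=True` is needed because MPT-based checkpoints ship their modeling code inside the repository.

```python
import transformers

# Assumed model id -- substitute the actual Hub repository for DocsGPT-7B.
MODEL_ID = "Arc53/docsgpt-7b"

# MPT-based checkpoints define their architecture in the repo itself,
# so loading them requires trust_remote_code=True.
model = transformers.AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    trust_remote_code=True,
    torch_dtype="auto",
)
tokenizer = transformers.AutoTokenizer.from_pretrained(MODEL_ID)

# Illustrative prompt: documentation supplied in context, then a question.
prompt = (
    "Documentation:\nTo install the tool, run `pip install docsgpt`.\n\n"
    "Question: How do I install it?\nAnswer:"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```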