doberst committed
Commit e648951 · 1 Parent(s): e07e1a8

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED

```diff
@@ -86,6 +86,7 @@ The fastest way to get started with BLING is through direct import in transformers
 tokenizer = AutoTokenizer.from_pretrained("llmware/bling-1b-0.1")
 model = AutoModelForCausalLM.from_pretrained("llmware/bling-1b-0.1")
 
+Please refer to the two tester .py files in the Files repository, which include 200 samples and a script to test the model.
 
 The BLING model was fine-tuned with a simple "\<human> and \<bot> wrapper", so to get the best results, wrap inference entries as:
 
@@ -122,7 +123,6 @@ If you are using a HuggingFace generation script:
 
 output_only = tokenizer.decode(outputs[0][start_of_output:], skip_special_tokens=True)
 
-Please also refer to two sample test scripts in the files repository for full examples.
 
 ## Citation [optional]
```
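The two steps the README context refers to — wrapping an inference entry in the `<human>`/`<bot>` markers and slicing the generated ids at `start_of_output` before decoding — can be sketched without loading the model. This is a minimal mock: the exact wrapper punctuation is an assumption (the README only says the model was fine-tuned with a "\<human> and \<bot> wrapper"), and `wrap_prompt`/`trim_to_output` are hypothetical helper names, not part of the model card or the transformers API.

```python
# Minimal sketch of the prompt wrapping and output slicing described in the
# README diff above. No model is loaded; the wrapper format is an ASSUMPTION.

def wrap_prompt(text: str) -> str:
    """Wrap an inference entry in <human>/<bot> markers (assumed format)."""
    return f"<human>: {text}\n<bot>:"

def trim_to_output(token_ids: list[int], start_of_output: int) -> list[int]:
    """Keep only newly generated tokens, mirroring the README's
    outputs[0][start_of_output:] slice in a HuggingFace generation script,
    so the echoed prompt is excluded before tokenizer.decode()."""
    return token_ids[start_of_output:]

prompt = wrap_prompt("What is the total amount of the invoice?")
print(prompt)

# With a real model, start_of_output would be the length of the input ids;
# here the first 3 ids stand in for the echoed prompt.
fake_output_ids = [10, 11, 12, 99, 98]
print(trim_to_output(fake_output_ids, 3))  # -> [99, 98]
```

With the real model, the same slice index would come from the tokenized prompt length, and the trimmed ids would be passed to `tokenizer.decode(..., skip_special_tokens=True)` as shown in the diff.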