kernelmachine committed on
Commit
f1bfc86
1 Parent(s): 10dd669

Update README.md

Files changed (1)
  1. README.md +11 -11
README.md CHANGED
@@ -73,23 +73,23 @@ This model can be used for prompting for evaluation of downstream tasks as well
 
 You can use this model directly with a pipeline for text generation.
 
-from transformers import pipeline
 
-```
-generator = pipeline('text-generation', model="facebook/opt-350m")
-generator("Hello, I'm am conscious and")
-[{'generated_text': "Hello, I'm am conscious and I'm a bit of a noob. I'm looking for"}]
+```python
+from transformers import pipeline
+generator = pipeline('text-generation', model="kernelmachine/silo-pd-1.3b", device='cuda')
+generator("Hello")
+[{'generated_text': 'Hello, my dear," said the old man, "I have been waiting for you\na long'}]
 ```
 
 By default, generation is deterministic. In order to use the top-k sampling, please set do_sample to True.
 
-from transformers import pipeline, set_seed
 
-```
-set_seed(32)
-generator = pipeline('text-generation', model="facebook/opt-350m", do_sample=True)
-generator("Hello, I'm am conscious and")
-[{'generated_text': "Hello, I'm am conscious and I'm interested in this project
+```python
+from transformers import pipeline, set_seed
+set_seed(42)
+generator = pipeline('text-generation', model="kernelmachine/silo-pd-1.3b", device='cuda', do_sample=True)
+generator("Hello")
+[{'generated_text': 'Hello, Mother," he called.\n\n"Hello, Son. Have you got a car'}]
 ```
 
 ### Limitations and Bias
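
The README being edited tells users to set `do_sample=True` to enable top-k sampling, and the diff also changes `set_seed(32)` to `set_seed(42)` for reproducible draws. As an aside, the mechanics behind those two knobs can be sketched in plain Python; this is an illustrative toy, not the actual `transformers` sampling code, and `top_k_sample` is a hypothetical helper name.

```python
import math
import random

def top_k_sample(logits, k, rng):
    """Sample a token id from the k highest-scoring logits.

    Illustrative only: transformers does the equivalent internally
    when do_sample=True and top_k is set on generation.
    """
    # Keep only the k token ids with the largest logits.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    # Softmax over the surviving logits (shift by max for stability).
    m = max(logits[i] for i in top)
    weights = [math.exp(logits[i] - m) for i in top]
    total = sum(weights)
    probs = [w / total for w in weights]
    # Draw one token id according to those renormalized probabilities.
    return rng.choices(top, weights=probs, k=1)[0]

# Seeding the RNG plays the role of set_seed(42): the same seed
# always yields the same draw.
rng = random.Random(42)
logits = [2.0, 0.5, -1.0, 3.0, 0.0]
token = top_k_sample(logits, k=2, rng=rng)
# With k=2, only the two highest-logit ids (3 and 0) can ever be chosen.
```

Without sampling (the README's "deterministic" default), generation would instead always pick the single highest-logit token, i.e. greedy decoding.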