kernelmachine committed
Commit · f1bfc86
Parent(s): 10dd669
Update README.md

README.md CHANGED
@@ -73,23 +73,23 @@ This model can be used for prompting for evaluation of downstream tasks as well

 You can use this model directly with a pipeline for text generation.

-from transformers import pipeline
-```
-generator("
-```
+```python
+from transformers import pipeline
+generator = pipeline('text-generation', model="kernelmachine/silo-pd-1.3b", device='cuda')
+generator("Hello")
+[{'generated_text': 'Hello, my dear," said the old man, "I have been waiting for you\na long'}]
+```

 By default, generation is deterministic. In order to use the top-k sampling, please set do_sample to True.

-from transformers import pipeline, set_seed
-```
-set_seed
-generator("
-```
+```python
+from transformers import pipeline, set_seed
+set_seed(42)
+generator = pipeline('text-generation', model="kernelmachine/silo-pd-1.3b", device='cuda', do_sample=True)
+generator("Hello")
+[{'generated_text': 'Hello, Mother," he called.\n\n"Hello, Son. Have you got a car'}]
+```

 ### Limitations and Bias
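For context on the `do_sample=True` change in the diff: top-k sampling restricts the draw to the k highest-probability next tokens, renormalizes, then samples. A minimal, library-free sketch of that idea (the logits and seed here are made up for illustration, not taken from the model above):

```python
import math
import random

def top_k_sample(logits, k, rng):
    """Sample an index from the k highest-scoring logits (softmax over the kept k)."""
    # Indices of the k largest logits; all other tokens get probability 0.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    # Numerically stable softmax over the kept logits only.
    m = max(logits[i] for i in top)
    weights = [math.exp(logits[i] - m) for i in top]
    return rng.choices(top, weights=weights, k=1)[0]

rng = random.Random(42)  # fixed seed, analogous in spirit to set_seed(42) above
logits = [2.0, 0.5, -1.0, 3.0, 0.0]
picks = {top_k_sample(logits, k=2, rng=rng) for _ in range(100)}
print(picks)  # only indices 3 and 0 (the two largest logits) can ever be drawn
```

With k equal to the vocabulary size this degenerates to plain sampling; with k=1 it becomes greedy (deterministic) decoding, which is why the README's first example needs no seed.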