Render example as Python code
README.md

@@ -65,7 +65,7 @@ Remember to apply the conversation template of Llama 3 - not doing so might lead
 
 ## Quickstart (HF Transformers):
 
-```
+```python
 from transformers import AutoModelForCausalLM, AutoTokenizer
 device = "cuda" # the device to load the model onto
 
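The hunk's context line warns readers to apply the Llama 3 conversation template. As a rough illustration of what that template produces, here is a minimal sketch that builds a Llama 3-style prompt string by hand; in practice you would call `tokenizer.apply_chat_template` from HF Transformers instead, and the helper name `llama3_prompt` is invented for this example:

```python
def llama3_prompt(messages):
    """Sketch of a Llama 3-style chat prompt (illustrative only;
    prefer tokenizer.apply_chat_template in real code)."""
    parts = ["<|begin_of_text|>"]
    for m in messages:
        # Each turn: role header, blank line, content, end-of-turn token.
        parts.append(
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n{m['content']}<|eot_id|>"
        )
    # Open the assistant header so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)


print(llama3_prompt([{"role": "user", "content": "Hello!"}]))
```

Skipping this formatting and feeding raw text to the model is exactly the mistake the README's warning is about: the instruct-tuned model expects these special tokens around each turn.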