Fix code snippet
#3
by pcuenq (HF staff), opened
README.md CHANGED

````diff
@@ -30,7 +30,7 @@ export HF_HUB_ENABLE_HF_TRANSFER=1
 huggingface-cli download --local-dir CodeLlama-7b-mlx mlx-llama/CodeLlama-7b-mlx
 
 # Run example
-python mlx-examples/llama/llama.py CodeLlama-7b-mlx/ CodeLlama-7b-mlx/tokenizer.model
+python mlx-examples/llama/llama.py --prompt "int main(char argc, char **argv) {" CodeLlama-7b-mlx/ CodeLlama-7b-mlx/tokenizer.model
 ```
 
 Please, refer to the [original model card](https://github.com/facebookresearch/codellama/blob/main/MODEL_CARD.md) for details on CodeLlama.
````
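The corrected command passes the seed text through a `--prompt` flag rather than positionally. A minimal argparse sketch (hypothetical, only mirroring the interface implied by the diff; the actual `mlx-examples/llama/llama.py` CLI may differ) shows how the fixed invocation parses:

```python
import argparse

# Hypothetical sketch of the CLI surface implied by the diff:
# two positional arguments (model dir, tokenizer path) plus a --prompt flag.
parser = argparse.ArgumentParser()
parser.add_argument("model")
parser.add_argument("tokenizer")
parser.add_argument("--prompt", default="")

# Parse the exact arguments from the corrected README command.
args = parser.parse_args([
    "--prompt", "int main(char argc, char **argv) {",
    "CodeLlama-7b-mlx/",
    "CodeLlama-7b-mlx/tokenizer.model",
])
print(args.prompt)  # the C snippet that seeds code generation
```

With the old positional-only form, the script had no prompt to complete, which is what this PR fixes.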