Nessii013 committed on
Commit 307ac22 · verified · 1 Parent(s): 995e282

Update README.md

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -84,7 +84,9 @@ model = AutoModelForCausalLM.from_pretrained("uiuc-convai/CALM-8B")
 ```
 
 ### 🛠 Example Oumi Inference
-CALM-405B likely requires multi-node inference as most single nodes support up to 640GB of GPU VRAM. To run multi-node inference, we recommend [vLLM](https://docs.vllm.ai/en/latest/serving/distributed_serving.html)
+Oumi multi-node inference support is under development.
+CALM-405B likely requires multi-node inference as most single nodes support up to 640GB of GPU VRAM.
+To run multi-node inference, we recommend [vLLM](https://docs.vllm.ai/en/latest/serving/distributed_serving.html)
 
 ### 🛠 Example Oumi Fine-Tuning
 ```bash
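For context on the vLLM recommendation added in this commit, a minimal multi-node serving sketch is shown below, following the distributed-serving pattern in the linked vLLM docs. The model ID `uiuc-convai/CALM-405B` and the parallelism sizes (2 nodes × 8 GPUs) are assumptions for illustration, not values stated in this commit.

```bash
# Sketch only: assumes 2 nodes with 8 GPUs each and that the model is
# published as "uiuc-convai/CALM-405B"; adjust to your actual setup.

# On the head node: start a Ray cluster.
ray start --head --port=6379

# On each worker node: join the Ray cluster (replace HEAD_NODE_IP).
ray start --address=HEAD_NODE_IP:6379

# On the head node: serve the model across both nodes with vLLM.
vllm serve uiuc-convai/CALM-405B \
  --tensor-parallel-size 8 \
  --pipeline-parallel-size 2
```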