---
license: llama3
---
# Weights from the Llama-3-8B Self-Align Experiments
[WEIGHTS TO BE UPLOADED ONCE DONE]
## Training Config
The `config.yaml` should be passed to `accelerate launch`; `run.sh` is the launcher that was used to start training with the [StarCoder2 Self-Align training script](https://github.com/bigcode-project/starcoder2-self-align?tab=readme-ov-file#training-details).
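For illustration, a minimal launcher along those lines; the entry-point name `train.py`, the model path, and the flag names (HF `TrainingArguments` style) are assumptions, not the actual contents of `run.sh`:

```bash
#!/usr/bin/env bash
# Hypothetical launcher sketch: script name and flags are assumptions,
# not copied from the run.sh shipped in this repo.
accelerate launch --config_file config.yaml \
  train.py \
  --model_name_or_path meta-llama/Meta-Llama-3-8B \
  --learning_rate 3e-6 \
  --per_device_train_batch_size 2 \
  --bf16 true
```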
A few tweaks were needed to make training fit in 48 GB of VRAM (a config sketch follows the list below):
- FSDP was used
- `per_device_batch_size` was set to `2`
- the learning rate was set to `3e-6`
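For illustration, a minimal Accelerate FSDP config in this spirit; the sharding and wrap-policy values below are assumptions, and the actual `config.yaml` in this repo may differ:

```yaml
# Sketch of an Accelerate FSDP config for a 2-GPU machine;
# values are assumptions, not the repo's actual config.yaml.
compute_environment: LOCAL_MACHINE
distributed_type: FSDP
fsdp_config:
  fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
  fsdp_sharding_strategy: FULL_SHARD
  fsdp_state_dict_type: SHARDED_STATE_DICT
  fsdp_offload_params: false
mixed_precision: bf16
machine_rank: 0
num_machines: 1
num_processes: 2  # one process per GPU
main_training_function: main
use_cpu: false
```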
## Environment
- Trained on 2× NVIDIA RTX 4090 GPUs
- 128 GB of system RAM