davda54 committed
Commit ab095bd · verified · 1 Parent(s): 67e3212

More precise computation of theoretical FLOPs

Files changed (1): README.md (+1 −1)
README.md CHANGED

@@ -100,7 +100,7 @@ More details about the evaluation setup and the new Norwegian benchmarks will be
 - Training precision: bfloat16
 - Hardware: 256 AMD MI250X GPUs (128 GB)
 - Training time: 8.5 days
-- Theoretical computation: 1.7e22 FLOP/s
+- Theoretical computation: 2.0e22 FLOP/s
 - Model FLOP/s utilization (MFU): 38%
 
 **Unique Features:**
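The changed line reports the total training compute, and the MFU line beneath it relates that total to the hardware's peak throughput. A minimal sketch of the usual MFU arithmetic, assuming the common definition MFU = sustained model FLOP/s divided by aggregate peak FLOP/s; the `peak_flops_per_gpu` value passed in is a hypothetical placeholder, not an MI250X specification:

```python
SECONDS_PER_DAY = 86_400

def mfu(total_model_flops: float, days: float, n_gpus: int,
        peak_flops_per_gpu: float) -> float:
    """Fraction of aggregate peak throughput spent on model FLOPs.

    Assumes `total_model_flops` is the theoretical compute of the run
    (as in the README diff above) and that the peak figure is per GPU.
    """
    sustained = total_model_flops / (days * SECONDS_PER_DAY)  # FLOP/s achieved
    peak = n_gpus * peak_flops_per_gpu                        # FLOP/s available
    return sustained / peak

# Example with the README's run figures and an assumed per-GPU peak:
print(mfu(2.0e22, days=8.5, n_gpus=256, peak_flops_per_gpu=4.0e14))
```

With the corrected 2.0e22 total rather than 1.7e22, the same run time and GPU count yield a proportionally higher MFU, which is why the commit touches only the compute line.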