M-FAC committed
Commit 047e663
Parent: e01fa4a

Add links for github repo

Files changed (1): README.md +3 -0
README.md CHANGED
@@ -52,6 +52,9 @@ CUDA_VISIBLE_DEVICES=0 python run_qa.py \
 
 We believe these results could be improved with modest tuning of hyperparameters: `per_device_train_batch_size`, `learning_rate`, `num_train_epochs`, `num_grads` and `damp`. For the sake of fair comparison and a robust default setup we use the same hyperparameters across all models (`bert-tiny`, `bert-mini`) and all datasets (SQuAD version 2 and GLUE).
 
+Our code for M-FAC can be found here: [https://github.com/IST-DASLab/M-FAC](https://github.com/IST-DASLab/M-FAC).
+A step-by-step tutorial on how to integrate and use M-FAC with any repository can be found here: [https://github.com/IST-DASLab/M-FAC/tree/master/tutorials](https://github.com/IST-DASLab/M-FAC/tree/master/tutorials).
+
 ## BibTeX entry and citation info
 
 ```bibtex
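
For orientation, the hyperparameters named in the README paragraph above correspond to flags of the `run_qa.py` command referenced in the hunk header. The sketch below is only an assumption-laden illustration, not the command from the model card: the base model name, all numeric values, and the M-FAC-specific flags (`--num_grads`, `--damp`) are placeholders; the linked M-FAC tutorials describe the actual integration.

```bash
# Hedged sketch of a run_qa.py invocation using the hyperparameters named above.
# The standard Hugging Face example flags exist in run_qa.py; the model name,
# the numeric values, and the M-FAC flags (--num_grads, --damp) are placeholders
# assumed for illustration -- see IST-DASLab/M-FAC/tutorials for the real interface.
CUDA_VISIBLE_DEVICES=0 python run_qa.py \
  --model_name_or_path prajjwal1/bert-tiny \
  --dataset_name squad_v2 \
  --version_2_with_negative \
  --do_train \
  --do_eval \
  --per_device_train_batch_size 16 \
  --learning_rate 1e-4 \
  --num_train_epochs 2 \
  --num_grads 1024 \
  --damp 1e-6 \
  --output_dir ./bert-tiny-mfac-squadv2
```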