Commit 1a13472 (parent: 3f1798a), committed by Samuel J. Huskey

update readme with emissions information

Files changed (1): README.md (+33 -1)

## The Model

After preliminary experiments with sequential neural network models using [bag-of-words](https://en.wikipedia.org/wiki/Bag-of-words_model), [term frequency-inverse document frequency](https://en.wikipedia.org/wiki/Tf%E2%80%93idf) (tf-idf), and custom word embeddings, I settled on a pretrained BERT model developed by [Devlin et al. 2018](https://arxiv.org/abs/1810.04805v2). Specifically, I'm using [Hugging Face's DistilBERT base multilingual (cased) model](https://huggingface.co/distilbert/distilbert-base-multilingual-cased), which is based on work by [Sanh et al. 2020](https://doi.org/10.48550/arXiv.1910.01108).
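
As an illustration (not code from this repository), that checkpoint can be loaded for fine-tuning with the `transformers` library; the `num_labels` value here is an assumed placeholder, since the repository's label count is not stated in this section:

```python
# Checkpoint named in the README; the loading helper is a hypothetical sketch.
CHECKPOINT = "distilbert/distilbert-base-multilingual-cased"


def load_model(num_labels: int = 2):
    """Fetch the tokenizer and classification head from the Hugging Face Hub.

    ``num_labels`` is an illustrative assumption, not taken from this project.
    """
    # Imported lazily so the module can be read without transformers installed.
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
    model = AutoModelForSequenceClassification.from_pretrained(
        CHECKPOINT, num_labels=num_labels
    )
    return tokenizer, model
```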

## Emissions

Here is the `codecarbon` output from training on Google Colab with an A100 runtime:

```properties
timestamp: 2024-12-23T17:37:16
project_name: codecarbon
run_id: a2b8975b-512b-4158-b41f-2a00d1d6fb39
experiment_id: 5b0fa12a-3dd7-45bb-9766-cc326314d9f1
duration (seconds): 877.531339527
emissions (kg CO2eq): 0.0260658391490936
emissions_rate (kg/sec): 2.970359914797282e-05
cpu_power (average in watts): 42.5
gpu_power (average in watts): 71.5115170414632
ram_power (average in watts): 31.30389261245728
cpu_energy (total kWh): 0.0103517333061409
gpu_energy (total kWh): 0.03961337474623
ram_energy (total kWh): 0.007623585574942
energy_consumed (total kWh): 0.057588693627313
os: Linux-6.1.85+-x86_64-with-glibc2.35
python_version: 3.10.12
codecarbon_version: 2.8.2
cpu_count: 12
cpu_model: Intel(R) Xeon(R) CPU @ 2.20GHz
gpu_count: 1
gpu_model: 1 x NVIDIA A100-SXM4-40GB
ram_total_size (GB): 83.47704696655273
tracking_mode: machine
on_cloud: N
pue: 1.0
```
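
The figures above are internally consistent: the per-component energy readings sum to the reported total, and the emissions rate equals the emissions divided by the duration. A quick check:

```python
# Sanity-check the codecarbon figures quoted above.
duration_s = 877.531339527              # run length in seconds
emissions_kg = 0.0260658391490936       # total emissions
emissions_rate = 2.970359914797282e-05  # emissions per second

cpu_energy = 0.0103517333061409
gpu_energy = 0.03961337474623
ram_energy = 0.007623585574942
energy_consumed = 0.057588693627313

# Component energies should sum to the reported total ...
assert abs((cpu_energy + gpu_energy + ram_energy) - energy_consumed) < 1e-9
# ... and the rate should equal emissions / duration.
assert abs(emissions_kg / duration_s - emissions_rate) < 1e-9
```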