DesilDev committed
Commit c340eeb
1 Parent(s): 2649317

Update README.md

Files changed (1)
  1. README.md +15 -8
README.md CHANGED
@@ -3,9 +3,17 @@ license: apache-2.0
 base_model: DesilDev/t5-small-summery
 tags:
 - generated_from_trainer
+- minecraft
+- log_summariser
 model-index:
 - name: Blocksmith
   results: []
+datasets:
+- EdinburghNLP/xsum
+language:
+- en
+metrics:
+- code_eval
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -13,22 +21,21 @@ should probably proofread and complete it, then remove this comment. -->
 
 # Blocksmith
 
-This model is a fine-tuned version of [DesilDev/t5-small-summery](https://huggingface.co/DesilDev/t5-small-summery) on the None dataset.
+# Training Procedure
+The T5-small model was fine-tuned on the Minecraft log dataset and a text summarising dataset (Xsum) using the Adam optimizer with a learning rate of 2e-05 for 1 epoch. Early stopping was not implemented.
 
 ## Model description
 
-More information needed
+Blocksmith is a natural language processing model designed to generate concise summaries of Minecraft logs. It is based on the Transformer architecture, specifically the T5-small model, and trained on a dataset of Minecraft logs.
 
 ## Intended uses & limitations
 
-More information needed
-
-## Training and evaluation data
-
-More information needed
+Blocksmith is intended for analyzing player behavior, identifying potential issues or bugs, and generating insights for game improvement. However, the model may have limitations in handling specific log formats or game versions, and its summaries might be biased towards the content of the training data.
 
 ## Training procedure
 
+The T5-small model was fine-tuned on the Minecraft log dataset and a text summarising dataset (Xsum) using the Adam optimizer with a learning rate of 2e-05 for 1 epoch. Early stopping was not implemented.
+
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
@@ -53,4 +60,4 @@ The following hyperparameters were used during training:
 - Transformers 4.42.4
 - Pytorch 2.3.1+cu121
 - Datasets 2.20.0
-- Tokenizers 0.19.1
+- Tokenizers 0.19.1
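
The updated description and intended-uses sections describe summarising Minecraft server logs but do not show an invocation. Below is a minimal usage sketch with the `transformers` pipeline API; the repository id `DesilDev/Blocksmith` and the log excerpt are assumptions for illustration (the card names the model only as "Blocksmith"), so substitute the actual checkpoint path.

```python
# Minimal usage sketch for the summariser described in this card.
# Assumptions: "DesilDev/Blocksmith" is a guessed repository id (the card only
# names the model "Blocksmith"), and the log excerpt below is invented.
from transformers import pipeline

summarizer = pipeline("summarization", model="DesilDev/Blocksmith")

log_excerpt = (
    "[12:01:03] [Server thread/INFO]: Steve joined the game "
    "[12:05:47] [Server thread/INFO]: Steve fell from a high place "
    "[12:06:10] [Server thread/INFO]: Steve left the game"
)

result = summarizer(log_excerpt, max_length=64, min_length=5, do_sample=False)
print(result[0]["summary_text"])
```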
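
The training-procedure paragraph gives only the optimizer, learning rate, and epoch count, and the hyperparameter list itself is elided from these hunks. As a rough sketch of how such a run could be set up against the public half of the data (EdinburghNLP/xsum; the Minecraft log dataset is not linked in the card), with the Trainer's default AdamW standing in for the Adam optimizer the card names, and with batch size, output directory, sequence lengths, and the training slice chosen purely for illustration:

```python
# Hedged fine-tuning sketch. From the card: base model DesilDev/t5-small-summery,
# learning rate 2e-05, 1 epoch, EdinburghNLP/xsum. Assumptions: batch size,
# output directory, sequence lengths, and the 1% training slice.
from datasets import load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

base = "DesilDev/t5-small-summery"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSeq2SeqLM.from_pretrained(base)

# Xsum provides "document" and "summary" columns; only a small slice is used here.
xsum = load_dataset("EdinburghNLP/xsum", split="train[:1%]", trust_remote_code=True)

def preprocess(batch):
    inputs = tokenizer(
        ["summarize: " + doc for doc in batch["document"]],
        max_length=512,
        truncation=True,
    )
    labels = tokenizer(text_target=batch["summary"], max_length=64, truncation=True)
    inputs["labels"] = labels["input_ids"]
    return inputs

tokenized = xsum.map(preprocess, batched=True, remove_columns=xsum.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="blocksmith-finetune",  # assumed
    learning_rate=2e-5,                # stated in the card
    num_train_epochs=1,                # stated in the card
    per_device_train_batch_size=8,     # not stated; illustrative value
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```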