---
base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
library_name: peft
license: llama3.1
tags:
- unsloth
- generated_from_trainer
model-index:
- name: profile
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# profile

This model is a fine-tuned version of [meta-llama/Meta-Llama-3.1-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2265
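
Because this repository contains a PEFT adapter rather than a full checkpoint, it is loaded on top of the base model. The sketch below is a minimal, illustrative loading example; the adapter repo id is a placeholder and should be replaced with the path where these weights are actually hosted.

```python
# Minimal loading sketch (assumptions: the adapter is published on the Hub;
# "your-username/profile" below is a hypothetical placeholder repo id).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Meta-Llama-3.1-8B-Instruct"
adapter_id = "your-username/profile"  # placeholder: replace with the real adapter path

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
# Attach the PEFT (LoRA) adapter on top of the frozen base weights.
model = PeftModel.from_pretrained(base_model, adapter_id)

messages = [{"role": "user", "content": "Hello!"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```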

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.02
- num_epochs: 0.6
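
As a rough illustration, the settings above might correspond to `TrainingArguments` along the following lines. This is a sketch only: the actual training script (including the Unsloth/PEFT setup) is not part of this card, and the optimizer name below is an assumption.

```python
# Illustrative mapping of the listed hyperparameters; not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="profile",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,  # effective train batch size 8 * 8 = 64
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    num_train_epochs=0.6,
    # Card lists Adam with betas=(0.9, 0.999) and epsilon=1e-08;
    # "adamw_torch" (the Trainer default) is an assumption here.
    optim="adamw_torch",
)
```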

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.4834        | 0.0130 | 2    | 1.3970          |
| 1.2584        | 0.0259 | 4    | 1.3753          |
| 1.2986        | 0.0389 | 6    | 1.3372          |
| 1.3462        | 0.0518 | 8    | 1.3056          |
| 1.2461        | 0.0648 | 10   | 1.2892          |
| 1.263         | 0.0777 | 12   | 1.2828          |
| 1.2749        | 0.0907 | 14   | 1.2781          |
| 1.2803        | 0.1036 | 16   | 1.2702          |
| 1.1367        | 0.1166 | 18   | 1.2617          |
| 1.3358        | 0.1296 | 20   | 1.2531          |
| 1.1804        | 0.1425 | 22   | 1.2464          |
| 1.1444        | 0.1555 | 24   | 1.2440          |
| 1.1772        | 0.1684 | 26   | 1.2425          |
| 1.2582        | 0.1814 | 28   | 1.2404          |
| 1.1991        | 0.1943 | 30   | 1.2378          |
| 1.156         | 0.2073 | 32   | 1.2367          |
| 1.2827        | 0.2202 | 34   | 1.2361          |
| 1.151         | 0.2332 | 36   | 1.2355          |
| 1.178         | 0.2462 | 38   | 1.2349          |
| 1.2604        | 0.2591 | 40   | 1.2337          |
| 1.1988        | 0.2721 | 42   | 1.2322          |
| 1.1819        | 0.2850 | 44   | 1.2307          |
| 1.123         | 0.2980 | 46   | 1.2301          |
| 1.1661        | 0.3109 | 48   | 1.2304          |
| 1.2776        | 0.3239 | 50   | 1.2306          |
| 1.2437        | 0.3368 | 52   | 1.2303          |
| 1.1617        | 0.3498 | 54   | 1.2291          |
| 1.2691        | 0.3628 | 56   | 1.2280          |
| 1.1998        | 0.3757 | 58   | 1.2275          |
| 1.1656        | 0.3887 | 60   | 1.2276          |
| 1.2549        | 0.4016 | 62   | 1.2275          |
| 1.3261        | 0.4146 | 64   | 1.2279          |
| 1.2188        | 0.4275 | 66   | 1.2279          |
| 1.2544        | 0.4405 | 68   | 1.2278          |
| 1.276         | 0.4534 | 70   | 1.2273          |
| 1.1895        | 0.4664 | 72   | 1.2269          |
| 1.2274        | 0.4794 | 74   | 1.2268          |
| 1.1861        | 0.4923 | 76   | 1.2267          |
| 1.262         | 0.5053 | 78   | 1.2265          |
| 1.3122        | 0.5182 | 80   | 1.2265          |
| 1.3043        | 0.5312 | 82   | 1.2266          |
| 1.2069        | 0.5441 | 84   | 1.2266          |
| 1.2088        | 0.5571 | 86   | 1.2266          |
| 1.1754        | 0.5700 | 88   | 1.2265          |
| 1.1704        | 0.5830 | 90   | 1.2266          |
| 1.3636        | 0.5960 | 92   | 1.2265          |


### Framework versions

- PEFT 0.12.0
- Transformers 4.44.2
- Pytorch 2.3.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1