---
language: 
  - id
  - en
tags:
  - math
  - education
  - indonesian
  - small-model
  - fine-tuned
license: apache-2.0
datasets:
  - video-transcripts
metrics:
  - accuracy
model-index:
  - name: Gasing Math Teacher
    results:
      - task: 
          name: Mathematical Problem Solving
          type: text-generation
        dataset:
          name: Video Transcripts
          type: generated
        metrics:
          - type: accuracy
            value: 85
            name: Accuracy (approximate)
---

# Gasing Math Teacher - Indonesian Math Instruction Model

## Model Description
Gasing Math Teacher is a specialized 0.5B-parameter language model fine-tuned for mathematical instruction in Indonesian. It explains mathematical concepts step by step using the creative "pisahkan" (separation) method, which breaks numbers into simpler, concrete parts before working with them.

### Key Features
- Trained on video transcript data
- Specializes in mathematical problem-solving
- Provides detailed, step-by-step explanations in Indonesian
- Uses innovative teaching methods like finger-counting and concrete number separation

### Training Details
- Base Model: Qwen/Qwen2.5-Coder-0.5B-Instruct
- Training Data: Video transcripts
- Fine-tuning Method: LoRA (Low-Rank Adaptation)
- Number of Layers: 2
- Learning Rate: 1e-5
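
As a rough illustration, a setup like this could be expressed with the `peft` library as in the sketch below. Only the base model name and learning rate come from the list above; the LoRA rank, target modules, batch size, and epoch count are illustrative assumptions, not the actual training configuration.

```python
# Minimal LoRA fine-tuning sketch with transformers + peft.
# Only the base model and learning rate come from this card; the LoRA rank,
# target modules, batch size, and epoch count are assumptions for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from peft import LoraConfig, get_peft_model

base_model = "Qwen/Qwen2.5-Coder-0.5B-Instruct"
model = AutoModelForCausalLM.from_pretrained(base_model)
tokenizer = AutoTokenizer.from_pretrained(base_model)

lora_config = LoraConfig(
    r=8,                                  # assumed LoRA rank
    lora_alpha=16,                        # assumed scaling factor
    target_modules=["q_proj", "v_proj"],  # assumed attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

training_args = TrainingArguments(
    output_dir="gasing-math-teacher",
    learning_rate=1e-5,                   # from the Training Details above
    per_device_train_batch_size=4,        # assumed
    num_train_epochs=3,                   # assumed
)
# The tokenized video-transcript dataset would then be passed to a Trainer
# together with `model` and `training_args`.
```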

### Example Capabilities
The model can:
- Solve basic addition problems
- Explain calculations using the creative "pisahkan" (separation) method, as sketched below
- Break down mathematical concepts into simple, understandable steps
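
To make the "pisahkan" idea concrete, the snippet below sketches the place-value separation it refers to. The `pisahkan_add` helper is purely illustrative; it is not part of the model or its training data.

```python
# Illustrative sketch of the "pisahkan" (separation) idea for addition:
# split each number into tens and ones, add the parts, then recombine.
def pisahkan_add(a: int, b: int) -> str:
    tens_a, ones_a = divmod(a, 10)
    tens_b, ones_b = divmod(b, 10)
    tens_sum = (tens_a + tens_b) * 10
    ones_sum = ones_a + ones_b
    return (f"{a} + {b} = ({tens_a * 10} + {ones_a}) + ({tens_b * 10} + {ones_b}) "
            f"= {tens_sum} + {ones_sum} = {tens_sum + ones_sum}")

print(pisahkan_add(27, 38))
# 27 + 38 = (20 + 7) + (30 + 8) = 50 + 15 = 65
```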

### Limitations
- Primarily trained for simple mathematical operations
- Performs best in Indonesian
- May not generalize to complex mathematical problems

### Ethical Considerations
This model is intended for educational purposes and should be used responsibly.

## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Replace "your-username/gasing-math-teacher" with the published repository ID.
model = AutoModelForCausalLM.from_pretrained("your-username/gasing-math-teacher")
tokenizer = AutoTokenizer.from_pretrained("your-username/gasing-math-teacher")
```
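
A hypothetical inference call could then look like this; the prompt and generation settings are illustrative assumptions rather than values taken from this card.

```python
# Illustrative inference example; the prompt and generation settings are assumptions.
prompt = "Berapa 27 + 38? Jelaskan dengan metode pisahkan."  # "What is 27 + 38? Explain with the separation method."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```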

## License
This model is released under the Apache 2.0 license, as declared in the `license` field of the model card metadata above.

## Citation
If you use this model, please cite:
[Include citation details]