---
library_name: transformers
license: mit
language:
- ko
base_model:
- google/gemma-2-2b-it
pipeline_tag: text-generation
---

# Model Card for Gemma2 2b Korean Dialect Translator

Gemma2 2b Korean Dialect Translator v0.2.0

## Model Description

The Gemma2 2b Korean Dialect Translator is a model developed as part of a project to translate Korean dialects into standard Korean and to convert standard Korean into Korean dialects.

This model was built by fine-tuning the Gemma2 2b it model with the QLoRA technique.
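
The snippet below is a minimal, illustrative sketch of what a QLoRA setup for the base model can look like using `transformers`, `bitsandbytes`, and `peft`. The actual hyperparameters used for this model (rank, alpha, target modules, etc.) are not documented here and are assumptions.

```python
# Illustrative QLoRA setup only; the hyperparameters below are assumptions,
# not the values actually used to train this model.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# Load the base model with 4-bit quantized weights (the "Q" in QLoRA).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
base_model = AutoModelForCausalLM.from_pretrained(
    "google/gemma-2-2b-it",
    quantization_config=bnb_config,
    device_map="auto",
)

# Attach low-rank adapters; only these adapter weights are trained.
lora_config = LoraConfig(
    r=16,                 # assumed rank
    lora_alpha=32,        # assumed scaling factor
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()
```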

## Uses

์ด ๋ชจ๋ธ์€ ํ•œ๊ตญ์–ด ๋ฐฉ์–ธ์„ ํ‘œ์ค€ ํ•œ๊ตญ์–ด๋กœ ๋ฒˆ์—ญํ•˜๊ฑฐ๋‚˜ ๊ทธ ๋ฐ˜๋Œ€๋กœ ๋ฒˆ์—ญํ•˜๋Š” ๋ฐ ์ง์ ‘ ์‚ฌ์šฉํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์Œ์„ฑ ์ธ์‹ ๋ฐ ๋ฒˆ์—ญ ๋„๊ตฌ๋ฅผ ๊ฐœ๋ฐœํ•˜๋Š” ๊ต์œก์ž, ์–ธ์–ดํ•™์ž, ๊ธฐ์ˆ  ๊ฐœ๋ฐœ์ž์—๊ฒŒ ์œ ์šฉํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.


## Bias, Risks, and Limitations

์ด ๋ชจ๋ธ์€ ์ œ์ฃผ ๋ฐฉ์–ธ์— ์ดˆ์ ์„ ๋งž์ถ˜ ํŠน์ • ๋ฐ์ดํ„ฐ ์„ธํŠธ์— ๋งž์ถฐ ๋ฏธ์„ธ ์กฐ์ •๋˜์—ˆ๊ธฐ ๋•Œ๋ฌธ์— ๋‹ค๋ฅธ ๋ฐฉ์–ธ์ด๋‚˜ ์–ธ์–ด์— ๋Œ€ํ•œ ์„ฑ๋Šฅ์ด ์ œํ•œ๋  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

## How to Get Started with the Model

Use the code below to get started with the model.
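
A minimal loading and generation sketch with `transformers` is shown below. The repository ID is a hypothetical placeholder (replace it with this model's actual ID), and the exact instruction wording used during fine-tuning is an assumption.

```python
# Sketch only: the model ID below is a placeholder, and the prompt format
# is an assumption about how the model expects translation requests.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/gemma2-2b-ko-dialect-translator"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Gemma 2 instruction-tuned models use a chat template with a single user turn.
messages = [
    {"role": "user", "content": "Translate the following Jeju dialect sentence into standard Korean: <dialect sentence>"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```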

## Training Data

[AI_HUB Middle-aged and Elderly Korean Dialect Data (Chungcheong, Jeolla, Jeju)](https://aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&aihubDataSe=data&dataSetSn=71558)


## TODO