
This is a scam-conversation detection AI, trained with supervised learning on romance-scam conversations and ordinary everyday conversations.
All of the training data was collected by hand: the conversations were obtained from acquaintances who were scam victims and from YouTube coverage of such cases.
A new version of the AI, planned for early November, will additionally be trained on data I obtained first-hand by going through romance scams and body-cam (sextortion) phishing myself.

Given text as input, the model returns either 0 or 1:
0 means a normal conversation, 1 means a romance-scam conversation.
It can be used with the following code:

```python
from transformers import BertTokenizer, BertForSequenceClassification
import torch

# Load the BERT model and tokenizer
model_name = "gihakkk/bert_scam_classifier"
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertForSequenceClassification.from_pretrained(model_name)

def predict_scam(text):
    inputs = tokenizer(text, return_tensors='pt', truncation=True, padding=True, max_length=128)
    with torch.no_grad():
        outputs = model(**inputs)
    prediction = torch.argmax(outputs.logits, dim=1).item()
    # Label mapping as documented above: 0 = normal, 1 = romance scam
    return "scam" if prediction == 1 else "normal"

# Example run
text = "Enter the conversation text here"
result = predict_scam(text)
print(f"Prediction: {result}")
```
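The post-processing step above (turning the classifier's two logits into a 0/1 label) can be sketched without loading the model. The following is a minimal illustration with made-up logit values, not the actual model's output; `LABELS` and `label_from_logits` are hypothetical helper names introduced here:

```python
# Sketch of the logits -> label step, assuming the documented mapping
# (0 = normal conversation, 1 = romance-scam conversation).
# The logit values below are invented for illustration only.
import math

LABELS = {0: "normal conversation", 1: "romance-scam conversation"}

def softmax(logits):
    """Convert raw logits to probabilities (numerically stable form)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def label_from_logits(logits):
    """Pick the higher-scoring class, as torch.argmax does in predict_scam."""
    probs = softmax(logits)
    prediction = max(range(len(probs)), key=probs.__getitem__)
    return prediction, probs[prediction]

# Logits leaning toward class 1 would be read as a scam conversation
pred, confidence = label_from_logits([-1.2, 2.3])
print(LABELS[pred], round(confidence, 3))
```

Reading the per-class probability rather than only the argmax lets you apply your own threshold, e.g. flag a conversation only when the scam probability exceeds 0.9.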