--- |
|
base_model: mini1013/master_domain |
|
library_name: setfit |
|
metrics: |
|
- metric |
|
pipeline_tag: text-classification |
|
tags: |
|
- setfit |
|
- sentence-transformers |
|
- text-classification |
|
- generated_from_setfit_trainer |
|
widget: |
|
- text: 삼성 노트북 NT450R5E K81S K82P K82W K83S K85S 정품 어댑터 아답터 아답타 충전기 AD-6019R 19V 3.16A 뉴 스마트 전자
|
- text: 인트존 205X 노트북 파우치 13인치 15인치 핸디 가방 13인치_스모키블랙 크로니시스템 |
|
- text: 엑토(ACTTO) NBL-04 노트북 도난방지 케이블/(화이트) 국진컴퓨터 |
|
- text: 삼성 정품어댑터AD-4019A/19V2.1A/NT930X5J-K82S/4019P 엔티와이 |
|
- text: LG 그램 17Z90SP & 17ZD90SP 17인치 퓨어 저반사 지문방지 액정보호필름 제트비컴퍼니 |
|
inference: true |
|
model-index: |
|
- name: SetFit with mini1013/master_domain |
|
results: |
|
- task: |
|
type: text-classification |
|
name: Text Classification |
|
dataset: |
|
name: Unknown |
|
type: unknown |
|
split: test |
|
metrics: |
|
- type: metric |
|
value: 0.9272844272844273 |
|
name: Metric |
|
--- |
|
|
|
# SetFit with mini1013/master_domain |
|
|
|
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [mini1013/master_domain](https://huggingface.co/mini1013/master_domain) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification. |
|
|
|
The model has been trained using an efficient few-shot learning technique that involves: |
|
|
|
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. |
|
2. Training a classification head with features from the fine-tuned Sentence Transformer (see the sketch below).
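The same two-step procedure can be reproduced with the `setfit` `Trainer`, which first fine-tunes the embedding body on contrastive pairs and then fits the classification head on the resulting embeddings. The snippet below is a minimal sketch only: the two training examples are borrowed from the label table further down, and the tiny dataset and reduced hyperparameters are placeholders rather than the configuration actually used for this model.

```python
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Placeholder few-shot data with "text" and "label" columns
# (examples borrowed from the Model Labels table below).
train_dataset = Dataset.from_dict({
    "text": [
        "잘만 ZM-NS1000 정품/노트북 받침대/쿨링패드 주식회사보성닷컴",
        "맥북 에어 15인치 키스킨 M2 실리콘 키보드덮개 (주)스코코",
    ],
    "label": [3, 8],
})

# Start from the Sentence Transformer body; a fresh classification head is attached.
model = SetFitModel.from_pretrained("mini1013/master_domain")

trainer = Trainer(
    model=model,
    args=TrainingArguments(batch_size=16, num_epochs=1),  # illustrative values only
    train_dataset=train_dataset,
)

# Step 1: contrastive fine-tuning of the body; Step 2: fitting the head.
trainer.train()
```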
|
|
|
## Model Details |
|
|
|
### Model Description |
|
- **Model Type:** SetFit |
|
- **Sentence Transformer body:** [mini1013/master_domain](https://huggingface.co/mini1013/master_domain) |
|
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance |
|
- **Maximum Sequence Length:** 512 tokens |
|
- **Number of Classes:** 9 classes |
|
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) --> |
|
<!-- - **Language:** Unknown --> |
|
<!-- - **License:** Unknown --> |
|
|
|
### Model Sources |
|
|
|
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit) |
|
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055) |
|
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit) |
|
|
|
### Model Labels |
|
| Label | Examples |
|:------|:---------|
| 8 | <ul><li>'MSI 프레스티지 16 AI Evo B1MG 노트북 키스킨 커버 무소음 키보드 올유어리브'</li><li>'맥북 에어 15인치 키스킨 M2 실리콘 키보드덮개 (주)스코코'</li><li>'삼성갤럭시북3 Go 키스킨 NT345XPA-KC04S 키스킨 키커버 14인치 실리스킨 문자인쇄 키스킨(블랙) 에이플'</li></ul> |
| 0 | <ul><li>'칼디짓 엘레멘트독 CalDigit Element Dock 썬더볼트4 독 멀티허브 맥북프로 Element Dock (주)디엔에이치'</li><li>'마하링크 2.5인치 SATA 멀티부스트 ML-MBS127 디메이드 (DMADE)'</li><li>'AA-AE2N12B usb 젠더 컴퓨터 인터넷 설치 랜 포트 에스아이'</li></ul> |
| 3 | <ul><li>'잘만 ZM-NS1000 정품/노트북 받침대/쿨링패드 주식회사보성닷컴'</li><li>'-잘만 ZM-NS1 (블랙)- 주식회사 케이에이치몰'</li><li>'잘만 노트북 쿨링 받침대 ZM-NS2000 (주)아싸컴'</li></ul> |
| 5 | <ul><li>'W01 HP Omen 17-ANxxxTX 시리즈용 Crystal액정보호필름 더블유공일'</li><li>'맥북 에어 15인치 필름 M2 무광 하판 외부 1매 무광 상판 1매 (주)스코코'</li><li>'맥북에어 M3 2024 15인치 외부보호필름 3종세트 에이엠스토어'</li></ul> |
| 1 | <ul><li>'이지엘 국산 가벼운 손잡이 노트북 파우치 케이스 13.3인치 For 13.3인치_스모키블랙 이지엘'</li><li>'[에버키] Titan 타이탄 EKP120 18.4인치 비투비마스터'</li><li>'LG 그램 14인치 전용 가죽 파우치 (주) 티앤티정보 용산전자랜드지점'</li></ul> |
| 6 | <ul><li>'[프라임디렉트] 아답터, 220V / 19V 3.42A [내경2.1~2.5mm/외경5.5mm] 전원 케이블 미포함 [비닐포장] (주)컴퓨존'</li><li>'삼성 정품 노트북 NT-RV720 / 19V 3.16A AD-6019S AD-6019R 정품 전원 어댑터 고다'</li><li>'EFM ipTIME 어댑터 48V-0.5A (ipTIME 제품군 호환용) [ 아이피타임 ] (주)클럽라이더'</li></ul> |
| 7 | <ul><li>'HP 노트북배터리 14 15 TPN-Q207 Q208 HT03XL 호환용배터리 라온하람몰'</li><li>'(AA-PB9NC6B)삼성 정품 노트북 배터리/NT-RF410 RF411 RF510 RF511 RF710 RF711 전용 엔티와이'</li><li>'삼성 정품 배터리 AA-PB9NC6B/NT-R530 R540 전용 노트북 배터리/ NTY 엔티와이'</li></ul> |
| 2 | <ul><li>'강원전자 넷메이트 노트북 도난방지 USB포트 와이어 잠금장치 키 타입 NM-SLL05M 보다넷'</li><li>'노트북 도난방지 와이어 잠금장치 NM-SLL03 주식회사 루피하루'</li><li>'엑토(ACTTO) NBL-01 노트북 도난방지 케이블/잠금장치 국진컴퓨터'</li></ul> |
| 4 | <ul><li>'ASUS 비보북 15 X1504ZA 노트북보안필름 프라이버시 사생활보호 거치형 거치형보안필름_1장 한성'</li><li>'[1300K] HP 빅터스 16-SxxxxAN 거치식 양면 사생활보호필터F 엔에이치엔위투 주식회사'</li><li>'삼성전자 갤럭시북4 NT750XGL-XC51S 노트북보안필름 프라이버시 사생활보호 부착형 부착형보안필름_1장 원일'</li></ul> |
|
|
|
## Evaluation |
|
|
|
### Metrics |
|
| Label   | Metric |
|:--------|:-------|
| **all** | 0.9273 |
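The score above is reported without naming the metric, so the sketch below uses plain accuracy purely for illustration; the two labeled examples are taken from this card (the widget input and the label table), and their expected labels are assumptions rather than a published test set.

```python
from setfit import SetFitModel
from sklearn.metrics import accuracy_score

model = SetFitModel.from_pretrained("mini1013/master_cate_el7")

# Tiny hand-labeled sample; labels 2 (anti-theft cable) and 3 (cooling stand)
# are assumed from the Model Labels table above.
texts = [
    "엑토(ACTTO) NBL-04 노트북 도난방지 케이블/(화이트) 국진컴퓨터",
    "잘만 ZM-NS1000 정품/노트북 받침대/쿨링패드 주식회사보성닷컴",
]
labels = [2, 3]

preds = model.predict(texts)
print("accuracy:", accuracy_score(labels, preds))
```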
|
|
|
## Uses |
|
|
|
### Direct Use for Inference |
|
|
|
First install the SetFit library: |
|
|
|
```bash |
|
pip install setfit |
|
``` |
|
|
|
Then you can load this model and run inference. |
|
|
|
```python |
|
from setfit import SetFitModel |
|
|
|
# Download from the 🤗 Hub |
|
model = SetFitModel.from_pretrained("mini1013/master_cate_el7") |
|
# Run inference |
|
preds = model("엑토(ACTTO) NBL-04 노트북 도난방지 케이블/(화이트) 국진컴퓨터") |
|
``` |
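Continuing from the snippet above, the logistic-regression head can also return per-class probabilities: `predict_proba` yields one row per input with one column per label (0-8 here).

```python
# Class probabilities instead of a hard prediction.
probs = model.predict_proba(["엑토(ACTTO) NBL-04 노트북 도난방지 케이블/(화이트) 국진컴퓨터"])
print(probs)  # one row with 9 class probabilities
```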
|
|
|
<!-- |
|
### Downstream Use |
|
|
|
*List how someone could finetune this model on their own dataset.* |
|
--> |
|
|
|
<!-- |
|
### Out-of-Scope Use |
|
|
|
*List how the model may foreseeably be misused and address what users ought not to do with the model.* |
|
--> |
|
|
|
<!-- |
|
## Bias, Risks and Limitations |
|
|
|
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* |
|
--> |
|
|
|
<!-- |
|
### Recommendations |
|
|
|
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* |
|
--> |
|
|
|
## Training Details |
|
|
|
### Training Set Metrics |
|
| Training set | Min | Median  | Max |
|:-------------|:----|:--------|:----|
| Word count   | 4   | 10.3626 | 23  |
|
|
|
| Label | Training Sample Count |
|:------|:----------------------|
| 0     | 50                    |
| 1     | 50                    |
| 2     | 50                    |
| 3     | 50                    |
| 4     | 22                    |
| 5     | 50                    |
| 6     | 50                    |
| 7     | 50                    |
| 8     | 50                    |
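Per-label samples like the counts above are typically drawn with setfit's `sample_dataset` helper, which caps each label at `num_samples` but keeps smaller groups as-is (consistent with label 4 having only 22 examples). The sketch below is an assumption about how such a split could be built; the source file `products.csv` is hypothetical, and the card does not state how its training set was actually sampled.

```python
from datasets import load_dataset
from setfit import sample_dataset

# Hypothetical labeled source data with "text" and "label" columns.
full_dataset = load_dataset("csv", data_files="products.csv", split="train")

# Draw up to 50 examples per label for few-shot training.
train_dataset = sample_dataset(full_dataset, label_column="label", num_samples=50)
```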
|
|
|
### Training Hyperparameters |
|
- batch_size: (512, 512) |
|
- num_epochs: (20, 20) |
|
- max_steps: -1 |
|
- sampling_strategy: oversampling |
|
- num_iterations: 40 |
|
- body_learning_rate: (2e-05, 2e-05) |
|
- head_learning_rate: 2e-05 |
|
- loss: CosineSimilarityLoss |
|
- distance_metric: cosine_distance |
|
- margin: 0.25 |
|
- end_to_end: False |
|
- use_amp: False |
|
- warmup_proportion: 0.1 |
|
- seed: 42 |
|
- eval_max_steps: -1 |
|
- load_best_model_at_end: False |
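The list above maps directly onto setfit's `TrainingArguments`; the following is a minimal sketch of the corresponding configuration, to be passed to a `Trainer` as in the earlier training sketch. Options not shown (such as the cosine distance metric and `eval_max_steps`) match the library defaults.

```python
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(512, 512),            # (embedding phase, classifier phase)
    num_epochs=(20, 20),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=40,
    body_learning_rate=(2e-05, 2e-05),
    head_learning_rate=2e-05,
    loss=CosineSimilarityLoss,
    margin=0.25,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    seed=42,
    load_best_model_at_end=False,
)
```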
|
|
|
### Training Results |
|
| Epoch   | Step | Training Loss | Validation Loss |
|:-------:|:----:|:-------------:|:---------------:|
| 0.0152  | 1    | 0.4966        | -               |
| 0.7576  | 50   | 0.184         | -               |
| 1.5152  | 100  | 0.037         | -               |
| 2.2727  | 150  | 0.0256        | -               |
| 3.0303  | 200  | 0.0014        | -               |
| 3.7879  | 250  | 0.0002        | -               |
| 4.5455  | 300  | 0.0006        | -               |
| 5.3030  | 350  | 0.0001        | -               |
| 6.0606  | 400  | 0.0001        | -               |
| 6.8182  | 450  | 0.0001        | -               |
| 7.5758  | 500  | 0.0001        | -               |
| 8.3333  | 550  | 0.0001        | -               |
| 9.0909  | 600  | 0.0001        | -               |
| 9.8485  | 650  | 0.0001        | -               |
| 10.6061 | 700  | 0.0001        | -               |
| 11.3636 | 750  | 0.0001        | -               |
| 12.1212 | 800  | 0.0001        | -               |
| 12.8788 | 850  | 0.0001        | -               |
| 13.6364 | 900  | 0.0001        | -               |
| 14.3939 | 950  | 0.0001        | -               |
| 15.1515 | 1000 | 0.0001        | -               |
| 15.9091 | 1050 | 0.0001        | -               |
| 16.6667 | 1100 | 0.0001        | -               |
| 17.4242 | 1150 | 0.0           | -               |
| 18.1818 | 1200 | 0.0           | -               |
| 18.9394 | 1250 | 0.0           | -               |
| 19.6970 | 1300 | 0.0           | -               |
|
|
|
### Framework Versions |
|
- Python: 3.10.12 |
|
- SetFit: 1.1.0.dev0 |
|
- Sentence Transformers: 3.1.1 |
|
- Transformers: 4.46.1 |
|
- PyTorch: 2.4.0+cu121 |
|
- Datasets: 2.20.0 |
|
- Tokenizers: 0.20.0 |
|
|
|
## Citation |
|
|
|
### BibTeX |
|
```bibtex |
|
@article{https://doi.org/10.48550/arxiv.2209.11055, |
|
doi = {10.48550/ARXIV.2209.11055}, |
|
url = {https://arxiv.org/abs/2209.11055}, |
|
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, |
|
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, |
|
title = {Efficient Few-Shot Learning Without Prompts}, |
|
publisher = {arXiv}, |
|
year = {2022}, |
|
copyright = {Creative Commons Attribution 4.0 International} |
|
} |
|
``` |
|
|
|
<!-- |
|
## Glossary |
|
|
|
*Clearly define terms in order to be accessible across audiences.* |
|
--> |
|
|
|
<!-- |
|
## Model Card Authors |
|
|
|
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* |
|
--> |
|
|
|
<!-- |
|
## Model Card Contact |
|
|
|
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* |
|
--> |