---
license: mit
language:
- la
---

# LatinCy w2v models

Details coming soon.

## Evaluation

The models included here have been evaluated on an analogy task (details coming soon).

### Analogy solved @1

| Model | Accuracy |
|--------------------------------------|------------|
| Latincy W2V CBOW 100-10 (v0.0.2) | 0.581240 |
| Latincy W2V CBOW 100-5 (v0.0.2) | 0.580402 |
| **Latincy W2V CBOW 300-10 (v0.0.2)** | **0.628978** |
| Latincy W2V CBOW 300-5 (v0.0.2) | 0.623534 |
| Latincy W2V CBOW 50-10 (v0.0.2) | 0.437605 |
| Latincy W2V CBOW 50-5 (v0.0.2) | 0.436348 |
| Latincy W2V SG 100-10 (v0.0.2) | 0.522613 |
| Latincy W2V SG 100-5 (v0.0.2) | 0.548995 |
| Latincy W2V SG 300-10 (v0.0.2) | 0.472362 |
| Latincy W2V SG 300-5 (v0.0.2) | 0.479481 |
| Latincy W2V SG 50-10 (v0.0.2) | 0.389028 |
| Latincy W2V SG 50-5 (v0.0.2) | 0.414573 |

### Analogy solved @5

| Model | Accuracy |
|--------------------------------------|------------|
| Latincy W2V CBOW 100-10 (v0.0.2) | 0.781826 |
| Latincy W2V CBOW 100-5 (v0.0.2) | 0.798157 |
| **Latincy W2V CBOW 300-10 (v0.0.2)** | **0.820771** |
| Latincy W2V CBOW 300-5 (v0.0.2) | 0.819933 |
| Latincy W2V CBOW 50-10 (v0.0.2) | 0.660385 |
| Latincy W2V CBOW 50-5 (v0.0.2) | 0.674623 |
| Latincy W2V SG 100-10 (v0.0.2) | 0.749581 |
| Latincy W2V SG 100-5 (v0.0.2) | 0.773869 |
| Latincy W2V SG 300-10 (v0.0.2) | 0.739531 |
| Latincy W2V SG 300-5 (v0.0.2) | 0.745394 |
| Latincy W2V SG 50-10 (v0.0.2) | 0.635260 |
| Latincy W2V SG 50-5 (v0.0.2) | 0.656198 |
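
The "solved @k" figures report how often the gold answer to an analogy "a is to b as c is to ?" appears among the top-k candidates ranked by cosine similarity to b − a + c (the usual 3CosAdd objective), with the three query words excluded. Since the actual evaluation code is not published here, the following is only a minimal plain-numpy sketch of that metric; the Latin toy vectors in the usage note are invented for illustration.

```python
import numpy as np

def solve_analogy(vectors, a, b, c, topn=5):
    """Return the topn candidates for 'a is to b as c is to ?'
    using the 3CosAdd objective (b - a + c), excluding the query words."""
    words = list(vectors)
    # Normalize all vectors so dot products are cosine similarities.
    mat = np.stack([vectors[w] for w in words])
    mat = mat / np.linalg.norm(mat, axis=1, keepdims=True)
    unit = dict(zip(words, mat))
    target = unit[b] - unit[a] + unit[c]
    target = target / np.linalg.norm(target)
    sims = mat @ target
    order = np.argsort(-sims)
    ranked = [words[i] for i in order if words[i] not in {a, b, c}]
    return ranked[:topn]

def accuracy_at_k(vectors, analogies, k):
    """Fraction of (a, b, c, gold) analogies whose gold answer
    appears among the top-k predictions."""
    hits = sum(gold in solve_analogy(vectors, a, b, c, topn=k)
               for a, b, c, gold in analogies)
    return hits / len(analogies)
```

For example, with hand-built vectors where one axis encodes gender and one encodes royalty, `solve_analogy(vectors, "vir", "femina", "rex", topn=1)` returns `["regina"]`, and `accuracy_at_k` over a list of such quadruples gives the numbers reported in the tables above (accuracy @1 and @5).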