---
license: llama2
language:
- ro
---

# Model Card for RoLlama2-7b-Base

<!-- Provide a quick summary of what the model is/does. -->

RoLlama2 is a family of pretrained and fine-tuned generative text models for Romanian. This is the repository for the **foundational 7B model**. Links to other models can be found at the bottom of this page.

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->
OpenLLM-Ro represents the first open-source effort to build an LLM specialized for Romanian. OpenLLM-Ro has developed and publicly released a collection of Romanian LLMs, comprising both foundational models and instruct and chat variants.


- **Developed by:** OpenLLM-Ro
- **Language(s):** Romanian
- **License:** Llama2 Community License Agreement
- **Continually pretrained from model:** [Llama-2-7b](https://huggingface.co/meta-llama/Llama-2-7b-hf)

### Model Sources

<!-- Provide the basic links for the model. -->

- **Repository:** https://github.com/OpenLLM-Ro/llama-recipes
- **Paper:** https://arxiv.org/abs/2406.18266

## Intended Use

### Intended Use Cases

RoLlama2 is intended for research use in Romanian. Base models can be adapted for a variety of natural language tasks, while instruction- and chat-tuned models are intended for assistant-like chat.
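
As a rough illustration of adapting the base model to a downstream task without fine-tuning, the sketch below uses few-shot prompting for Romanian sentiment classification. The prompt format and labels are hypothetical and only meant to show the general pattern; they are not an evaluation setup used for this model.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("OpenLLM-Ro/RoLlama2-7b-Base")
model = AutoModelForCausalLM.from_pretrained("OpenLLM-Ro/RoLlama2-7b-Base")

# Hypothetical few-shot prompt: two labeled examples followed by the input to classify.
prompt = (
    "Recenzie: Filmul a fost minunat, l-am urmărit cu plăcere.\nSentiment: pozitiv\n\n"
    "Recenzie: Serviciul a fost lent și mâncarea rece.\nSentiment: negativ\n\n"
    "Recenzie: Cartea m-a plictisit de la primele pagini.\nSentiment:"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=3, do_sample=False)

# Decode only the newly generated tokens (the predicted label).
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```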

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

Use in any manner that violates the license or any applicable laws or regulations, and use in languages other than Romanian.



## How to Get Started with the Model

Use the code below to get started with the model.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and the base model from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("OpenLLM-Ro/RoLlama2-7b-Base")
model = AutoModelForCausalLM.from_pretrained("OpenLLM-Ro/RoLlama2-7b-Base")

# Encode a Romanian prompt and generate a continuation of up to 100 new tokens.
input_text = "Mihai Eminescu a fost "
inputs = tokenizer(input_text, return_tensors="pt")

outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0]))
```
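
For GPU inference, the model can optionally be loaded in half precision with sampling enabled during generation. The snippet below is a minimal sketch using standard `transformers` options (`torch_dtype`, `device_map`, and common `generate` sampling parameters); the dtype and sampling values are illustrative assumptions, not settings recommended by the model authors.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("OpenLLM-Ro/RoLlama2-7b-Base")

# Load the weights in bfloat16 and let accelerate place them on the available GPU(s).
# These are generic transformers options, not settings prescribed by OpenLLM-Ro.
model = AutoModelForCausalLM.from_pretrained(
    "OpenLLM-Ro/RoLlama2-7b-Base",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

inputs = tokenizer("Mihai Eminescu a fost ", return_tensors="pt").to(model.device)

# Sample a continuation; temperature and top_p values are illustrative only.
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```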

## Benchmarks

| Model              | Average  | ARC      | MMLU     |Winogrande|HellaSwag | GSM8k    |TruthfulQA|
|--------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| Llama-2-7b         | 37.11    | 36.09    | **33.67**    | 57.60    | 48.00    | **5.08**     | 42.23    |  
| *RoLlama2-7b-Base* | ***38.03***  | ***37.95***  | *27.22*  | ***59.29***  | ***57.22***  | *2.53*   | ***44.00***  |  



## RoLlama2 Model Family

| Model              | Link  |
|--------------------|:--------:|
|*RoLlama2-7b-Base* | [link](https://huggingface.co/OpenLLM-Ro/RoLlama2-7b-Base)    |
|RoLlama2-7b-Instruct| [link](https://huggingface.co/OpenLLM-Ro/RoLlama2-7b-Instruct) |
|RoLlama2-7b-Chat    | [link](https://huggingface.co/OpenLLM-Ro/RoLlama2-7b-Chat) |
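
For assistant-like usage, the instruct and chat variants linked above are the intended entry point. The snippet below is a minimal sketch that assumes the RoLlama2-7b-Instruct tokenizer ships a chat template usable via `tokenizer.apply_chat_template`; consult that model's card for the exact prompt format it expects.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("OpenLLM-Ro/RoLlama2-7b-Instruct")
model = AutoModelForCausalLM.from_pretrained("OpenLLM-Ro/RoLlama2-7b-Instruct")

# Assumes the tokenizer defines a chat template; otherwise build the prompt manually
# following the format documented in the Instruct model card.
messages = [{"role": "user", "content": "Cine a fost Mihai Eminescu?"}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```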


## Citation 

```
@misc{masala2024vorbecstiromanecsterecipetrain,
      title={"Vorbe\c{s}ti Rom\^ane\c{s}te?" A Recipe to Train Powerful Romanian LLMs with English Instructions}, 
      author={Mihai Masala and Denis C. Ilie-Ablachim and Alexandru Dima and Dragos Corlatescu and Miruna Zavelca and Ovio Olaru and Simina Terian-Dan and Andrei Terian-Dan and Marius Leordeanu and Horia Velicu and Marius Popescu and Mihai Dascalu and Traian Rebedea},
      year={2024},
      eprint={2406.18266},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2406.18266}, 
}
```