---
license: cc-by-nc-nd-4.0
language:
- en
- de
- ko
library_name: transformers
---
# 🌞🚀 SOLAR-polyglot-4x10.7

A multilingual experiment based on my Mixtral collection, [Polyglot](https://huggingface.co/collections/macadeliccc/polyglot-65a2027a90b5e87bcdaa5e12).

![solar](solar-polyglot.png)

The model is proficient in:
  + English
  + German
  + Korean

## 🌅 Code Example

An example with an evaluation script is also available in [Colab](https://colab.research.google.com/drive/10FWCLODU_EFclVOFOlxNYMmSiLilGMBZ?usp=sharing).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

def generate_response(prompt):
    """
    Generate a response from the model based on the input prompt.

    Args:
    prompt (str): Prompt for the model.

    Returns:
    str: The generated response from the model.
    """
    # Tokenize the input prompt and move it to the model's device
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

    # Generate output tokens
    outputs = model.generate(
        **inputs,
        max_new_tokens=512,
        eos_token_id=tokenizer.eos_token_id,
        pad_token_id=tokenizer.pad_token_id,
    )

    # Decode the generated tokens to a string
    response = tokenizer.decode(outputs[0], skip_special_tokens=True)

    return response


# Load the model and tokenizer
# (4-bit loading requires the bitsandbytes and accelerate packages)
model_id = "macadeliccc/SOLAR-polyglot-4x10.7"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, load_in_4bit=True)

prompt = "Explain the proof of Fermat's Last Theorem and its implications in number theory."


print("Response:")
print(generate_response(prompt), "\n")
```
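
Since the model targets English, German, and Korean, the same `generate_response` helper works across all three languages. A minimal sketch; the prompts below are illustrative placeholders, not outputs from the model:

```python
# Illustrative prompts in each supported language.
prompts = [
    "Explain the difference between weather and climate.",   # English
    "Erkläre den Satz des Pythagoras in einfachen Worten.",  # German: "Explain the Pythagorean theorem in simple terms."
    "상대성 이론을 간단히 설명해 주세요.",                   # Korean: "Please explain the theory of relativity briefly."
]

for p in prompts:
    print(generate_response(p), "\n")
```

The `load_in_4bit=True` shortcut above is equivalent to passing an explicit `BitsAndBytesConfig`, which also exposes further quantization options:

```python
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Quantize the model to 4-bit on load.
bnb_config = BitsAndBytesConfig(load_in_4bit=True)
model = AutoModelForCausalLM.from_pretrained(model_id, quantization_config=bnb_config)
```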


## Evaluations 

TODO


### 📚 Citations

```bibtex
@misc{kim2023solar,
      title={SOLAR 10.7B: Scaling Large Language Models with Simple yet Effective Depth Up-Scaling}, 
      author={Dahyun Kim and Chanjun Park and Sanghoon Kim and Wonsung Lee and Wonho Song and Yunsu Kim and Hyeonwoo Kim and Yungi Kim and Hyeonju Lee and Jihoo Kim and Changbae Ahn and Seonghoon Yang and Sukyung Lee and Hyunbyung Park and Gyoungjin Gim and Mikyoung Cha and Hwalsuk Lee and Sunghun Kim},
      year={2023},
      eprint={2312.15166},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```