---
base_model:
- ssmits/Falcon2-5.5B-multilingual
library_name: sentence-transformers
tags:
- ssmits/Falcon2-5.5B-multilingual
license: apache-2.0
language:
- es
- fr
- de
- 'no'
- sv
- da
- nl
- pt
- pl
- ro
- it
- cs
pipeline_tag: text-classification
---

## Usage
This is an embeddings version of the base model [ssmits/Falcon2-5.5B-multilingual](https://huggingface.co/ssmits/Falcon2-5.5B-multilingual).
The 'lm_head' layer of this model has been removed, so it can be used to produce embeddings. Out of the box it will not perform well; it needs further fine-tuning, as demonstrated by [intfloat/e5-mistral-7b-instruct](https://huggingface.co/intfloat/e5-mistral-7b-instruct).
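
Below is a minimal sketch of extracting embeddings with the transformers library. It assumes the model loads via `AutoModel` and uses mean pooling over the last hidden state, which is one common choice and not prescribed by this card; the repo id is a placeholder for this model's actual id.

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "ssmits/Falcon2-5.5B-multilingual"  # placeholder: substitute this model's id
tokenizer = AutoTokenizer.from_pretrained(model_id)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModel.from_pretrained(model_id, torch_dtype=torch.bfloat16)

texts = ["Hola mundo", "Hej världen"]
batch = tokenizer(texts, padding=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**batch).last_hidden_state  # (batch, seq_len, 4096)

# Mean pooling over non-padding tokens (an assumption, not mandated by the card).
mask = batch["attention_mask"].unsqueeze(-1)
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)  # torch.Size([2, 4096])
```
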
Additionally, instead of a normalization layer, the hidden layers are followed by both a classical weight and a bias, each a 1-dimensional array of 4096 values.
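
Conceptually, this replacement amounts to an elementwise affine map rather than a normalization step; the toy snippet below only illustrates the shape of that operation (the values are placeholders, not the trained parameters):

```python
import torch

hidden_size = 4096
weight = torch.ones(hidden_size)   # learned 1-D weight vector (placeholder values)
bias = torch.zeros(hidden_size)    # learned 1-D bias vector (placeholder values)

h = torch.randn(2, 10, hidden_size)  # (batch, seq_len, hidden)
out = h * weight + bias              # elementwise affine; no mean/variance statistics
```
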
Further research is needed to determine whether this architecture will fully function when a classification head is added in combination with the transformers library.
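
If it does work, the setup would presumably look like the untested sketch below, which relies on transformers' generic sequence-classification head for the Falcon architecture; the repo id and label count are assumptions.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "ssmits/Falcon2-5.5B-multilingual"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
# num_labels=2 is an arbitrary example; the head is randomly initialized and untrained.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

batch = tokenizer(["Ceci est un test"], return_tensors="pt")
with torch.no_grad():
    logits = model(**batch).logits
print(logits.shape)  # torch.Size([1, 2])
```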