---
base_model:
  - ssmits/Falcon2-5.5B-multilingual
library_name: sentence-transformers
tags:
  - ssmits/Falcon2-5.5B-multilingual
license: apache-2.0
language:
  - es
  - fr
  - de
  - 'no'
  - sv
  - da
  - nl
  - pt
  - pl
  - ro
  - it
  - cs
pipeline_tag: text-classification
---

## Usage

Embeddings version of the base model ssmits/Falcon2-5.5B-multilingual. The `lm_head` layer of this model has been removed, so it can be used to produce embeddings. Out of the box it will not perform well: it needs further fine-tuning, as demonstrated by intfloat/e5-mistral-7b-instruct. Additionally, instead of a normalization layer, the hidden layers are followed by both a classical weight and a bias 1-dimensional array of 4096 values. Further research is needed to determine whether this architecture will fully function when a classification head is added in combination with the transformers library.
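
Since the `lm_head` is removed, embeddings have to be pooled from the hidden states yourself. Below is a minimal sketch using mean pooling over the last hidden state; the pooling strategy is an assumption (the card does not prescribe one), and the repository id in the commented-out usage should be replaced with this model's actual id.

```python
# Hedged sketch: pooling sentence embeddings from a decoder model whose
# lm_head has been removed. Mean pooling is an assumed choice, not one
# specified by the model card.
import torch


def mean_pool(last_hidden_state: torch.Tensor,
              attention_mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings over the sequence, ignoring padding."""
    mask = attention_mask.unsqueeze(-1).float()      # (batch, seq, 1)
    summed = (last_hidden_state * mask).sum(dim=1)   # (batch, hidden)
    counts = mask.sum(dim=1).clamp(min=1e-9)         # avoid division by zero
    return summed / counts


# Usage against the actual checkpoint (commented out; the weights are large,
# and the repo id below is a placeholder for this model's id):
# from transformers import AutoTokenizer, AutoModel
# tokenizer = AutoTokenizer.from_pretrained("<this-repo-id>")
# model = AutoModel.from_pretrained("<this-repo-id>")
# batch = tokenizer(["Hola mundo"], return_tensors="pt", padding=True)
# with torch.no_grad():
#     out = model(**batch)
# emb = mean_pool(out.last_hidden_state, batch["attention_mask"])  # (1, 4096)
```

The `clamp` guards against all-padding rows; for downstream similarity search you would typically also L2-normalize the pooled vectors.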