---
license: apache-2.0
language:
- it
library_name: peft
pipeline_tag: text-generation
tags:
- legal
base_model: mistralai/Mistral-7B-Instruct-v0.1
---
## Model Description
A Mistral-7B-Instruct-v0.1 model fine-tuned to generate a title from the text of Italian law articles. It was fine-tuned on a set of 100k text-title pairs drawn from Italian legislation, and it can be used to produce titles for articles or attachments that lack a pre-defined one.
- Developed by: Andrea Colombo, Politecnico di Milano
- Model type: text generation
- Language(s) (NLP): Italian
- License: Apache 2.0
- Finetuned from model: mistralai/Mistral-7B-Instruct-v0.1
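
## How to Get Started with the Model

A minimal usage sketch with `transformers` and `peft` is shown below. The adapter repository id, the instruction-style prompt format, and the generation settings are assumptions and may need to be adapted to this model's actual Hub id and training prompt.

```python
# Minimal usage sketch: load the base model, attach the LoRA adapter, generate a title.
# The adapter id and prompt template below are placeholders / assumptions.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "mistralai/Mistral-7B-Instruct-v0.1"
adapter_id = "<this-adapter-repo-id>"  # replace with this model's Hub id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)

# Assumed prompt: ask for a title for the article text (in Italian).
article_text = "1. Ai fini del presente decreto si intende per ..."
prompt = f"[INST] Genera un titolo per il seguente articolo di legge:\n{article_text} [/INST]"

inputs = tokenizer(prompt, return_tensors="pt").to(base_model.device)
output = model.generate(**inputs, max_new_tokens=64, do_sample=False)

# Decode only the newly generated tokens (the proposed title).
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```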
## Training Details
### Training Procedure
The model was trained for 100 steps with a batch size of 4, using 4-bit quantization via bitsandbytes and a LoRA rank of 64. We used the paged Adam optimizer, a learning rate of 0.004, and a cosine learning-rate scheduler with a warm-up fraction of 0.03.
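
A configuration sketch consistent with these hyperparameters is shown below, using `transformers`, `peft`, and `bitsandbytes`. The LoRA target modules, alpha/dropout values, output directory, and precision settings are assumptions; only the values stated above (4-bit quantization, rank 64, 100 steps, batch size 4, learning rate 0.004, cosine schedule, 0.03 warm-up, paged Adam) come from the training description.

```python
# Configuration sketch matching the reported hyperparameters; not the exact training script.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig, TrainingArguments
from peft import LoraConfig

# 4-bit quantization via bitsandbytes.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",               # assumed quantization type
    bnb_4bit_compute_dtype=torch.float16,    # assumed compute dtype
)

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-Instruct-v0.1",
    quantization_config=bnb_config,
    device_map="auto",
)

# LoRA adapter with rank 64; alpha, dropout, and target modules are assumptions.
lora_config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

training_args = TrainingArguments(
    output_dir="mistral-7b-it-law-titles",   # assumed name
    max_steps=100,
    per_device_train_batch_size=4,
    learning_rate=4e-3,
    lr_scheduler_type="cosine",
    warmup_ratio=0.03,
    optim="paged_adamw_32bit",               # paged Adam variant available in transformers
    fp16=True,
    logging_steps=10,
)
```

These objects would typically be passed, together with the text-title dataset, to a supervised fine-tuning trainer such as trl's `SFTTrainer`.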
## Evaluation
The best model checkpoint achieved an evaluation loss of approximately 1.003.