# Phi3-Legal-Finetuned
This is a fine-tuned version of the Phi-3 Mini model for legal text generation tasks.
## Model Details

- Base Model: Microsoft Phi-3 Mini 128K Instruct (`microsoft/Phi-3-mini-128k-instruct`)
- Fine-tuned on: legal documents and summaries
- Context Length: 128K tokens
- License: MIT
## Usage

Load the model and tokenizer with Hugging Face Transformers:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "sairamn/Phi3-Legal-Finetuned"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
```
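The loaded model can then be used for text generation. The snippet below is a minimal sketch that continues from the loading code above; it assumes the fine-tuned tokenizer retains the base Phi-3 instruct chat template, and the prompt and generation settings are illustrative rather than values prescribed by this repository.

```python
# Continues from the loading snippet above (model and tokenizer already defined).
# Illustrative prompt; assumes the tokenizer ships a chat template, as the
# Phi-3 instruct tokenizers do. If it does not, tokenize a plain string instead.
messages = [
    {"role": "user", "content": "Summarize the key obligations in a standard non-disclosure agreement."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Greedy decoding with a modest output budget; adjust as needed.
outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```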
## Limitations
- The model is not a substitute for professional legal advice.
- It may generate incorrect or biased information.
## Acknowledgments
- Based on Microsoft Phi-3 Mini.
## Citation

If you use this model, please cite this repository (`sairamn/Phi3-Legal-Finetuned`) and the base model, `microsoft/Phi-3-mini-128k-instruct`.