---
language:
- en
library_name: transformers
license: other
---
# Model Card for ContractAssist

ContractAssist is an instruction-tuned model based on FLAN-T5-XXL, trained on data generated via ChatGPT for generating and/or modifying legal clauses.
## Model Details

### Model Description

- **Developed by:** Jaykumar Kasundra, Shreyans Dhankhar
- **Model type:** Language model
- **Language(s) (NLP):** en
- **License:** other
- **Resources for more information:**
## Uses

### Running the model on a GPU using different precisions
#### FP16
# pip install accelerate peft bitsandbytes
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the base FLAN-T5-XXL model and tokenizer in half precision
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-xxl")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-xxl", device_map="auto", torch_dtype=torch.float16)

input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
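Since ContractAssist was fine-tuned with PEFT on top of FLAN-T5-XXL, the adapter weights would typically be attached to the base model loaded above. The snippet below is a minimal sketch: the adapter identifier is a placeholder, as the card does not list the repository or path for the adapter weights.

```python
from peft import PeftModel

# Attach the ContractAssist PEFT adapter to the base FLAN-T5-XXL model loaded above.
# NOTE: the identifier below is a placeholder, not an actual repository name.
model = PeftModel.from_pretrained(model, "<ContractAssist-adapter-repo-or-path>")
model.eval()
```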
#### INT8
# pip install bitsandbytes accelerate
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Load the base FLAN-T5-XXL model in 8-bit to reduce GPU memory usage
tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-xxl")
model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-xxl", device_map="auto", load_in_8bit=True)

input_text = "translate English to German: How old are you?"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
### Direct Use

The model can be used directly to generate or modify legal clauses and to assist in drafting contracts. It likely works best on English-language text.
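For illustration, a clause-drafting instruction can be passed to the model in the same way as the snippets above; the prompt below is an assumed example, not one drawn from the training data.

```python
# Illustrative clause-drafting prompt. `tokenizer` and `model` are assumed to be
# loaded as shown above (base FLAN-T5-XXL plus the ContractAssist adapter).
input_text = (
    "Draft a confidentiality clause for a software development agreement "
    "between a vendor and a client."
)
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```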
## Compute Infrastructure

Amazon SageMaker Training Job.

### Hardware

1 x 24GB NVIDIA A10G

### Software

Transformers, PEFT, bitsandbytes
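The stack above (Transformers, PEFT, bitsandbytes) suggests parameter-efficient fine-tuning on a quantized FLAN-T5-XXL base, which fits on a single 24 GB A10G. The sketch below shows how these libraries are commonly combined for LoRA training; all hyperparameters are assumptions, as the card does not report the actual training configuration.

```python
# Minimal LoRA fine-tuning setup on an 8-bit FLAN-T5-XXL base (illustrative only;
# these hyperparameters are assumptions, not the values used to train ContractAssist).
from transformers import AutoModelForSeq2SeqLM
from peft import LoraConfig, TaskType, get_peft_model, prepare_model_for_kbit_training

base = AutoModelForSeq2SeqLM.from_pretrained(
    "google/flan-t5-xxl", device_map="auto", load_in_8bit=True
)
base = prepare_model_for_kbit_training(base)  # make the quantized model trainable

lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=16,                       # assumed LoRA rank
    lora_alpha=32,              # assumed scaling factor
    lora_dropout=0.05,
    target_modules=["q", "v"],  # T5 attention projections
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()
```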
## Citation

**BibTeX:**

## Model Card Authors

Jaykumar Kasundra, Shreyans Dhankhar