---
library_name: transformers
tags:
- text-generation
- text-generation-inference
- Inference Endpoints
license: mit
datasets:
- omeryentur/text-to-postgresql
language:
- en
metrics:
- rouge
pipeline_tag: text2text-generation
---
# Model Card for kampkelly/sql-generator
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This model is fine-tuned from google/flan-t5-base to achieve better accuracy at generating SQL queries.
It has been trained to generate a SQL query given a question and one or more database schemas.
It can be used in any application where SQL queries (particularly PostgreSQL queries) need to be generated from natural language.
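As a rough illustration of the task, the model takes a natural-language question together with the relevant schema(s) and returns a query. The prompt wording below is an assumption for illustration only, not the exact template used during fine-tuning:

```
# Illustrative input/output only; the exact prompt format used in training
# is not documented here, so treat this wording as an assumption.
question = "List the names of all customers who placed an order in 2023."
schema = (
    "CREATE TABLE customers (id INT PRIMARY KEY, name TEXT); "
    "CREATE TABLE orders (id INT PRIMARY KEY, "
    "customer_id INT REFERENCES customers(id), created_at DATE);"
)
prompt = f"Question: {question}\nSchema: {schema}\nSQL:"

# A well-formed response would be a PostgreSQL SELECT statement, e.g.:
# SELECT c.name FROM customers c JOIN orders o ON o.customer_id = c.id
# WHERE o.created_at >= '2023-01-01' AND o.created_at < '2024-01-01';
```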
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** Oghenerunor Adjekpiyede
- **Model type:** Text2TextGeneration
- **Language(s) (NLP):** English
- **License:** MIT
- **Finetuned from model [optional]:** google/flan-t5-base
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://huggingface.co/kampkelly/sql-generator
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
This model is intended for, and performs well at, generating SQL queries. It may not give satisfactory performance on other text-generation tasks or general-purpose use cases.
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
Use with 🤗 Transformers and PEFT:
```
import torch
from peft import PeftModel
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the base model and tokenizer, then attach the fine-tuned PEFT adapter.
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
model_base = AutoModelForSeq2SeqLM.from_pretrained(
    "google/flan-t5-base", torch_dtype=torch.bfloat16, trust_remote_code=True
)
peft_model_path = "kampkelly/sql-generator"  # this repository's PEFT adapter
model = PeftModel.from_pretrained(
    model_base,
    peft_model_path,
    torch_dtype=torch.bfloat16,
    is_trainable=False,
)

# `prompt` should contain the question and the relevant database schema(s).
input_ids = tokenizer(
    prompt, padding="max_length", max_length=300, truncation=True, return_tensors="pt"
).input_ids
model_output = model.generate(
    input_ids=input_ids,
    max_new_tokens=300,
    use_cache=True,
    num_beams=3,
    do_sample=True,
    top_k=50,
    top_p=0.75,
    temperature=0.1,
    early_stopping=True,
)
model_text_output = tokenizer.decode(model_output[0], skip_special_tokens=True)
print(model_text_output)
```
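The decoding settings above favor near-deterministic output: the low temperature (0.1) and restrictive top-p keep sampling close to greedy, while beam search (`num_beams=3`) with early stopping encourages complete, well-formed queries. If fully deterministic output is preferred, the sampling arguments can be dropped and beam search used on its own.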
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
This model is particularly good at generating SQL `SELECT` queries. Other statement types such as `CREATE`, `UPDATE`, and `DELETE` are not fully supported.