vincenthuynh committed on
Commit 9954e86
1 Parent(s): bc8830b

update README with code on how to load and use

Files changed (1):
  1. README.md +41 -1
README.md CHANGED
@@ -18,7 +18,11 @@ model_type: t5
 
 # **Text-to-API Command Model**
 
- This repository contains a fine-tuned T5-Small model trained to convert natural language commands into standardized API commands. The model is designed for use cases where human-written instructions need to be translated into machine-readable commands for home automation systems or other API-driven platforms.
+ This repository contains a fine-tuned T5-Small model trained to convert natural language commands into
+ standardized API commands. The model is designed for use cases where human-written instructions need
+ to be translated into machine-readable commands for home automation systems or other API-driven platforms.
+
+ ---
 
 ## **Model Details**
 
@@ -73,3 +77,39 @@ This model is designed for:
 - Ambiguous or overly complex inputs may produce unexpected outputs.
 - Fine-tuning on domain-specific data is recommended for specialized use cases.
 
+
+ ## **How to Use the Model**
+
+ ### **Loading the Model**
+
+ ### **Step 1: Install Required Libraries**
+
+ To use this model, first install the required libraries (sentencepiece is needed by the T5 tokenizer):
+
+ ```bash
+ pip install transformers torch sentencepiece
+ ```
+
+ ### **Step 2: Load and Use the Model**
+
+ You can use the following Python code to generate API commands from natural language inputs:
+
+ ```python
+ import torch
+ from transformers import T5Tokenizer, T5ForConditionalGeneration
+
+ # Load the tokenizer and model
+ tokenizer = T5Tokenizer.from_pretrained('vincenthuynh/SLM_CS576')
+ model = T5ForConditionalGeneration.from_pretrained('vincenthuynh/SLM_CS576')
+
+ # Function to generate API commands
+ def generate_api_command(model, tokenizer, text, device='cpu', max_length=50):
+     # Tokenize the input and move it to the same device as the model
+     input_ids = tokenizer.encode(text, return_tensors='pt').to(device)
+     with torch.no_grad():
+         generated_ids = model.generate(input_ids=input_ids, max_length=max_length, num_beams=5, early_stopping=True)
+     return tokenizer.decode(generated_ids[0], skip_special_tokens=True)
+
+ # Example usage
+ command = "Please turn off the kitchen lights"
+ api_command = generate_api_command(model, tokenizer, command)
+ print(api_command)
+ ```
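+
+ If a GPU is available, the model can be moved onto it and the same helper reused; the snippet below is a minimal sketch building on the code above (CPU remains the default):
+
+ ```python
+ # Optional: run generation on a GPU when one is available
+ device = 'cuda' if torch.cuda.is_available() else 'cpu'
+ model = model.to(device)
+
+ api_command = generate_api_command(model, tokenizer, "Please turn off the kitchen lights", device=device)
+ print(api_command)
+ ```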