---
license: apache-2.0
datasets:
- smirki/UI_REASONING_v1.01
language:
- en
base_model:
- smirki/UIGEN-T1.1-Qwen-14B
tags:
- code
- ui
- generation
- uigen
---

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64d1129297ca59bcf7458d07/VSplF7AM1PJPzeR9FlDhE.png)

# **Model Card for UIGEN-T1.1**

New and improved reasoning traces. Better UI generation. Smarter decisions. Better code generation! Trained on a dataset of 700+ examples.
**Use budget forcing:** append the word `think` to the end of the assistant generation to force it to keep reasoning, or append `answer` to make it write code.
SFT on a single 4090 for 4 hours.

## **Model Summary**  
UIGEN-T1.1 is a **14-billion parameter transformer model** fine-tuned on **Qwen2.5-Coder-14B-Instruct**. It is designed for **reasoning-based UI generation**, leveraging a complex chain-of-thought approach to produce **robust HTML and CSS-based UI components**. Currently, it is limited to **basic applications such as dashboards, landing pages, and sign-up forms**.  

## **Model Details**  

### **Model Description**  
UIGEN-T1.1 generates **HTML and CSS-based UI layouts** by reasoning through design principles. While it has a strong **chain-of-thought reasoning process**, it is currently **limited to text-based UI elements and simpler frontend applications**. The model **excels at dashboards, landing pages, and sign-up forms**, but **lacks advanced interactivity** (e.g., JavaScript-heavy functionality).  

- **Developed by:** [smirki](https://huggingface.co/smirki)  
- **Shared by:** [smirki](https://huggingface.co/smirki)  
- **Model type:** Transformer-based  
- **Language(s) (NLP):** English  
- **License:** Apache 2.0  
- **Finetuned from model:** Qwen2.5-Coder-14B-Instruct  

### **Model Sources**  
- **Repository:** (Will be uploaded to GitHub soon)  
- **Hosted on:** [Hugging Face](https://huggingface.co/smirki)  
- **Demo:** Coming soon

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64d1129297ca59bcf7458d07/617YL3OlJHflvR63qbA27.png)


## **Uses**  

### **Direct Use**  
- Generates HTML and CSS code for **basic UI elements**  
- Best suited for **dashboards, landing pages, and sign-up forms**  
- Requires **manual post-processing** to refine UI outputs  
- **May require appending the word "answer" to the end of the input prompt** for better results  

### **Downstream Use (optional)**  
- Can be fine-tuned further for **specific frontend frameworks (React, Vue, etc.)**  
- May be integrated into **no-code/low-code UI generation tools**  

### **Out-of-Scope Use**  
- Not suitable for **complex frontend applications** involving JavaScript-heavy interactions  
- May not generate **fully production-ready** UI code  
- **Limited design variety** – biased towards **basic frontend layouts**  

## **Bias, Risks, and Limitations**  

### **Biases**  
- **Strong bias towards basic frontend design patterns** (may not generate creative or advanced UI layouts)  
- **May produce repetitive designs** due to limited training scope  

### **Limitations**  
- **Artifacting issues**: Some outputs may contain formatting artifacts  
- **Limited generalization**: Performs best in **HTML + CSS UI generation**, but **not robust for complex app logic**  
- **May require prompt engineering** (e.g., adding "answer" to input for better results)  

## **How to Get Started with the Model**  

### **Example Model Template**  
```plaintext
<|im_start|>user
{question}<|im_end|>
<|im_start|>assistant
<|im_start|>think
{reasoning}<|im_end|>
<|im_start|>answer
```

### **Basic Inference Code**  
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "smirki/UIGEN-T1.1-14B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).to("cuda")

prompt = """<|im_start|>user
Make a dark-themed dashboard for an oil rig.<|im_end|>
<|im_start|>assistant
<|im_start|>think
"""

inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=12012, do_sample=True, temperature=0.7)  # max_new_tokens should exceed 12k to fit the full reasoning trace and answer

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
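Budget forcing, as described above, amounts to plain string manipulation on the chat template: append `think` to the partial assistant generation to buy more reasoning, or `answer` to push the model into writing code. A minimal sketch of the prompt construction (the `build_prompt` helper is illustrative, not part of any released API):

```python
from typing import Optional

def build_prompt(question: str, reasoning: str = "", force: Optional[str] = None) -> str:
    """Assemble the UIGEN-T1.1 chat template, optionally budget-forcing a phase.

    force="think" extends the reasoning phase; force="answer" pushes the
    model to emit code. This is a hypothetical helper for illustration.
    """
    prompt = (
        "<|im_start|>user\n"
        f"{question}<|im_end|>\n"
        "<|im_start|>assistant\n"
        "<|im_start|>think\n"
    )
    prompt += reasoning
    if force is not None:
        # Budget forcing: append the phase tag so generation continues in that phase.
        prompt += f"\n<|im_start|>{force}\n"
    return prompt

# Force the answer phase after a partial reasoning trace:
prompt = build_prompt(
    "Make a dark-themed dashboard for an oil rig.",
    reasoning="The layout needs a sidebar, KPI cards, and a dark palette...",
    force="answer",
)
print(prompt)
```

The resulting string is passed to the tokenizer exactly as in the inference snippet above; the only difference is the forced phase tag at the end.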

## **Training Details**  

### **Training Data**  
- **Based on:** Qwen2.5-Coder-14B-Instruct  
- **Fine-tuned on:** UI-related datasets with reasoning-based HTML/CSS examples  

### **Training Procedure**  
- **Preprocessing:** Standard text-tokenization using Hugging Face transformers  
- **Training Precision:** **bf16 mixed precision**, quantized to Q8 for release  

## **Evaluation**  

### **Testing Data, Factors & Metrics**  
- **Testing Data:** Internal UI design-related datasets  
- **Evaluation Factors:** Bias towards basic UI components, robustness in reasoning, output quality  
- **Metrics:** Subjective evaluation based on UI structure, correctness, and usability  

### **Results**  
- **Strengths:**  
  - **Good at reasoning-based UI layouts**  
  - **Generates structured and valid HTML/CSS**  
- **Weaknesses:**  
  - **Limited design diversity**  
  - **Artifacting in outputs**  

## **Technical Specifications**  

### **Model Architecture and Objective**  
- **Architecture:** Transformer-based LLM fine-tuned for UI reasoning  
- **Objective:** Generate **robust frontend UI layouts with chain-of-thought reasoning**  

### **Compute Infrastructure**  
- **Hardware Requirements:** 12GB VRAM recommended
- **Software Requirements:**  
  - Transformers library (Hugging Face)  
  - PyTorch  

## **Citation**  
If using this model, please cite:  

**BibTeX:**  
```bibtex
@misc{smirki_UIGEN-T1.1,
  title={UIGEN-T1.1: Chain-of-Thought UI Generation Model},
  author={smirki},
  year={2025},
  publisher={Hugging Face},
  url={https://huggingface.co/smirki/UIGEN-T1.1-14B}
}
```

## **More Information**  
- **GitHub Repository:** (Coming soon)  
- **Web Demo:** (Coming soon)  

## **Model Card Authors**  
- **Author:** smirki  

## **Model Card Contact**  
- **Contact:** [smirki on Hugging Face](https://huggingface.co/smirki)


![image/png](https://cdn-uploads.huggingface.co/production/uploads/64d1129297ca59bcf7458d07/VLs2LyOPXV4GZ2feXpD7F.png)


![image/png](https://cdn-uploads.huggingface.co/production/uploads/64d1129297ca59bcf7458d07/boAgm__7mbD_B37OZzzyw.png)


![image/png](https://cdn-uploads.huggingface.co/production/uploads/64d1129297ca59bcf7458d07/aBn-uzBsmK7vj-CxXRyBi.png)


![image/png](https://cdn-uploads.huggingface.co/production/uploads/64d1129297ca59bcf7458d07/HaPat6448BpqneaBU47bz.png)


![image/png](https://cdn-uploads.huggingface.co/production/uploads/64d1129297ca59bcf7458d07/8VuOPWMSJlu3kxmUQJb6R.png)


![image/png](https://cdn-uploads.huggingface.co/production/uploads/64d1129297ca59bcf7458d07/RgFpQigIYes7wvzulZnkg.png)


![image/png](https://cdn-uploads.huggingface.co/production/uploads/64d1129297ca59bcf7458d07/alBaCYJSLKyomF55XjOv-.png)


![image/png](https://cdn-uploads.huggingface.co/production/uploads/64d1129297ca59bcf7458d07/k3uu2IPU_wIWco45RwdOV.png)

---