### Best practice

During the training of **OS-Atlas-4B**, we randomly sampled predefined prompts to wrap the grounding data. Additionally, we scaled the relative coordinates of each element (in the range [0, 1]) by multiplying them by 1000 before inputting them into the model for training.
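As a minimal sketch of the scaling step described above (the helper name and the nested-tuple bbox layout are assumptions for illustration, not part of the dataset tooling):

```python
# Hypothetical helper: scale relative bbox coordinates in [0, 1] to the
# [0, 1000] integer range used when wrapping grounding data for training.
def scale_bbox(rel_bbox, factor=1000):
    """rel_bbox: ((x1, y1), (x2, y2)) with values in [0, 1]."""
    return tuple(
        (round(x * factor), round(y * factor)) for x, y in rel_bbox
    )

print(scale_bbox(((0.004, 0.955), (0.036, 0.970))))
# ((4, 955), (36, 970))
```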

Below is an example of a data entry:

```
...
}
```

**OS-Atlas-7B**

```
{
    "conversations": [
        {
            "from": "human",
            "value": "<image>\nUsing the provided screenshot, I'll describe webpage elements for you to locate (with bbox).\n<|object_ref_start|>Facebook<|object_ref_end|>\n<|object_ref_start|>Subscribe<|object_ref_end|>\n<|object_ref_start|>Twitter<|object_ref_end|>\n<|object_ref_start|>Read More<|object_ref_end|>\n<|object_ref_start|>Read More<|object_ref_end|>"
        },
        {
            "from": "gpt",
            "value": "<|object_ref_start|>Facebook<|object_ref_end|><|box_start|>(4,955),(36,970)<|box_end|>\n<|object_ref_start|>Subscribe<|object_ref_end|><|box_start|>(4,913),(43,932)<|box_end|>\n<|object_ref_start|>Twitter<|object_ref_end|><|box_start|>(39,955),(62,970)<|box_end|>\n<|object_ref_start|>Read More<|object_ref_end|><|box_start|>(30,138),(73,157)<|box_end|>\n<|object_ref_start|>Read More<|object_ref_end|><|box_start|>(30,139),(73,155)<|box_end|>"
        }
    ]
}
```
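In the `gpt` turn above, each element is packed as `<|object_ref_start|>label<|object_ref_end|><|box_start|>(x1,y1),(x2,y2)<|box_end|>` with coordinates in the [0, 1000] range. A minimal sketch (hypothetical parser, not part of the released tooling) for pulling the labels and boxes back out:

```python
import re

# Hypothetical parser for the object_ref/box token spans shown above;
# yields (label, (x1, y1), (x2, y2)) with coordinates in [0, 1000].
PATTERN = re.compile(
    r"<\|object_ref_start\|>(.*?)<\|object_ref_end\|>"
    r"<\|box_start\|>\((\d+),(\d+)\),\((\d+),(\d+)\)<\|box_end\|>"
)

def parse_boxes(value):
    return [
        (label, (int(x1), int(y1)), (int(x2), int(y2)))
        for label, x1, y1, x2, y2 in PATTERN.findall(value)
    ]

sample = ("<|object_ref_start|>Facebook<|object_ref_end|>"
          "<|box_start|>(4,955),(36,970)<|box_end|>")
print(parse_boxes(sample))  # [('Facebook', (4, 955), (36, 970))]
```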
The prompts we used are stored in `prompts.json`.
***