---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# Holy Fuck
This model was a proof of concept. It has thinking (and other) tags, which made the quality of the output really f*ckin good.
(Tested as a Q8 GGUF.)
It does really well as a Q8: it's fast as fuck boi, and small.
This is just a LoRA checkpoint, so once the final product is done, expect something better.
A link to the final product will be posted here when it's done.
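
For reference, here is a minimal sketch of loading the merged checkpoint with transformers and generating from it. The local path is a placeholder; point it at wherever the merge output (or this repo) lives.

```python
# Minimal sketch: load the merged checkpoint and generate with transformers.
# The path below is a placeholder for the merged model directory / this repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./output-model-directory"  # placeholder path
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```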
# output-model-directory
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the passthrough merge method.
### Models Merged
The following models were included in the merge:
* ./3b + ./thinking-3b (the ./3b base with the ./thinking-3b LoRA applied on top, which is what mergekit's `+` notation means)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: ./3b+./thinking-3b
merge_method: passthrough
```
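
To reproduce the merge, this config can be passed to mergekit's `mergekit-yaml` CLI (`mergekit-yaml config.yaml ./output-model-directory`). Below is a rough sketch using mergekit's Python API instead; the config filename and output path are placeholders, and option names can shift between mergekit versions, so treat it as a guide rather than the exact command that produced this repo.

```python
# Rough sketch: run the passthrough merge via mergekit's Python API.
# "config.yaml" holds the YAML shown above; paths are placeholders.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./output-model-directory",
    options=MergeOptions(copy_tokenizer=True),  # add cuda=True to merge on GPU
)
```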