---
license: apache-2.0
---
|
<img src="https://huggingface.co/cognitivecomputations/fc-dolphin-2.6-mistral-7b-dpo-laser/resolve/main/fc-dolphin.jpg" width="600" /> |
|
by David, Fernando and Eric |
|
|
|
Sponsored by: [VAGO Solutions](https://vago-solutions.de) and [HyperSpace.Ai](https://hyperspace.computer/) |
|
|
|
Join our Discord! https://discord.gg/cognitivecomputations |
|
|
|
A function calling version of [cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser](https://huggingface.co/cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser) |
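As a rough illustration of how a function-calling request could be formatted for this model, here is a minimal sketch. The base Dolphin models use the ChatML prompt format; the exact function-spec schema shown (a JSON list in the system message, with a hypothetical `get_weather` tool) is an assumption for illustration, not necessarily the schema the model was trained on.

```python
import json

def build_prompt(system: str, functions: list, user: str) -> str:
    """Build a ChatML prompt that advertises available functions.

    NOTE: the function-spec layout is an assumption for illustration;
    only the ChatML wrapper tokens come from the base Dolphin models.
    """
    sys_block = system + "\nAvailable functions:\n" + json.dumps(functions, indent=2)
    return (
        f"<|im_start|>system\n{sys_block}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Hypothetical tool definition, for illustration only.
functions = [{
    "name": "get_weather",
    "parameters": {"city": {"type": "string"}},
}]

prompt = build_prompt(
    "You are a function calling assistant.",
    functions,
    "What is the weather in Paris?",
)
```

The prompt ends with an open `<|im_start|>assistant` turn so the model generates the function call (or a plain reply) as its completion.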
|
|
|
It follows the laserRMT implementation at https://github.com/cognitivecomputations/laserRMT and a novel training technique: we partially freeze the model according to a laser-like analysis (official paper coming soon), which effectively prevents the model from forgetting previously acquired knowledge. This is particularly crucial when teaching the model a specific skill, such as function calling.
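A minimal sketch of what selective freezing can look like in PyTorch. This is not the official laserRMT code: it assumes a per-layer score from a laser-like SNR analysis has already been computed, and simply excludes low-scoring submodules from gradient updates during fine-tuning.

```python
# Sketch only (NOT the official laserRMT implementation).
# Assumes `layer_scores` holds a hypothetical per-layer signal-to-noise
# score produced by a laser-like analysis beforehand.
import torch.nn as nn

def freeze_low_snr_layers(model: nn.Module, layer_scores: dict, threshold: float) -> list:
    """Freeze every named submodule whose score falls below `threshold`.

    Returns the names of the frozen submodules.
    """
    frozen = []
    for name, module in model.named_modules():
        score = layer_scores.get(name)
        if score is not None and score < threshold:
            for p in module.parameters():
                p.requires_grad = False  # excluded from gradient updates
            frozen.append(name)
    return frozen

# Toy example: a 4-layer stack standing in for a transformer.
model = nn.Sequential(*[nn.Linear(8, 8) for _ in range(4)])
scores = {"0": 0.9, "1": 0.2, "2": 0.1, "3": 0.8}  # hypothetical SNR values
frozen = freeze_low_snr_layers(model, scores, threshold=0.5)
```

Here layers `"1"` and `"2"` stay frozen while the rest of the model is fine-tuned, which is the mechanism that preserves previously acquired knowledge.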
|
|
|
We intend this to be the first in a family of experiments carried out at Cognitive Computations.
|
|
|
# Quants |
|
- [dagbs/fc-dolphin-2.6-mistral-7b-dpo-laser-GGUF](https://huggingface.co/dagbs/fc-dolphin-2.6-mistral-7b-dpo-laser-GGUF)