---
base_model:
- euclaise/Memphis-CoT-3B
license: cc-by-4.0
datasets:
- euclaise/TinyCoT
- euclaise/mathoverflow-accepted
- euclaise/reddit-instruct
- euclaise/WritingPrompts_curated
- sablo/oasst2_curated
- euclaise/mathqa_programs
- BEE-spoke-data/coedit-reworded-deduped
- pszemraj/booksum-short
library_name: transformers
tags:
- supertrainer2000
---

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64137e2150358a805203cbac/wEaKoLeJzidUdTWwQmA6k.png)

Memphis-scribe 3B is a finetune of [Memphis-CoT 3B](https://huggingface.co/euclaise/Memphis-CoT-3B) on more creative data; Memphis-CoT 3B is in turn a finetune of [StableLM 3B 4e1t](https://huggingface.co/stabilityai/stablelm-3b-4e1t/).
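Since the card lists `library_name: transformers`, the model can be loaded like any other causal language model. The snippet below is a minimal sketch rather than an official usage example: the repository id `euclaise/Memphis-scribe-3B` is assumed from the model name, and no particular prompt format is implied by this section.

```python
# Minimal loading/generation sketch (assumed repository id, generic prompt).
# Note: StableLM-based models may require trust_remote_code=True on older
# transformers releases that predate native StableLM support.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "euclaise/Memphis-scribe-3B"  # assumed from the model name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Write a short story about a lighthouse keeper who finds a message in a bottle."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```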