---
base_model: unsloth/Qwen2.5-7B-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
license: apache-2.0
language:
- en
---
# Uploaded model

- Developed by: Sweaterdog
- License: apache-2.0
- Finetuned from model: unsloth/Qwen2.5-7B-bnb-4bit
The MindCraft LLM tuning CSV file can be found at MindCraft-LLM; it can be tweaked as needed.
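If you want to inspect or adjust the tuning data before training, a minimal sketch using the `datasets` library is shown below. The file name `MindCraft-LLM.csv` and the `text` column are placeholders, not confirmed by this card; adjust them to match the actual CSV layout.

```python
# A minimal sketch for loading and inspecting the tuning CSV.
# NOTE: "MindCraft-LLM.csv" and the "text" column are placeholders --
# replace them with the actual file name and columns of the dataset.
from datasets import load_dataset

dataset = load_dataset("csv", data_files="MindCraft-LLM.csv", split="train")

print(dataset)     # shows the columns and number of rows
print(dataset[0])  # inspect a single training example

# Optional tweak: drop rows whose text field is empty before fine-tuning.
dataset = dataset.filter(lambda row: row.get("text") not in (None, ""))
```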
# What is the Purpose?
This model is built and designed to play Minecraft via the extension named "MindCraft", which allows language models, such as the ones provided in the files section, to play the game.
- Why a new model?
- What kind of Dataset was used?
- Why choose Qwen2.5 for the base model?
Here is the link to the Google Colab notebook for fine-tuning your own model, in case you want to use a different base model, such as Llama-3-8b, or change the hyperparameters: Google Colab
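For reference, a minimal fine-tuning sketch in the style of the Unsloth notebooks is shown below. The LoRA settings, training arguments, dataset file name, and `text` column are illustrative assumptions, not the exact values used to train this model.

```python
# A minimal fine-tuning sketch with Unsloth + TRL.
# The LoRA rank, learning rate, and other hyperparameters below are
# illustrative assumptions, not the exact values used for this upload.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

max_seq_length = 2048

# Load the 4-bit base model that this upload was fine-tuned from.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Qwen2.5-7B-bnb-4bit",
    max_seq_length=max_seq_length,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small set of weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    lora_alpha=16,
    lora_dropout=0,
    bias="none",
    use_gradient_checkpointing="unsloth",
)

# Placeholder file name and column; swap in the actual MindCraft-LLM CSV.
dataset = load_dataset("csv", data_files="MindCraft-LLM.csv", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=max_seq_length,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        learning_rate=2e-4,
        num_train_epochs=1,
        fp16=True,
        logging_steps=10,
        output_dir="outputs",
    ),
)
trainer.train()
```

The linked Colab notebook follows this same load-model, add-LoRA, train-with-SFTTrainer flow, so swapping in a different base model is mostly a matter of changing `model_name`.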
This qwen2 model was trained 2x faster with Unsloth and Hugging Face's TRL library.
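To try the model outside of MindCraft, a basic text-generation sketch with `transformers` is shown below. The repository id is a placeholder; substitute the actual id of this upload (or of the base model).

```python
# A basic inference sketch with transformers.
# "Sweaterdog/your-model-id" is a placeholder -- substitute the actual
# repository id of this upload, or the unsloth/Qwen2.5-7B-bnb-4bit base.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Sweaterdog/your-model-id"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "You are playing Minecraft. What should you do first after spawning?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```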