🧠 Andy-4-Preview 🧠
Welcome to Andy-4-Preview – a revolutionary new model designed for playing Minecraft via the Mindcraft framework.
This AI is crafted to push the boundaries of gameplay, reasoning, and multi-language summarization, making it one of the most versatile and powerful models available for Minecraft enthusiasts.
Overview
Andy-4-Preview is an 8B parameter model built on deepseek-ai/DeepSeek-R1-Distill-Llama-8B, an efficient Llama 3.1 8B distill.
It has been meticulously trained over three weeks on a single RTX 3090 using two carefully curated datasets.
The model underwent 2 epochs on the first dataset with a higher learning rate and 4 epochs on the second dataset with a much lower learning rate, ensuring a balanced and robust learning process.
This training regimen, coupled with advanced techniques like manual learning rate adjustments and dynamic dataset modifications, has allowed Andy-4-Preview to outperform its competitors, including GPT-4o-mini, Claude 3.5 Haiku, Mineslayer v2, and the former leader, Andy-3.6.
How to install
On Huggingface, press the Use this model dropdown menu and choose Ollama, then in the dropdown choose your quantization, following this GPU VRAM chart:
All of these values assume a context window size of 8192 or less:
- F16 = 20+ GB
- Q8_0 = 12+ GB
- Q5_K_M = >8 GB
- Q4_K_M = 8+ GB
- Q3_K_M = 6-8 GB on Minecraft LOW settings
- Q2_K = 6-8 GB on Minecraft LOW settings
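For reference, the command the dropdown generates should look something like the one below (shown here for Q5_K_M; swap in whichever quantization you chose, and the exact tag casing may differ):
ollama run hf.co/Sweaterdog/Andy-4-preview:Q5_K_M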
If you do not have a dedicated GPU, there is a guide in the Mindcraft Discord server for setting up cloud computing for free.
If you want a more custom install, go to the files tab on Huggingface and download the quantization you want along with the Modelfile.
Once downloaded, open the Modelfile in a text editor and change the FROM tag to the exact path of the GGUF weights you downloaded, without quotes, such as C:\users\jimmy\downloads\Andy-4-preview.Q3_K_M.GGUF
Open the directory of your Modelfile in the terminal or command prompt, then run ollama create Andy-4-preview -f Modelfile, which will create Andy-4-preview as the model, based on the Modelfile, which includes important information such as the system prompt, chat template, context length, and sampling settings.
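For illustration, after editing, the top of your Modelfile should look roughly like this (the path is just the example from above; leave the rest of the file, the system prompt, template, and parameters, as downloaded):
FROM C:\users\jimmy\downloads\Andy-4-preview.Q3_K_M.GGUF
You can then sanity-check the result with ollama run Andy-4-preview before pointing Mindcraft at it.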
For most people, using the direct HF method of downloading Andy-4-preview is recommended, but you will suffer from a shorter context window and possibly sporadic text.
For the full Andy-4 model, there will be a direct download on Ollama, so you won't have to use this method.
Installation notes
Andy-4-preview, and Andy-4, will support a context window of up to 131072 tokens, but to run the Q5_K_M version with that context length at stock settings on Ollama, you would need an RTX 5090, and you would have 3GB of VRAM left over for Minecraft.
A context window of 8192 is smaller, but it can still allow for great conversations with the model, and since the model knows all of the commands by heart, you can cut the command docs from the system prompt to lower context usage.
If you have a 6GB GPU and want to run it locally, you will need to use the advanced installation method, as well as set the context length in the Modelfile to 4096 to allow adequate VRAM for Minecraft.
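Assuming the downloaded Modelfile uses Ollama's standard num_ctx parameter for the context length, the line to change would look something like this:
PARAMETER num_ctx 4096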
If you want to know your specific VRAM usage for both the context length and the model weights, follow This Huggingface Repo
If you want a larger context window for less memory, and are okay with some hiccups when it comes to remembering things, follow this guide:
On Windows:
- Close Ollama
- Open System Properties by searching for Edit the system environment variables in the Windows start menu
- In the bottom left, click on Environment Variables...
- Navigate to System variables and press New...
- Name the variable OLLAMA_FLASH_ATTENTION and set the value to 1
- Make another variable, name it OLLAMA_KV_CACHE_TYPE, and set the value to q8_0, but if you are okay with more instability and want more VRAM savings, set it to q4_0
- Press OK at the bottom, then close out of the System Properties window
- Now you can start Ollama again with a quantized context window
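If you prefer the command line, the same variables can be set from a Command Prompt with setx; this should be equivalent to the steps above (setx writes user-level variables, so close and restart Ollama afterwards):
setx OLLAMA_FLASH_ATTENTION 1
setx OLLAMA_KV_CACHE_TYPE q8_0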
On Linux / macOS
- Open terminal
- Ensure Ollama is not active, meaning no language models are currently running
- Run the following commands:
export OLLAMA_FLASH_ATTENTION=1
export OLLAMA_KV_CACHE_TYPE="q8_0"
ollama serve
Or, if you want more context and are okay with instability:
export OLLAMA_FLASH_ATTENTION=1
export OLLAMA_KV_CACHE_TYPE="q4_0"
ollama serve
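Note that export only lasts for the current terminal session; if you want the settings to persist, you can append the two export lines to your shell profile (assuming a bash shell here) and open a new terminal before running ollama serve:
echo 'export OLLAMA_FLASH_ATTENTION=1' >> ~/.bashrc
echo 'export OLLAMA_KV_CACHE_TYPE="q8_0"' >> ~/.bashrc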
Key Features
- Revolutionary Performance: Specifically engineered for Minecraft gameplay, providing creative, strategic, and efficient in-game decision-making.
- Advanced Reasoning Capabilities: If you include Reason in your responses with <think> and </think> in your prompt, or something similar, Andy-4-Preview will provide detailed reasoning to enhance performance, at the cost of speed (see the example after this list).
- Multi-Language Summarization: Capable of summarizing content in multiple languages, making it more efficient at remembering its history.
- Building and Creativity: Not only can it play Minecraft, but it also excels in constructing complex structures and solving intricate in-game challenges.
- Vast Knowledge Base: Possesses extensive knowledge about Minecraft, including game mechanics, strategies, and creative builds.
- Open Source: Completely open source and available to the community for further development and experimentation.
- Versatile Utility: Excels in building, reasoning, summarizing, and strategic gameplay.
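As a rough illustration of the reasoning feature, adding a line such as Reason in your responses with <think> and </think> to the system prompt should produce replies shaped roughly like the following (the in-game command here is purely a hypothetical example):
<think>
The player asked for wood, so I should collect a few oak logs before responding.
</think>
Sure thing! !collectBlocks("oak_log", 8)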
Open Source and Licensing
Andy-4-Preview is 100% open source and is licensed under the Apache 2.0 License. We believe in transparency and community collaboration, and all source code and training details are available for review and contribution.
LoRA Weights
Access the LoRA weights for Andy-4-Preview here: Andy-4-Preview LoRA Weights
Datasets
The model was trained on two distinct datasets:
- Dataset 1: Andy-4-Preview-1 (trained for 2 epochs with a higher learning rate)
- Dataset 2: Andy-4-Preview-2 (trained for 4 epochs with a much lower learning rate)
Usage
To integrate Andy-4-Preview with the Mindcraft framework, adjust your configuration settings as needed and follow the execution instructions provided separately.
Customize the personality in the conversing profile section and configuration files to optimize performance for specific in-game tasks or environments.
Andy-4-Preview is designed to seamlessly integrate with your existing Mindcraft setups, enhancing both gameplay and creative capabilities.
Mindcraft Project
Download and explore the Mindcraft project on GitHub:
Mindcraft Project on GitHub
How to use
On Mindcraft, ensure you put ollama/ before the model name; for example: ollama/hf.co/sweaterdog/andy-4-preview:q5_k_m
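For example, a minimal profile entry might look roughly like the following (a sketch only; the exact keys depend on your Mindcraft version, and andy.json is just an illustrative file name):
{
  "name": "andy",
  "model": "ollama/hf.co/sweaterdog/andy-4-preview:q5_k_m"
}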
Notes on Quantization
Typically, the smaller the quantization (Q2_K being the lowest), the more likely the model is to make a mistake; the Q8_0 weights, however, perform nearly identically to the F16 weights.
For most cases I suggest Q5_K_M or Q4_K_M; these suit 8GB GPUs. With anything less, I would follow the guide on free cloud computing in the Discord.
Disclaimer
Important Notice:
Andy-4-Preview is a preview model and, while it represents a significant advancement in AI-driven Minecraft gameplay, please be aware of the following:
- Performance Variability: Due to its experimental nature, the model may not always deliver optimal performance in every scenario.
- Ongoing Development: This preview release is intended for early testing and community feedback. You might encounter occasional inconsistencies or limitations as further refinements are made.
Community and Contact
Join our vibrant community for discussions, support, and feedback:
- Discord (Mindcraft Server): Join the Mindcraft Discord
- Huggingface Page: Explore more experimental models and projects: Sweaterdog on Huggingface
- Discord Username: Sweaterdog
We welcome your feedback, suggestions, and contributions as we continue to improve Andy-4-Preview and push the boundaries of AI in gaming.
Acknowledgements
We extend our gratitude to all the developers and researchers who have contributed to the evolution of AI in gaming.
Copyright 2025 Sweaterdog
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.