nano nextgpt Weights

This repository contains the weights for nano nextgpt, a minimalist re-implementation of NextGPT in the style of Andrej Karpathy's nanoGPT. The project is based on the NextGPT architecture, which is detailed here: NextGPT.

Repository for nano nextgpt

You can find the main repository and source code for nano nextgpt here: nano nextgpt GitHub Repository.

About nano nextgpt

nano nextgpt is a stripped-down version of NextGPT that handles only images and text, omitting the video and audio capabilities of the original. The model was trained in two primary stages:

  1. Linear Layer Training: This stage trained a linear layer that maps ImageBind embeddings into the embedding space of the LLM (large language model). The training dataset comprised 20,000 image-text pairs sourced from COCO 3m.

  2. Instruction Tuning: This stage trained the entire model, including the linear layer and the LLM, end-to-end using QLoRA and PEFT, on a dataset of 80,000 image-text pairs in conversational format taken from the LLaVA project.
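The first training stage above amounts to learning a linear projection from the ImageBind embedding space to the LLM embedding space. The following is a minimal NumPy sketch of that mapping; the embedding sizes (1024 for ImageBind, 4096 for the LLM) and the plain affine form are assumptions for illustration, not values taken from the actual implementation.

```python
import numpy as np

# Assumed dimensions for illustration only.
IMAGEBIND_DIM = 1024  # hypothetical ImageBind embedding size
LLM_DIM = 4096        # hypothetical LLM hidden size

rng = np.random.default_rng(0)

# The trainable linear layer: a weight matrix plus a bias vector.
# In stage 1, only these parameters are updated; the LLM stays frozen.
W = rng.standard_normal((IMAGEBIND_DIM, LLM_DIM)) * 0.02
b = np.zeros(LLM_DIM)

def project(image_embedding: np.ndarray) -> np.ndarray:
    """Map an ImageBind embedding into the LLM embedding space."""
    return image_embedding @ W + b

# One ImageBind-sized vector maps to one LLM-space embedding.
img_emb = rng.standard_normal(IMAGEBIND_DIM)
llm_emb = project(img_emb)
print(llm_emb.shape)
```

During stage 1 training, the projected embedding is fed to the frozen LLM alongside the text tokens, and the projection weights are optimized against the paired captions.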

Usage

For detailed usage instructions, including how to integrate these weights into your applications, please refer to the nano nextgpt GitHub repository.


Please note that this README is for the weights of the nano nextgpt model. For more information on the model architecture, training procedures, or any other inquiries, refer to the main nano nextgpt repository linked above.
