Null-GPT2-Large

Description

This is the GPT-2 Large model architecture only: none of the pre-trained parameters (weights, biases, attention matrices, etc.) are included.

This is useful for researchers who want to train the model from scratch rather than fine-tune it.
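The same effect can be reproduced with the transformers library: instantiating a model directly from a GPT2Config (rather than via from_pretrained) yields freshly randomized weights. A minimal sketch, using a deliberately tiny config for speed (GPT-2 Large itself uses n_embd=1280, n_layer=36, n_head=20, which is what gives the 774M parameter count):

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Tiny illustrative config; GPT-2 Large would use n_embd=1280, n_layer=36, n_head=20
config = GPT2Config(n_embd=64, n_layer=2, n_head=2, n_positions=128)

# Constructing from a config (not from_pretrained) gives randomly initialized weights
model = GPT2LMHeadModel(config)

# The untrained model still runs a forward pass end to end
tokens = torch.randint(0, config.vocab_size, (1, 16))
logits = model(tokens).logits
print(tuple(logits.shape))  # (1, 16, vocab_size)
```

The output is, of course, noise until the model is trained.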

Generated via the GitHub repo Model Architecture Generator.

Use

First, from the directory containing the model, clone the generator repo and run the randomization script:

git clone https://github.com/ivanhe123/Model-Architecture-Generator
python -m randomnize_params -in "./Null-GPT2-Large" -out path_model_out

path_model_out is the output path for the newly randomized model.
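Conceptually, the randomization step just re-draws every parameter from the initializer distribution. The sketch below is an assumption about what that amounts to, not the repo's actual implementation; it uses GPT-2's default N(0, 0.02) initializer scale and a toy dict standing in for a state dict:

```python
import random

def randomize_params(params: dict, std: float = 0.02, seed: int = 0) -> dict:
    """Re-draw every weight from N(0, std), GPT-2's default initializer scale."""
    rng = random.Random(seed)  # fixed seed so the result is reproducible
    return {name: [rng.gauss(0.0, std) for _ in values]
            for name, values in params.items()}

# Toy stand-in for a model state dict: two small weight vectors
weights = {"wte.weight": [0.0] * 4, "ln_f.weight": [1.0] * 4}
fresh = randomize_params(weights)
```

The structure (tensor names and shapes) is preserved; only the values are replaced.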

Model size: 774M params (F32, Safetensors)
