LWM-Text-Chat-1M-Jax Model Card

Model details

Model type: LWM-Text-Chat-1M-Jax is an open-source model trained from LLaMA-2 on a filtered subset of the Books3 dataset. It is an auto-regressive language model based on the transformer architecture.

The model is a Jax checkpoint. Inference code and instructions can be found at: https://github.com/LargeWorldModel/lwm
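For readers unfamiliar with how an auto-regressive language model produces text, the sketch below shows a minimal greedy decoding loop in JAX. It is only an illustration: the actual weights, tokenizer, and model apply function come from the lwm repository linked above, and `toy_logits_fn`, `VOCAB_SIZE`, and `greedy_generate` are hypothetical names used here, not part of the released code.

```python
# Minimal sketch of greedy auto-regressive decoding in JAX.
# The real checkpoint loading and transformer forward pass live in the
# lwm repository (https://github.com/LargeWorldModel/lwm); the logits
# function below is a random placeholder stand-in.

import jax
import jax.numpy as jnp

VOCAB_SIZE = 32000  # LLaMA-2 tokenizer vocabulary size


def toy_logits_fn(params, token_ids):
    """Placeholder: returns next-token logits for a [batch, seq_len] input.

    In practice this would be the transformer's apply function, with
    `params` restored from the JAX checkpoint.
    """
    del params
    key = jax.random.PRNGKey(int(token_ids.sum()))
    return jax.random.normal(key, (token_ids.shape[0], VOCAB_SIZE))


def greedy_generate(params, prompt_ids, max_new_tokens=16):
    """Append the arg-max token one step at a time (greedy decoding)."""
    token_ids = jnp.asarray(prompt_ids)[None, :]  # add batch dimension
    for _ in range(max_new_tokens):
        logits = toy_logits_fn(params, token_ids)        # [1, vocab]
        next_id = jnp.argmax(logits, axis=-1)[:, None]   # [1, 1]
        token_ids = jnp.concatenate([token_ids, next_id], axis=-1)
    return token_ids[0]


print(greedy_generate(params=None, prompt_ids=[1, 2, 3]))
```

For actual inference with the released checkpoint, follow the scripts and instructions in the repository above.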

Model date: LWM-Text-Chat-1M-Jax was trained in December 2023.

Paper or resources for more information: https://largeworldmodel.github.io/

License

Llama 2 is licensed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved.

Where to send questions or comments about the model: https://github.com/LargeWorldModel/lwm/issues

Training dataset

  • 800 documents from the filtered Books3 subset, each with 1M+ tokens