liminerity/Memgpt-slerp-DPO
Tags: Text Generation · Transformers · Safetensors · mistral · Merge · mergekit · lazymergekit · starsnatched/MemGPT-DPO-2 · starsnatched/MemGPT-DPO · conversational · text-generation-inference · Inference Endpoints
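Given the Transformers, Safetensors, mistral, and text-generation tags above, the model should load through the standard transformers causal-LM API. The following is a minimal sketch under that assumption; the prompt and generation parameters are illustrative and not taken from this repository.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "liminerity/Memgpt-slerp-DPO"

# Load the tokenizer and the merged mistral-architecture model
# (assumption: half precision and automatic device placement fit the hardware).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Simple text-generation call; sampling settings are placeholders.
prompt = "Explain what a slerp model merge is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))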
Commit History
Upload folder using huggingface_hub
777415d (verified) · limin(gate) committed on Jan 28, 2024
initial commit
25e7ea5 (verified) · limin(gate) committed on Jan 28, 2024