Hugging Face
Uranus/shellchat-v1-safety-sft-dpo
Safetensors
codeshell
custom_code
main · shellchat-v1-safety-sft-dpo / added_tokens.json
dingyaoyu · first stage sft and second stage dpo · a0590aa · 2 months ago
30 Bytes
{
  "<|endoftoken|>": 70019
}
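The file maps a single added special token to its vocabulary ID. A minimal standard-library sketch of how such a mapping can be read (the inlined string and variable names are illustrative; libraries such as `transformers` consume `added_tokens.json` automatically when loading a tokenizer):

```python
import json

# The 30-byte contents of added_tokens.json, inlined here for illustration.
raw = '{"<|endoftoken|>": 70019}'

# Parse: each key is an added special token, each value its token ID,
# assigned past the end of the base vocabulary.
added_tokens = json.loads(raw)

print(added_tokens["<|endoftoken|>"])
```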