Spaces: yusufs/sailor2-3b-chat (Paused)
1 contributor · History: 61 commits
Latest commit by yusufs: feat(sailor2-3b-chat): change readme (0f8187a, 16 days ago)
  • .gitignore
    19 Bytes
    feat(download_model.py): remove download_model.py during build, it causing big image size · 6 months ago
  • Dockerfile
    1.38 kB
    fix(using sail/Sailor2-3B-Chat): sail/Sailor2-3B-Chat · about 1 month ago
  • README.md
    1.73 kB
    feat(sailor2-3b-chat): change readme · 16 days ago
  • download_model.py
    700 Bytes
    feat(add-model): always download model during build, it will be cached in the consecutive builds · 6 months ago
  • main.py
    6.7 kB
    feat(parse): parse output · 6 months ago
  • openai_compatible_api_server.py
    24.4 kB
    feat(dep_sizes.txt): removes dep_sizes.txt during build, it not needed · 6 months ago
  • poetry.lock
    426 kB
    feat(refactor): move the files to root · 6 months ago
  • pyproject.toml
    416 Bytes
    feat(refactor): move the files to root · 6 months ago
  • requirements.txt
    9.99 kB
    feat(first-commit): follow examples and tutorials · 6 months ago
  • run-llama.sh
    1.51 kB
    fix(runner.sh): --enforce-eager not support values · 4 months ago
  • run-sailor.sh
    1.83 kB
    fix(runner.sh): --enforce-eager not support values · 4 months ago
  • runner.sh
    2.46 kB
    fix(float16): Bfloat16 is only supported on GPUs with compute capability of at least 8.0. Your Tesla T4 GPU has compute capability 7.5. You can use float16 instead by explicitly setting the `dtype` flag in CLI, for example: --dtype=half. · about 1 month ago