---
license: apache-2.0
tags:
- generated
- text-generation
- conversational
- pytorch
- transformers
- ShareAI
- Felguk
---

# <img src="https://huggingface.co/shareAI/Felguk0.5-turbo-preview/resolve/main/hd_e8ecc8aad81eb559a52d229a8d7b0d8a_677b9eaf4d161.png" alt="Felguk0.5-turbo-preview" width="500"/>

[![Model License](https://img.shields.io/badge/license-Apache%202.0-blue)](LICENSE)
[![Hugging Face Model](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Model%20Hub-orange)](https://huggingface.co/shareAI/Felguk0.5-turbo-preview)
[![Transformers Documentation](https://img.shields.io/badge/📖-Transformers%20Docs-blueviolet)](https://huggingface.co/docs/transformers/index)

**Felguk0.5-turbo-preview** is a preview release of ShareAI's Felguk language model, designed for text generation, conversational systems, and other NLP tasks. Built on the Transformer architecture, it is optimized for high performance.
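
For a quick first try, the checkpoint can also be run through the high-level `transformers` text-generation pipeline. This is a minimal sketch (the prompt and generation length are illustrative only); the full tokenizer/model loading example is shown in the Usage section below.

```python
from transformers import pipeline

# Load the preview checkpoint via the high-level text-generation pipeline
generator = pipeline("text-generation", model="shareAI/Felguk0.5-turbo-preview")

# Generate a short continuation for a sample prompt
result = generator("Hello! How are you?", max_new_tokens=50)
print(result[0]["generated_text"])
```
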
## All Felguk Models on Hugging Face

Here's a list of the models in the Felguk family under the `shareAI` namespace on Hugging Face:

| Model Name | Description | Link |
|------------|-------------|------|
| `shareAI/Felguk0.5-turbo-preview` | A preview version of the Felguk model for text generation and conversation. | [Model Page](https://huggingface.co/shareAI/Felguk0.5-turbo-preview) |
| `shareAI/Felguk0.5-base` | The base version of the Felguk model for general-purpose NLP tasks. | [Model Page](https://huggingface.co/shareAI/Felguk0.5-base) |
| `shareAI/Felguk0.5-large` | A larger version of the Felguk model with enhanced capabilities. | [Model Page](https://huggingface.co/shareAI/Felguk0.5-large) |
| `shareAI/Felguk0.5-multilingual` | A multilingual variant of the Felguk model for cross-language tasks. | [Model Page](https://huggingface.co/shareAI/Felguk0.5-multilingual) |

> **Note:** Currently, only the **Felguk0.5-turbo-preview** model is available. The other models listed above are planned for future release and are not yet accessible.

> **Future Plans:** We are excited to announce that **Felguk v1** is in development! This next-generation model will feature improved performance, enhanced multilingual support, and new capabilities for advanced NLP tasks. Stay tuned for updates!

## Usage

To use the model with the `transformers` library:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer
model_name = "shareAI/Felguk0.5-turbo-preview"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Example input
input_text = "Hello! How are you?"

# Tokenize and generate a response
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=50)

# Decode and print the result
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```
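
Since the model also targets conversational use, a chat-style prompt can be built with the tokenizer's chat template. This is a minimal sketch that assumes the checkpoint ships a chat template in its tokenizer configuration (not confirmed in this card); if it does not, `apply_chat_template` will raise an error and the plain-text prompt above should be used instead.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "shareAI/Felguk0.5-turbo-preview"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# A single user turn; roles follow the common chat-template convention
messages = [
    {"role": "user", "content": "Hello! How are you?"},
]

# Assumes the tokenizer defines a chat template; raises an error otherwise
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

# Generate up to 50 new tokens for the assistant turn
outputs = model.generate(inputs, max_new_tokens=50)

# Decode only the newly generated tokens so the echoed prompt is not printed
response = tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
print(response)
```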