
Llama-2-7b-chat-finetune-tourism

Model description

This model is a fine-tuned version of the Llama-2-7b-chat model. It has been fine-tuned on a custom dataset focused on Moroccan tourism, aiming to improve the model's ability to generate responses related to Moroccan culture, tourism, and related topics. The fine-tuning was performed using a set of dialogue-based data formatted for the Llama-2 model.

Intended use

This model is designed to assist in generating responses for tasks related to Moroccan tourism. It can be used in various applications such as:

  • Answering questions related to traveling in Morocco.
  • Providing suggestions for things to do, places to visit, and cultural insights.
  • Engaging in conversational AI applications focusing on Moroccan tourism.

How to use

You can use this model for text generation via the Hugging Face transformers library. Here’s an example of how to load and use it:

from transformers import pipeline

# Load the fine-tuned model (weights are downloaded on first use)
model_name = "Ouiam123/Llama-2-7b-chat-finetune-tourism"
pipe = pipeline(task="text-generation", model=model_name)

# Llama-2 chat models expect the prompt wrapped in [INST] ... [/INST]
prompt = "<s>[INST] What should I know about eating with my hands in Morocco? [/INST]"
response = pipe(prompt, max_new_tokens=256)
print(response[0]["generated_text"])

This code will return a relevant response based on the input query related to Moroccan tourism.
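Because Llama-2 chat checkpoints are trained with a specific instruction template, it helps to build prompts programmatically rather than by hand. The sketch below is a minimal helper, assuming the standard Llama-2 chat template; the function name `build_prompt` is illustrative and not part of this model's API:

```python
def build_prompt(user_message, system_message=None):
    """Wrap a user message in the Llama-2 chat instruction template."""
    if system_message:
        # An optional system prompt goes inside <<SYS>> markers
        return (f"<s>[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n"
                f"{user_message} [/INST]")
    return f"<s>[INST] {user_message} [/INST]"

prompt = build_prompt("What should I know about eating with my hands in Morocco?")
```

The resulting string can be passed directly to the pipeline in place of a raw question.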

Training data

This model was fine-tuned on the Nouhayla-CH/moroccantourismeds-llama2-formatted dataset. The dataset includes a range of conversational data tailored to tourism-related topics in Morocco. It contains instructions and examples focused on cultural insights, travel advice, and local customs.

Data format:

  • The dataset was formatted for the Llama-2 model and contains instruction-based dialogues to support interactive conversation.
  • The training data was curated and processed to ensure the model could generate meaningful and context-aware answers.
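As a hedged sketch of what "formatted for Llama-2" typically means: each dialogue pair is rendered into a single text string using the Llama-2 instruction template. The field names `instruction` and `response` below are hypothetical, chosen for illustration rather than taken from the dataset's schema:

```python
def format_example(instruction, response):
    """Render one dialogue pair into the Llama-2 single-text training format."""
    return f"<s>[INST] {instruction} [/INST] {response} </s>"

# Hypothetical dataset row
row = {
    "instruction": "What currency is used in Morocco?",
    "response": "Morocco uses the Moroccan dirham (MAD).",
}
text = format_example(row["instruction"], row["response"])
```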

Model architecture

This model is based on the Llama-2-7b architecture, which is a large-scale transformer model optimized for generating coherent and contextually accurate text. The Llama-2 model is designed to handle a variety of tasks, such as question answering, text completion, and dialogue generation.

Key Specifications:

  • Base Model: Llama-2-7b
  • Fine-tuned for: Tourism-related dialogue in Morocco
  • Model Size: 7 billion parameters
  • Architecture: Transformer-based language model

Model performance

The model has been fine-tuned to generate high-quality text in response to prompts related to Moroccan tourism. It performs well at generating conversational responses, offering cultural insights, and providing practical advice for travelers.

Examples:

  • Input: "What should I know about eating with my hands in Morocco?" Output: "In Morocco, eating with your hands is common, especially for traditional dishes like couscous or tagine. It's considered polite to use your right hand, as the left hand is reserved for other purposes. Be sure to wash your hands before eating, and if you're unsure about anything, it's fine to ask the host for guidance!"

  • Input: "Can you tell me about tours in Aït Bouguemez?" Output: "Aït Bouguemez, often referred to as the 'Happy Valley', is a stunning region in the Atlas Mountains. Known for its traditional Berber villages and scenic beauty, it's an excellent destination for hiking, cultural exploration, and experiencing Morocco's rural life. You can take guided tours to explore the valley's natural beauty, historical sites, and engage with the local community."
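Responses like the ones above also depend on decoding settings. The values below are illustrative defaults for conversational answers, not parameters published with this model; tune them for your application:

```python
# Illustrative sampling settings for conversational generation (assumed, not official)
gen_kwargs = {
    "max_new_tokens": 256,      # cap the length of the generated answer
    "do_sample": True,          # sample instead of greedy decoding
    "temperature": 0.7,         # lower = more focused, higher = more varied
    "top_p": 0.9,               # nucleus sampling cutoff
    "repetition_penalty": 1.1,  # discourage repeated phrases
}
# response = pipe(prompt, **gen_kwargs)
```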

Limitations

  • The model may occasionally generate inaccurate or overly generalized information, especially for less common tourism-related queries.
  • It may not always be up-to-date with the most current events, news, or specific details.
  • The model is specialized in Moroccan culture and tourism, so it may not handle queries outside that domain well.

Ethical considerations

  • The model was trained using publicly available data and was fine-tuned with a focus on promoting positive cultural exchange and providing accurate, respectful insights into Moroccan tourism.
  • It is important to consider the potential biases and limitations of the model when using it, especially for users who may be unfamiliar with the cultural context.

Citation

If you use this model, please cite it as follows:

@misc{ouiam2025llama,
  author = {Ouiam},
  title = {Llama-2-7b-chat-finetune-tourism},
  year = {2025},
  url = {https://huggingface.co/Ouiam123/Llama-2-7b-chat-finetune-tourism},
}

License

This model is available under the MIT License.

Acknowledgements

We thank Hugging Face for providing the tools and infrastructure to fine-tune and share models on their platform. Special thanks to the contributors of the Nouhayla-CH/moroccantourismeds-llama2-formatted dataset for their effort in curating and sharing high-quality data for this fine-tuning task.
