---
license: mit
language:
  - en
metrics:
  - bleurt
  - bleu
  - chrf
datasets:
  - Raiff1982/coredata
  - Raiff1982/pineco
base_model:
  - tiiuae/falcon-40b
  - mistralai/Mistral-7B-v0.3
tags:
  - medical
  - code
  - text-generation
  - transformers
pipeline_tag: text-generation
library_name: transformers
---

# Codette - Falcon & Mistral Merged Model

## 🧠 Overview

Codette is an AI assistant designed to support cognitive, creative, and analytical tasks. It builds on Falcon-40B and Mistral-7B-v0.3 to deliver strong performance in text generation, medical question answering, and code reasoning.


## ⚡ Features

- ✅ Combines Falcon-40B & Mistral-7B for enhanced capabilities
- ✅ Supports text generation, medical analysis, and code synthesis
- ✅ Fine-tuned on domain-specific datasets (Raiff1982/coredata, Raiff1982/pineco)
- ✅ Optimized for research, enterprise AI, and advanced reasoning

## 📂 Model Details

- **Base Models:** Falcon-40B, Mistral-7B-v0.3
- **Architecture:** Transformer-based causal language model
- **Use Cases:** Text generation, code assistance, research, medical insights
- **Training Datasets:**
  - Raiff1982/coredata: medical and reasoning-focused samples
  - Raiff1982/pineco: mixed-domain creative and technical prompts

## 📖 Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Raiff1982/Codette"

# trust_remote_code=True is required because this repository ships custom model code.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

prompt = "How can AI improve medical diagnostics?"
inputs = tokenizer(prompt, return_tensors="pt")

# max_new_tokens bounds the generated continuation, independent of prompt length.
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
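By default, `generate` decodes greedily. For more varied output you can pass sampling parameters via a `transformers.GenerationConfig`. The values below are illustrative starting points, not tuned or officially recommended settings for Codette:

```python
from transformers import GenerationConfig

# Hypothetical sampling settings — adjust for your task.
gen_config = GenerationConfig(
    max_new_tokens=200,      # cap on generated tokens
    do_sample=True,          # enable sampling instead of greedy decoding
    temperature=0.7,         # lower = more deterministic
    top_p=0.9,               # nucleus sampling cutoff
    repetition_penalty=1.1,  # mildly discourage repeated phrases
)

# Then pass it to generation:
# output = model.generate(**inputs, generation_config=gen_config)
```

Reusing a `GenerationConfig` object keeps decoding settings consistent across calls and scripts.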