---
language:
  - vi
tags:
  - pytorch
  - causal-lm
  - gpt
---

# GPT-Neo 1.3B on Vietnamese News

Details will be available soon.

For more information, please contact [email protected] (Dương) / [email protected] (Thành) / [email protected] (Bình).

## How to use

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("VietAI/gpt-neo-1.3B-vietnamese-news")
model = AutoModelForCausalLM.from_pretrained("VietAI/gpt-neo-1.3B-vietnamese-news", low_cpu_mem_usage=True)

# Move the model to GPU if available
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

prompt = "Tiềm năng của trí tuệ nhân tạo"  # your input sentence
input_ids = tokenizer(prompt, return_tensors="pt")["input_ids"].to(device)

max_length = 100  # example value; adjust to the output length you need

gen_tokens = model.generate(
    input_ids,
    max_length=max_length,
    do_sample=True,
    temperature=0.9,
    top_k=20,
)

gen_text = tokenizer.batch_decode(gen_tokens)[0]
print(gen_text)
```
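
Alternatively, a minimal sketch (not from the original card) using the generic `transformers` text-generation `pipeline`, with the same example prompt and sampling settings as above:

```python
from transformers import pipeline

# device=0 selects the first GPU; use device=-1 to run on CPU
generator = pipeline(
    "text-generation",
    model="VietAI/gpt-neo-1.3B-vietnamese-news",
    device=0,
)

outputs = generator(
    "Tiềm năng của trí tuệ nhân tạo",  # same example prompt as above
    max_length=100,                    # example value
    do_sample=True,
    temperature=0.9,
    top_k=20,
)
print(outputs[0]["generated_text"])
```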