AtomicGPT-gemma2-9B
AtomicGPT is a large language model (LLM) specialized for the nuclear field, built on the Gemma 2 9B model. It has deep knowledge of nuclear technologies, theories, and terminology, including reactor design, radiation shielding, the nuclear fuel cycle, and nuclear safety and regulation. With this expertise, AtomicGPT delivers precise answers to technical and specialized questions in the nuclear domain.

How to Use

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = 'KAERI-MLP/gemma2-Korean-AtomicGPT-9B'

# Load the tokenizer and model; device_map="auto" places the weights
# on the available GPU(s) automatically.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
)
model.eval()

# Replace with your own nuclear-domain question.
input_text = "Query about nuclear (atomic) energy"
inputs = tokenizer(input_text, return_tensors="pt").to(model.device)

with torch.inference_mode():
    outputs = model.generate(**inputs, max_new_tokens=1024)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
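
If GPU memory is limited, the same checkpoint can also be loaded with 4-bit quantization through the standard transformers quantization interface. The snippet below is a minimal sketch, not part of the original card; it assumes the optional bitsandbytes package is installed.

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig

model_id = 'KAERI-MLP/gemma2-Korean-AtomicGPT-9B'

# 4-bit NF4 quantization via bitsandbytes, with bfloat16 compute.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    quantization_config=quant_config,
)

Generation options such as do_sample, temperature, and top_p can be passed to model.generate() in the same way as in the example above.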
Model size: 9.24B params
Tensor type: F32
Weights format: Safetensors

Model tree for KAERI-MLP/gemma2-Korean-AtomicGPT-9B

Base model: google/gemma-2-9b
This model is a fine-tune of the base model.