# ElanMT
This model is a pretrained checkpoint intended for fine-tuning on a large dataset. For general use cases, ElanMT-BT-ja-en is strongly recommended instead.
## Model Details
This is a translation model based on the Marian MT 6-layer encoder-decoder Transformer architecture, with a SentencePiece tokenizer.
- Developed by: ELAN MITSUA Project / Abstract Engine
- Model type: Translation
- Source Language: Japanese
- Target Language: English
- License: CC BY-SA 4.0
## Usage
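Since this is a MarianMT-style checkpoint, it can presumably be loaded with the Hugging Face Transformers `MarianMTModel` / `MarianTokenizer` classes. A minimal sketch follows; the repository id used below is an assumption, so substitute the actual model id from this model card's page.

```python
# Hypothetical repository id -- replace with the actual id from the model page.
MODEL_ID = "Mitsua/elanmt-base-ja-en"

def translate(text: str, model_id: str = MODEL_ID) -> str:
    """Translate a Japanese sentence to English with a MarianMT checkpoint."""
    # Lazy import so the module loads even when transformers is not installed.
    from transformers import MarianMTModel, MarianTokenizer

    tokenizer = MarianTokenizer.from_pretrained(model_id)
    model = MarianMTModel.from_pretrained(model_id)

    # Tokenize, generate a translation, and decode back to plain text.
    inputs = tokenizer(text, return_tensors="pt")
    generated = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(generated[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(translate("こんにちは、世界。"))
```

Note that, as stated above, this base checkpoint is meant for fine-tuning; for direct translation the ElanMT-BT-ja-en model is the recommended starting point.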
## Training Data
## Training Procedure
## Evaluation
## Disclaimer
Translated output may be incorrect, harmful, or biased. The model was developed to investigate the performance achievable with only a relatively small, licensed corpus, and it is not suitable for use cases requiring high translation accuracy. Under Section 5 of the CC BY-SA 4.0 License, ELAN MITSUA Project / Abstract Engine is not responsible for any direct or indirect loss caused by use of the model.