This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on the kde4 dataset.

How to use haochenhe/lab1_random with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("haochenhe/lab1_random")
model = AutoModelForSeq2SeqLM.from_pretrained("haochenhe/lab1_random")
```
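Once the tokenizer and model are loaded, translation works through the standard `generate`/`decode` flow. A minimal inference sketch, assuming the checkpoint is reachable on the Hub (the English input sentence is an arbitrary example, not from this card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the fine-tuned English-to-French checkpoint
tokenizer = AutoTokenizer.from_pretrained("haochenhe/lab1_random")
model = AutoModelForSeq2SeqLM.from_pretrained("haochenhe/lab1_random")

# Tokenize an English sentence and generate the French translation
inputs = tokenizer("Default to expanded threads", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
translation = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(translation)
```

Since the base model is a Marian en-fr checkpoint, no language prefix or forced BOS token is needed; the plain English string is enough.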
Model description: More information needed

Intended uses & limitations: More information needed

Training and evaluation data: More information needed
The following hyperparameters were used during training: