---
library_name: transformers
license: apache-2.0
base_model: google/long-t5-tglobal-xl
tags:
- generated_from_trainer
metrics:
- bleu
model-index:
- name: 6431b2ccc01c99fe4ecf26fb75516f3e
  results: []
---

# 6431b2ccc01c99fe4ecf26fb75516f3e

This model is a fine-tuned version of [google/long-t5-tglobal-xl](https://huggingface.co/google/long-t5-tglobal-xl) on the Helsinki-NLP/opus_books [de-it] dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4337
- Data Size: 1.0
- Epoch Runtime: 378.7898
- Bleu: 6.6781

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Data Size | Epoch Runtime | Bleu   |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:-------------:|:------:|
| No log        | 0     | 0     | 3.4100          | 0         | 28.4126       | 0.2439 |
| No log        | 1     | 684   | 2.4432          | 0.0078    | 31.9325       | 1.5877 |
| No log        | 2     | 1368  | 2.3153          | 0.0156    | 37.8507       | 1.9091 |
| No log        | 3     | 2052  | 2.2078          | 0.0312    | 47.4179       | 3.6468 |
| No log        | 4     | 2736  | 2.0897          | 0.0625    | 59.4032       | 2.4129 |
| 2.4659        | 5     | 3420  | 1.9834          | 0.125     | 79.1582       | 2.4795 |
| 2.2792        | 6     | 4104  | 1.8620          | 0.25      | 120.1279      | 3.0764 |
| 2.0256        | 7     | 4788  | 1.7290          | 0.5       | 205.3425      | 3.7368 |
| 1.8058        | 8     | 5472  | 1.5855          | 1.0       | 376.6263      | 4.5428 |
| 1.6135        | 9     | 6156  | 1.4931          | 1.0       | 370.8074      | 5.1258 |
| 1.4937        | 10    | 6840  | 1.4448          | 1.0       | 371.5985      | 5.4812 |
| 1.3742        | 11    | 7524  | 1.4114          | 1.0       | 371.6593      | 5.7704 |
| 1.2846        | 12    | 8208  | 1.3902          | 1.0       | 370.8356      | 6.0013 |
| 1.1784        | 13    | 8892  | 1.3771          | 1.0       | 373.1421      | 6.2228 |
| 1.123         | 14    | 9576  | 1.3724          | 1.0       | 376.4735      | 6.3688 |
| 1.0134        | 15    | 10260 | 1.3827          | 1.0       | 377.0905      | 6.4729 |
| 0.9525        | 16    | 10944 | 1.3917          | 1.0       | 377.7570      | 6.6213 |
| 0.8781        | 17    | 11628 | 1.4171          | 1.0       | 379.7279      | 6.6494 |
| 0.8119        | 18    | 12312 | 1.4337          | 1.0       | 378.7898      | 6.6781 |

### Framework versions

- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.2.0
- Tokenizers 0.22.1
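
### Training setup (sketch)

A minimal sketch of how the hyperparameters listed above could map onto `Seq2SeqTrainingArguments` and the `de-it` configuration of `Helsinki-NLP/opus_books`. This is an assumption, not the exact training script: the output directory is a placeholder, the progressive data-size schedule visible in the results table is not reproduced, and multi-GPU launching across 4 devices (e.g. with `torchrun`) is assumed to be handled externally.

```python
from datasets import load_dataset
from transformers import Seq2SeqTrainingArguments

# The de-it configuration of OPUS Books, as named in this card.
raw_datasets = load_dataset("Helsinki-NLP/opus_books", "de-it")

# Mirrors the listed hyperparameters; per-device batch size 8 on 4 GPUs
# gives the effective train/eval batch size of 32.
training_args = Seq2SeqTrainingArguments(
    output_dir="long-t5-tglobal-xl-opus-books-de-it",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="constant",
    num_train_epochs=50,
    predict_with_generate=True,
)
```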
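
## Example usage

A minimal inference sketch. The repository id below is a placeholder for wherever this checkpoint is hosted, and the plain-sentence input format is an assumption; the exact prompt depends on how the training pairs were preprocessed.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder repo id -- replace with the actual location of this checkpoint.
model_id = "your-username/6431b2ccc01c99fe4ecf26fb75516f3e"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# German source sentence; the model was fine-tuned on de-it pairs,
# so the output is expected to be Italian.
text = "Der Hund schläft unter dem Baum."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```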