---
library_name: peft
base_model: aubmindlab/bert-base-arabertv02
tags:
- base_model:adapter:aubmindlab/bert-base-arabertv02
- lora
- transformers
metrics:
- accuracy
model-index:
- name: bert-eou-classifier_teacher
  results: []
---

# bert-eou-classifier_teacher

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1555
- Accuracy: 0.791
- AUC: 0.865

## Model description

This repository holds a LoRA (PEFT) adapter rather than a full model checkpoint. Judging by the reported metrics (accuracy and AUC), it was trained for a binary sequence-classification task; the name suggests end-of-utterance (EOU) detection. More information needed.

## Intended uses & limitations

More information needed. A minimal inference sketch appears under "Example usage" at the end of this card.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged code sketch of this configuration appears under "Training configuration sketch" at the end of this card):
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | AUC   |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:-----:|
| 0.5875        | 1.0   | 622   | 0.4866          | 0.75     | 0.845 |
| 0.4703        | 2.0   | 1244  | 0.5337          | 0.76     | 0.855 |
| 0.4097        | 3.0   | 1866  | 0.5273          | 0.785    | 0.869 |
| 0.3598        | 4.0   | 2488  | 0.5383          | 0.795    | 0.868 |
| 0.3278        | 5.0   | 3110  | 0.6127          | 0.803    | 0.878 |
| 0.3019        | 6.0   | 3732  | 0.6487          | 0.804    | 0.878 |
| 0.2616        | 7.0   | 4354  | 0.7659          | 0.801    | 0.874 |
| 0.2451        | 8.0   | 4976  | 0.8012          | 0.793    | 0.871 |
| 0.2241        | 9.0   | 5598  | 0.8936          | 0.802    | 0.87  |
| 0.2044        | 10.0  | 6220  | 0.9513          | 0.8      | 0.869 |
| 0.2015        | 11.0  | 6842  | 0.9689          | 0.802    | 0.869 |
| 0.1834        | 12.0  | 7464  | 0.9756          | 0.799    | 0.869 |
| 0.1731        | 13.0  | 8086  | 0.9917          | 0.796    | 0.866 |
| 0.1455        | 14.0  | 8708  | 1.0958          | 0.794    | 0.863 |
| 0.1557        | 15.0  | 9330  | 1.0042          | 0.796    | 0.869 |
| 0.1316        | 16.0  | 9952  | 1.0996          | 0.796    | 0.865 |
| 0.1335        | 17.0  | 10574 | 1.2024          | 0.794    | 0.863 |
| 0.1201        | 18.0  | 11196 | 1.1508          | 0.791    | 0.865 |
| 0.1204        | 19.0  | 11818 | 1.1580          | 0.798    | 0.865 |
| 0.1137        | 20.0  | 12440 | 1.1555          | 0.791    | 0.865 |

Note that validation loss rises steadily after epoch 5 while accuracy and AUC plateau, so the final checkpoint (epoch 20) likely overfits; the epoch 5-6 checkpoints score best on both accuracy (0.803-0.804) and AUC (0.878). A sketch of how these metrics could be computed appears under "Metrics sketch" at the end of this card.

### Framework versions

- PEFT 0.18.0
- Transformers 4.57.3
- PyTorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.22.1
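
## Example usage

The snippet below is a minimal inference sketch, not taken from the training code. It assumes the adapter was trained with `AutoModelForSequenceClassification` and two labels, and it uses a hypothetical repo id `your-username/bert-eou-classifier_teacher`; substitute the real adapter path.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from peft import PeftModel

base_id = "aubmindlab/bert-base-arabertv02"
adapter_id = "your-username/bert-eou-classifier_teacher"  # hypothetical path, replace it

tokenizer = AutoTokenizer.from_pretrained(base_id)
# num_labels=2 is an assumption based on the binary metrics reported in this card
base_model = AutoModelForSequenceClassification.from_pretrained(base_id, num_labels=2)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

text = "نعم، هذا كل شيء"  # example Arabic utterance ("yes, that's everything")
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1).squeeze(0)
# Label semantics are not reported in this card, so only generic names are printed
print({f"LABEL_{i}": p.item() for i, p in enumerate(probs)})
```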
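
## Training configuration sketch

The hyperparameters listed above map onto `transformers.TrainingArguments` roughly as shown below. This is a reconstruction, not the original training script; in particular, the LoRA settings (`r`, `lora_alpha`, `lora_dropout`) are not reported in this card, so the `LoraConfig` values are placeholders only.

```python
from transformers import TrainingArguments
from peft import LoraConfig, TaskType

training_args = TrainingArguments(
    output_dir="bert-eou-classifier_teacher",
    learning_rate=2e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",       # AdamW with betas=(0.9, 0.999) and eps=1e-8 (the defaults)
    lr_scheduler_type="linear",
    num_train_epochs=20,
    eval_strategy="epoch",     # the results table reports one evaluation per epoch
)

# Placeholder LoRA settings; the actual values are not reported in this card.
peft_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8,               # placeholder rank
    lora_alpha=16,     # placeholder scaling
    lora_dropout=0.1,  # placeholder dropout
)
```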
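
## Metrics sketch

The Accuracy and AUC columns in the results table could be produced by a `Trainer` `compute_metrics` callback like the one below. This is an assumed reconstruction using scikit-learn, not necessarily the exact function used during training, and it again assumes a binary task.

```python
import numpy as np
from scipy.special import softmax
from sklearn.metrics import accuracy_score, roc_auc_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    probs = softmax(logits, axis=-1)[:, 1]  # positive-class probability (binary assumption)
    preds = np.argmax(logits, axis=-1)      # hard predictions for accuracy
    return {
        "accuracy": accuracy_score(labels, preds),
        "auc": roc_auc_score(labels, probs),
    }
```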