# roberta-large-ner-ghtk-cs-add-2label-new-data-3090-11Sep-1
This model was trained on an unspecified dataset (the base checkpoint is not recorded in the card metadata). It achieves the following results on the evaluation set:
- Loss: 0.3034

Per-label metrics:

| Label | Precision | Recall | F1 | Support |
|---|---|---|---|---|
| Tk | 0.8553 | 0.5603 | 0.6771 | 116 |
| A | 0.9509 | 0.9443 | 0.9476 | 431 |
| Gày | 0.7561 | 0.9118 | 0.8267 | 34 |
| Gày trừu tượng | 0.9087 | 0.9180 | 0.9134 | 488 |
| Iờ | 0.6585 | 0.7105 | 0.6835 | 38 |
| Ã đơn | 0.8713 | 0.8670 | 0.8691 | 203 |
| Đt | 0.9171 | 0.9954 | 0.9547 | 878 |
| Đt trừu tượng | 0.8439 | 0.8584 | 0.8511 | 233 |

- Overall Precision: 0.9017
- Overall Recall: 0.9203
- Overall F1: 0.9109
- Overall Accuracy: 0.9635
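The overall figures are micro-averages pooled over the per-label counts. A minimal sketch (pure Python, numbers taken verbatim from the evaluation metrics above) showing how they can be re-derived from each label's precision, recall, and support:

```python
# Re-derive the overall (micro-averaged) precision/recall/F1 from the
# per-label metrics. For each label: TP = recall * support, and the number
# of predicted entities = TP / precision; micro metrics pool these counts.
per_label = {                 # (precision, recall, support)
    "Tk": (0.8552631578947368, 0.5603448275862069, 116),
    "A": (0.9509345794392523, 0.9443155452436195, 431),
    "Gày": (0.7560975609756098, 0.9117647058823529, 34),
    "Gày trừu tượng": (0.9087221095334685, 0.9180327868852459, 488),
    "Iờ": (0.6585365853658537, 0.7105263157894737, 38),
    "Ã đơn": (0.8712871287128713, 0.8669950738916257, 203),
    "Đt": (0.9171038824763903, 0.9954441913439636, 878),
    "Đt trừu tượng": (0.8438818565400844, 0.8583690987124464, 233),
}

tp = pred = support = 0
for precision, recall, n in per_label.values():
    label_tp = round(recall * n)          # true positives for this label
    tp += label_tp
    pred += round(label_tp / precision)   # entities predicted as this label
    support += n

micro_p = tp / pred
micro_r = tp / support
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)
print(round(micro_p, 4), round(micro_r, 4), round(micro_f1, 4))
# → 0.9017 0.9203 0.9109
```

This reproduces the reported Overall Precision/Recall/F1 exactly, which confirms the overall numbers are micro- rather than macro-averaged.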
## Model description
More information needed
## Intended uses & limitations
More information needed
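At inference time a token-classification model like this one emits token-level tags that must be collapsed into entity spans. A minimal sketch, assuming the usual BIO tagging scheme (the helper name `decode_bio` is illustrative, not part of this model's API; label names are taken from this card):

```python
def decode_bio(tags):
    """Collapse a BIO tag sequence into (label, start, end) spans, end-exclusive."""
    spans, start, label = [], None, None
    for i, tag in enumerate(tags + ["O"]):       # "O" sentinel flushes the last span
        if tag.startswith("B-") or tag == "O" or (
            tag.startswith("I-") and tag[2:] != label
        ):
            if label is not None:                # close the currently open span
                spans.append((label, start, i))
            start, label = (i, tag[2:]) if tag != "O" else (None, None)
        # a continuing I- tag with a matching label simply extends the open span
    return spans

# e.g. token tags for a phone number (Đt) followed by a date (Gày)
print(decode_bio(["O", "B-Đt", "I-Đt", "B-Gày", "I-Gày"]))
# → [('Đt', 1, 3), ('Gày', 3, 5)]
```

Libraries such as seqeval apply the same span-level matching when computing the entity precision/recall/F1 reported in this card.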
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
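With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate decays linearly from 2.5e-05 to 0 over the full run (4870 optimizer steps per the results table: 487 steps/epoch × 10 epochs). A minimal sketch of that schedule (the function name `linear_lr` is illustrative; Transformers implements this internally):

```python
def linear_lr(base_lr, step, total_steps, warmup_steps=0):
    """Linear schedule: ramp up to base_lr over warmup, then decay linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

BASE_LR, TOTAL = 2.5e-05, 4870  # 487 steps/epoch * 10 epochs
print(linear_lr(BASE_LR, 0, TOTAL))     # start of training: 2.5e-05
print(linear_lr(BASE_LR, 2435, TOTAL))  # halfway (end of epoch 5): 1.25e-05
print(linear_lr(BASE_LR, 4870, TOTAL))  # end of training: 0.0
```

Since validation loss rises steadily after epoch 3 while training loss keeps falling, a shorter schedule (or early stopping on the epoch-3 checkpoint) might be worth trying.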
### Training results
| Training Loss | Epoch | Step | Validation Loss | Tk | A | Gày | Gày trừu tượng | Iờ | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 487 | 0.2055 | {'precision': 0.7586206896551724, 'recall': 0.5689655172413793, 'f1': 0.6502463054187193, 'number': 116} | {'precision': 0.9370629370629371, 'recall': 0.9327146171693735, 'f1': 0.9348837209302325, 'number': 431} | {'precision': 0.7878787878787878, 'recall': 0.7647058823529411, 'f1': 0.7761194029850745, 'number': 34} | {'precision': 0.8915187376725838, 'recall': 0.9262295081967213, 'f1': 0.9085427135678392, 'number': 488} | {'precision': 0.8, 'recall': 0.21052631578947367, 'f1': 0.3333333333333333, 'number': 38} | {'precision': 0.8909090909090909, 'recall': 0.7241379310344828, 'f1': 0.7989130434782608, 'number': 203} | {'precision': 0.9588431590656284, 'recall': 0.9817767653758542, 'f1': 0.9701744513224535, 'number': 878} | {'precision': 0.84375, 'recall': 0.8111587982832618, 'f1': 0.8271334792122538, 'number': 233} | 0.9142 | 0.8889 | 0.9014 | 0.9575 |
| 0.087 | 2.0 | 974 | 0.1863 | {'precision': 0.872093023255814, 'recall': 0.646551724137931, 'f1': 0.7425742574257427, 'number': 116} | {'precision': 0.9846153846153847, 'recall': 0.8909512761020881, 'f1': 0.9354445797807551, 'number': 431} | {'precision': 0.7073170731707317, 'recall': 0.8529411764705882, 'f1': 0.7733333333333334, 'number': 34} | {'precision': 0.9420935412026726, 'recall': 0.8668032786885246, 'f1': 0.9028815368196371, 'number': 488} | {'precision': 0.8181818181818182, 'recall': 0.47368421052631576, 'f1': 0.6, 'number': 38} | {'precision': 0.8317757009345794, 'recall': 0.8768472906403941, 'f1': 0.8537170263788968, 'number': 203} | {'precision': 0.9311827956989247, 'recall': 0.9863325740318907, 'f1': 0.9579646017699115, 'number': 878} | {'precision': 0.8611111111111112, 'recall': 0.7982832618025751, 'f1': 0.8285077951002228, 'number': 233} | 0.9195 | 0.8918 | 0.9054 | 0.9605 |
| 0.0437 | 3.0 | 1461 | 0.1873 | {'precision': 0.8279569892473119, 'recall': 0.6637931034482759, 'f1': 0.736842105263158, 'number': 116} | {'precision': 0.9315673289183223, 'recall': 0.9791183294663574, 'f1': 0.9547511312217195, 'number': 431} | {'precision': 0.725, 'recall': 0.8529411764705882, 'f1': 0.7837837837837837, 'number': 34} | {'precision': 0.9072164948453608, 'recall': 0.9016393442622951, 'f1': 0.9044193216855086, 'number': 488} | {'precision': 0.6458333333333334, 'recall': 0.8157894736842105, 'f1': 0.7209302325581395, 'number': 38} | {'precision': 0.8557213930348259, 'recall': 0.8472906403940886, 'f1': 0.8514851485148515, 'number': 203} | {'precision': 0.9320388349514563, 'recall': 0.9840546697038725, 'f1': 0.9573407202216068, 'number': 878} | {'precision': 0.8739495798319328, 'recall': 0.8927038626609443, 'f1': 0.8832271762208068, 'number': 233} | 0.9026 | 0.9265 | 0.9144 | 0.9628 |
| 0.0346 | 4.0 | 1948 | 0.2300 | {'precision': 0.7792207792207793, 'recall': 0.5172413793103449, 'f1': 0.6217616580310881, 'number': 116} | {'precision': 0.9601873536299765, 'recall': 0.951276102088167, 'f1': 0.9557109557109557, 'number': 431} | {'precision': 0.7837837837837838, 'recall': 0.8529411764705882, 'f1': 0.8169014084507041, 'number': 34} | {'precision': 0.8840864440078585, 'recall': 0.9221311475409836, 'f1': 0.9027081243731194, 'number': 488} | {'precision': 0.6296296296296297, 'recall': 0.8947368421052632, 'f1': 0.7391304347826088, 'number': 38} | {'precision': 0.8704663212435233, 'recall': 0.8275862068965517, 'f1': 0.8484848484848485, 'number': 203} | {'precision': 0.9216101694915254, 'recall': 0.9908883826879271, 'f1': 0.9549945115257958, 'number': 878} | {'precision': 0.8809523809523809, 'recall': 0.7939914163090128, 'f1': 0.835214446952596, 'number': 233} | 0.9000 | 0.9112 | 0.9056 | 0.9603 |
| 0.0266 | 5.0 | 2435 | 0.2330 | {'precision': 0.788235294117647, 'recall': 0.5775862068965517, 'f1': 0.6666666666666666, 'number': 116} | {'precision': 0.9419642857142857, 'recall': 0.9791183294663574, 'f1': 0.9601820250284414, 'number': 431} | {'precision': 0.7209302325581395, 'recall': 0.9117647058823529, 'f1': 0.8051948051948051, 'number': 34} | {'precision': 0.8986083499005965, 'recall': 0.9262295081967213, 'f1': 0.9122098890010091, 'number': 488} | {'precision': 0.5961538461538461, 'recall': 0.8157894736842105, 'f1': 0.6888888888888889, 'number': 38} | {'precision': 0.7510204081632653, 'recall': 0.9064039408866995, 'f1': 0.8214285714285714, 'number': 203} | {'precision': 0.9180672268907563, 'recall': 0.9954441913439636, 'f1': 0.955191256830601, 'number': 878} | {'precision': 0.7709923664122137, 'recall': 0.8669527896995708, 'f1': 0.8161616161616161, 'number': 233} | 0.8737 | 0.9347 | 0.9032 | 0.9580 |
| 0.0193 | 6.0 | 2922 | 0.2854 | {'precision': 0.7777777777777778, 'recall': 0.4224137931034483, 'f1': 0.547486033519553, 'number': 116} | {'precision': 0.9430523917995444, 'recall': 0.9605568445475638, 'f1': 0.9517241379310345, 'number': 431} | {'precision': 0.7560975609756098, 'recall': 0.9117647058823529, 'f1': 0.8266666666666665, 'number': 34} | {'precision': 0.8853754940711462, 'recall': 0.9180327868852459, 'f1': 0.9014084507042255, 'number': 488} | {'precision': 0.5777777777777777, 'recall': 0.6842105263157895, 'f1': 0.6265060240963854, 'number': 38} | {'precision': 0.8936170212765957, 'recall': 0.8275862068965517, 'f1': 0.8593350383631713, 'number': 203} | {'precision': 0.9023883696780893, 'recall': 0.989749430523918, 'f1': 0.9440521455730582, 'number': 878} | {'precision': 0.8290598290598291, 'recall': 0.8326180257510729, 'f1': 0.8308351177730192, 'number': 233} | 0.8871 | 0.9083 | 0.8976 | 0.9595 |
| 0.0138 | 7.0 | 3409 | 0.2634 | {'precision': 0.7380952380952381, 'recall': 0.5344827586206896, 'f1': 0.62, 'number': 116} | {'precision': 0.9578454332552693, 'recall': 0.9489559164733179, 'f1': 0.9533799533799533, 'number': 431} | {'precision': 0.7380952380952381, 'recall': 0.9117647058823529, 'f1': 0.8157894736842106, 'number': 34} | {'precision': 0.9098360655737705, 'recall': 0.9098360655737705, 'f1': 0.9098360655737705, 'number': 488} | {'precision': 0.6444444444444445, 'recall': 0.7631578947368421, 'f1': 0.6987951807228916, 'number': 38} | {'precision': 0.8613861386138614, 'recall': 0.8571428571428571, 'f1': 0.8592592592592593, 'number': 203} | {'precision': 0.9202975557917109, 'recall': 0.9863325740318907, 'f1': 0.9521715228147334, 'number': 878} | {'precision': 0.8648648648648649, 'recall': 0.8240343347639485, 'f1': 0.8439560439560441, 'number': 233} | 0.9004 | 0.9116 | 0.9060 | 0.9623 |
| 0.0084 | 8.0 | 3896 | 0.2786 | {'precision': 0.8043478260869565, 'recall': 0.6379310344827587, 'f1': 0.7115384615384616, 'number': 116} | {'precision': 0.9618138424821002, 'recall': 0.9350348027842227, 'f1': 0.9482352941176471, 'number': 431} | {'precision': 0.7380952380952381, 'recall': 0.9117647058823529, 'f1': 0.8157894736842106, 'number': 34} | {'precision': 0.9056224899598394, 'recall': 0.9241803278688525, 'f1': 0.9148073022312374, 'number': 488} | {'precision': 0.627906976744186, 'recall': 0.7105263157894737, 'f1': 0.6666666666666666, 'number': 38} | {'precision': 0.8088888888888889, 'recall': 0.896551724137931, 'f1': 0.8504672897196263, 'number': 203} | {'precision': 0.9209694415173867, 'recall': 0.9954441913439636, 'f1': 0.9567597153804049, 'number': 878} | {'precision': 0.8954545454545455, 'recall': 0.8454935622317596, 'f1': 0.869757174392936, 'number': 233} | 0.8999 | 0.9248 | 0.9122 | 0.9617 |
| 0.004 | 9.0 | 4383 | 0.3020 | {'precision': 0.8571428571428571, 'recall': 0.5172413793103449, 'f1': 0.6451612903225806, 'number': 116} | {'precision': 0.9551886792452831, 'recall': 0.9396751740139211, 'f1': 0.9473684210526315, 'number': 431} | {'precision': 0.7560975609756098, 'recall': 0.9117647058823529, 'f1': 0.8266666666666665, 'number': 34} | {'precision': 0.9190871369294605, 'recall': 0.9077868852459017, 'f1': 0.9134020618556701, 'number': 488} | {'precision': 0.65, 'recall': 0.6842105263157895, 'f1': 0.6666666666666667, 'number': 38} | {'precision': 0.8911917098445595, 'recall': 0.8472906403940886, 'f1': 0.8686868686868686, 'number': 203} | {'precision': 0.9171038824763903, 'recall': 0.9954441913439636, 'f1': 0.9546695794647734, 'number': 878} | {'precision': 0.8504273504273504, 'recall': 0.8540772532188842, 'f1': 0.8522483940042828, 'number': 233} | 0.9069 | 0.9128 | 0.9098 | 0.9630 |
| 0.0033 | 10.0 | 4870 | 0.3034 | {'precision': 0.8552631578947368, 'recall': 0.5603448275862069, 'f1': 0.6770833333333334, 'number': 116} | {'precision': 0.9509345794392523, 'recall': 0.9443155452436195, 'f1': 0.9476135040745051, 'number': 431} | {'precision': 0.7560975609756098, 'recall': 0.9117647058823529, 'f1': 0.8266666666666665, 'number': 34} | {'precision': 0.9087221095334685, 'recall': 0.9180327868852459, 'f1': 0.9133537206931702, 'number': 488} | {'precision': 0.6585365853658537, 'recall': 0.7105263157894737, 'f1': 0.6835443037974684, 'number': 38} | {'precision': 0.8712871287128713, 'recall': 0.8669950738916257, 'f1': 0.8691358024691358, 'number': 203} | {'precision': 0.9171038824763903, 'recall': 0.9954441913439636, 'f1': 0.9546695794647734, 'number': 878} | {'precision': 0.8438818565400844, 'recall': 0.8583690987124464, 'f1': 0.8510638297872342, 'number': 233} | 0.9017 | 0.9203 | 0.9109 | 0.9635 |
### Framework versions
- Transformers 4.44.0
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1